883 results for Plan Fines II


Abstract:

Statistical methodology was applied to a survey of the time-course incidence of four viruses (alfalfa mosaic virus, clover yellow vein virus, subterranean clover mottle virus and subterranean clover red leaf virus) in improved pastures in southern regions of Australia. (From the authors.)

Abstract:

Sharing some closely related themes and a common theoretical orientation based on the governmentality analytic, these are nevertheless two very different contributions to criminological knowledge and theory. The first, The Currency of Justice: Fines and Damages in Consumer Societies (COJ), is a sustained and highly original analysis of that most pervasive yet overlooked feature of modern legal orders: their reliance on monetary sanctions. Crime and Risk (CAR), on the other hand, is a short synoptic overview of the many dimensions and trajectories of risk in contemporary debate and practice, covering both the practices of crime and the governance of crime. It is one of the first titles in a new Sage series, 'Compact Criminology', in which authors survey a current field of debate in little more than a hundred pages. With this small gem, Pat O'Malley has set the bar very high for those who follow. For all its brevity, CAR traverses a massive expanse of research, debates and issues, while also opening up new and challenging questions around the politics of risk and the relationship between criminal risk-taking and the governance of risk and crime. The two books draw together various threads of O'Malley's rich body of work on these issues, and once again demonstrate that he is one of the foremost international scholars of risk inside and outside criminology.

Abstract:

In this age of rapidly evolving technology, teachers are encouraged by government, syllabus requirements, school management, and parents to adopt ICTs. Indeed, there is an expectation that teachers will incorporate technologies into their classroom teaching practices to enhance the learning experiences and outcomes of their students. In the science classroom in particular, a subject that traditionally incorporates hands-on experiments and practicals, the integration of modern technologies should be a major feature. Although myriad studies report on technologies that enhance students’ learning outcomes in science, there is a dearth of literature on how teachers go about selecting technologies for use in the science classroom. Teachers can feel ill-prepared to assess the range of available choices and might feel pressured and somewhat overwhelmed by the avalanche of new developments thrust before them in marketing literature and teaching journals. The consequences of making bad decisions are costly in terms of money, time and teacher confidence. Additionally, no research to date has identified which technologies science teachers use on a regular basis, or whether some purchased technologies have proven too problematic, preventing their sustained use and possible wider adoption. The primary aim of this study was to provide research-based guidance to teachers to aid their decision-making in choosing technologies for the science classroom. The study unfolded in several phases. The first phase of the project involved survey and interview data from teachers about the technologies they currently use in their science classrooms and the frequency of their use. These data were coded and analysed using the Grounded Theory approach of Corbin and Strauss, resulting in the development of the PETTaL model, which captured the salient factors in the data. The model incorporates usability theory from the human-computer interaction literature, together with education theory and models such as Mishra and Koehler’s (2006) TPACK model, where the grounded data indicated these issues. The PETTaL model identifies Power (school management, syllabus, etc.), Environment (classroom/learning setting), Teacher (personal characteristics, experience, epistemology), Technology (usability, versatility, etc.) and Learners (academic ability, diversity, behaviour, etc.) as fields that can impact the use of technology in science classrooms. The PETTaL model was used to create a Predictive Evaluation Tool (PET): a tool designed to assist teachers in choosing technologies, particularly for science teaching and learning. The evolution of the PET was cyclical (employing an agile development methodology), involving repeated testing with in-service and pre-service teachers at each iteration and incorporating their comments in subsequent versions. Once no new suggestions were forthcoming, the PET was tested with eight in-service teachers, and the results showed that the PET outcomes obtained by (experienced) teachers concurred with their instinctive evaluations. They felt the PET would be a valuable tool when considering new technology, and that it would be particularly useful as a means of communicating perceived value between colleagues, and between budget holders and requestors, during the acquisition process. It is hoped that the PET can make the tacit knowledge about technology use in classrooms acquired by experienced teachers explicit to novice teachers.
Additionally, the PET could be used as a research tool to discover a teacher's professional development needs. The outcomes of this study can therefore aid a teacher in the process of selecting educationally productive and sustainable new technology for their science classroom. This study has produced an instrument for assisting teachers in the decision-making process associated with the use of new technologies for the science classroom; the instrument is generic in that it can be applied to all subject areas. Further, this study has produced a powerful model that extends the TPACK model, which is currently extensively employed to assess teachers’ use of technology in the classroom. The PETTaL model, grounded in the data from this study, responds to calls in the literature for TPACK’s further development. As a theoretical model, PETTaL has the potential to serve as a framework for the development of a teacher’s reflective practice (either self-evaluation or critical evaluation of observed teaching practices). Additionally, PETTaL has the potential to aid the formulation of a teacher’s personal professional development plan. It will be the basis for further studies in this field.
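
The abstract names the five PETTaL fields but does not reproduce the PET's internal checklist or scoring scheme, so the following is only a minimal, hypothetical sketch of how those fields might be captured in software: the field names follow the abstract, while the criteria, the 0-5 rating scale and the simple averaging in pet_score are illustrative assumptions, not the published instrument.

# Hypothetical sketch of a PETTaL-style checklist; criteria and the 0-5
# rating scale are illustrative assumptions, not the published PET.
PETTAL_FIELDS = {
    "Power": ["school management support", "syllabus fit"],
    "Environment": ["suitability of the classroom/learning setting"],
    "Teacher": ["experience with similar tools", "fit with teaching approach"],
    "Technology": ["usability", "versatility"],
    "Learners": ["match to academic ability and diversity", "behaviour impact"],
}

def pet_score(ratings):
    """Average all criterion ratings (0-5) into one indicative score."""
    values = [r for criteria in ratings.values() for r in criteria.values()]
    return sum(values) / len(values) if values else 0.0

# Example: a teacher rating a proposed data-logging kit at 3/5 on everything.
example = {f: {c: 3 for c in crits} for f, crits in PETTAL_FIELDS.items()}
print(round(pet_score(example), 2))  # 3.0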

Abstract:

This study presents the largest known investigation of discomfort glare, with 493 surveys collected from five green buildings in Brisbane, Australia. The study was conducted on full-time employees working under their everyday lighting conditions, none of whom had any affiliation with the research institution. The survey consisted of a specially tailored questionnaire to assess potential factors relating to discomfort glare. Luminance maps extracted from high dynamic range (HDR) images were used to capture the luminous environment of the occupants. Occupants who experienced glare on their monitors and/or electric glare were excluded from the analysis, leaving 419 available surveys. Occupants were more sensitive to glare than any of the tested indices accounted for. A new index, the UGP, was developed to take into account the scope of results in the investigation. The index is based on a linear transformation of the UGR to calculate a probability of disturbed persons. However, all glare indices had some correlation with discomfort, and statistically there was no difference between the DGI, UGR and CGI. The UGP broadly reflects the demographics of the working population in Australia, and the new index is applicable to open-plan green buildings.
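
The abstract describes the UGP only as a linear transformation of the UGR that yields a probability of disturbed persons; the sketch below shows that general form with the result clamped to the range [0, 1]. The coefficients a and b are placeholders for illustration, not the values derived in the study.

def unified_glare_probability(ugr, a=0.03, b=-0.40):
    """Map a UGR value to a probability of disturbed persons.

    Linear form p = a*UGR + b, following the abstract's description of the
    UGP; the default coefficients are illustrative placeholders only.
    """
    p = a * ugr + b
    return min(max(p, 0.0), 1.0)  # clamp to a valid probability

# Example: illustrative comparison of a few UGR values.
for ugr in (13, 19, 25, 28):
    print(ugr, round(unified_glare_probability(ugr), 2))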

Abstract:

In the companion paper, a fourth-order element formulation in an updated Lagrangian framework was presented to handle geometric non-linearities. The formulation in the present paper extends this to include material non-linearity by proposing a refined plastic hinge approach for analysing large steel framed structures with many members, for which contemporary algorithms based on the plastic zone approach can be computationally problematic. This concept is an advance on conventional plastic hinge approaches, as the refined plastic hinge technique allows for gradual yielding (recognized as distributed plasticity across the element section) and a condition of full plasticity, and it also includes strain hardening. It is founded on interaction yield surfaces specified analytically in terms of force resultants, and it achieves accurate and rapid convergence for large frames in which geometric and material non-linearity are significant. The solutions are shown to balance accuracy with computational expediency. In addition to this numerical efficiency, the versatile approach is able to capture different kinds of material and geometric non-linearity in general applications of steel structures, and thereby offers an efficient and accurate means of assessing the non-linear behaviour of such structures in engineering practice.
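
The interaction yield surfaces are specified analytically in terms of force resultants, but their exact form is not given in the abstract. The sketch below therefore checks a generic bilinear axial-force/bending-moment (P-M) interaction of the kind commonly used for steel sections, purely to illustrate how a refined plastic hinge formulation tests a section force state against such a surface; it is not the surface adopted in the paper.

def pm_interaction(p, m, p_y, m_p):
    """Return the value of a generic bilinear P-M interaction function.

    A value <= 1.0 means the section force state lies inside (or on) the
    yield surface; a value > 1.0 indicates full plasticity has been exceeded.
    The bilinear form below is illustrative only, not the paper's surface.
    """
    p_ratio, m_ratio = abs(p) / p_y, abs(m) / m_p
    if p_ratio >= 0.2:
        return p_ratio + (8.0 / 9.0) * m_ratio
    return 0.5 * p_ratio + m_ratio

# Example: axial force at 0.3*P_y combined with a moment of 0.6*M_p.
print(round(pm_interaction(p=300.0, m=60.0, p_y=1000.0, m_p=100.0), 3))  # 0.833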

Abstract:

Level design is often characterised as “where the rubber hits the road” in game development. It is a core area of games design, alongside the design of game rules and narrative. However, there is a lack of literature dedicated to documenting the teaching of games design, let alone the more specialised topic of level design. Furthermore, there is a lack of formal frameworks for best practice in level design, as professional game developers often rely on intuition and previous experience. As a result, there is little for games design teachers to draw on when presented with the opportunity to teach a level design unit. In this paper, we discuss the design and implementation of a games level design unit in which students use the StarCraft II Galaxy Editor. We report on two cycles of an action research project, reflecting upon our experiences with respect to student feedback and peer review, and outlining our plans for improving the unit in years to come.

Abstract:

This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics and to investigate their usefulness as predictors of quality assurance (QA) success and failure. A total of 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements, and correlations were investigated. The mean field area factor provided a threshold field size (5 cm², equivalent to a 2.2 × 2.2 cm square field), below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure when averaged over all beams, despite being only weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests of plan accuracy, which may help minimise the time spent on QA assessments of treatments that are unlikely to pass.
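
The abstract does not give the exact definition of the small-aperture factor, so the sketch below uses one plausible definition, the fraction of open MLC leaf-pair gaps narrower than a chosen width, pooled over a beam's control points, only to illustrate the kind of calculation involved. The 10 mm threshold and the definition itself are assumptions, not those implemented in TADA.

def small_aperture_score(leaf_gaps_mm, threshold_mm=10.0):
    """Fraction of open leaf-pair gaps narrower than threshold_mm.

    leaf_gaps_mm: open leaf-pair gap widths (mm) pooled over all control
    points of a beam. Closed leaf pairs (gap == 0) are ignored. The 10 mm
    threshold is an illustrative assumption.
    """
    open_gaps = [g for g in leaf_gaps_mm if g > 0.0]
    if not open_gaps:
        return 0.0
    return sum(g < threshold_mm for g in open_gaps) / len(open_gaps)

# Example: a beam where a quarter of the open gaps are narrower than 10 mm.
print(small_aperture_score([2.0, 15.0, 30.0, 45.0]))  # 0.25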

Abstract:

A natural single-crystal specimen of kröhnkite from Chuquicamata, Chile, with the general formula Na2Cu(SO4)2·2H2O, was investigated by Raman and infrared spectroscopy. The mineral kröhnkite is found in many of the world's arid areas. Kröhnkite crystallizes in the monoclinic crystal system with point group 2/m and space group P21/c. It is an uncommon secondary mineral formed in the oxidized zone of copper deposits, typically in very arid climates. The Raman spectrum of kröhnkite is dominated by a very sharp, intense band at 992 cm−1, assigned to the ν1 symmetric stretching mode, while Raman bands at 1046, 1049, 1138, 1164, and 1177 cm−1 are assigned to the ν3 antisymmetric stretching vibrations. The infrared spectrum shows an intense band at 992 cm−1. The Raman bands at 569, 582, 612, 634, 642, 655, and 660 cm−1 are assigned to the ν4 bending modes. Three Raman bands observed at 429, 445, and 463 cm−1 are attributed to the ν2 bending modes. The observation of three or four bands in the ν4 region of kröhnkite is attributed to a reduction of symmetry to C2v or lower.

Abstract:

In this chapter we continue the exposition of crypto topics that was begun in the previous chapter. This chapter covers secret sharing, threshold cryptography, signature schemes, and finally quantum key distribution and quantum cryptography. As in the previous chapter, we have focused only on the essentials of each topic. We have selected in the bibliography a list of representative items, which can be consulted for further details. First we give a synopsis of the topics that are discussed in this chapter. Secret sharing is concerned with the problem of how to distribute a secret among a group of participating individuals, or entities, so that only predesignated collections of individuals are able to recreate the secret by collectively combining the parts of the secret that were allocated to them. There are numerous applications of secret-sharing schemes in practice. One example of secret sharing occurs in banking. For instance, the combination to a vault may be distributed in such a way that only specified collections of employees can open the vault by pooling their portions of the combination. In this way the authority to initiate an action, e.g., the opening of a bank vault, is divided for the purposes of providing security and for added functionality, such as auditing, if required. Threshold cryptography is a relatively recently studied area of cryptography. It deals with situations where the authority to initiate or perform cryptographic operations is distributed among a group of individuals. Many of the standard operations of single-user cryptography have counterparts in threshold cryptography. Signature schemes deal with the problem of generating and verifying (electronic) signatures for documents. A subclass of signature schemes is concerned with the shared generation and shared verification of signatures, where a collaborating group of individuals is required to perform these actions. A new paradigm of security has recently been introduced into cryptography with the emergence of the ideas of quantum key distribution and quantum cryptography. While classical cryptography employs various mathematical techniques to restrict eavesdroppers from learning the contents of encrypted messages, in quantum cryptography the information is protected by the laws of physics.
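
As a concrete illustration of the split-and-recombine idea described above (though not of any particular scheme from the chapter), the sketch below implements a minimal n-of-n additive secret-sharing scheme over the integers modulo a prime: every share is needed for recovery, and any subset smaller than n reveals nothing about the secret.

import secrets

PRIME = 2**61 - 1  # a Mersenne prime; any prime larger than the secret works

def split_secret(secret, n):
    """Split secret into n additive shares; all n are needed to recover it."""
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def recover_secret(shares):
    """Recombine all shares by summing them modulo the prime."""
    return sum(shares) % PRIME

# Example: the "vault combination" scenario with three employees.
combination = 123456789
shares = split_secret(combination, n=3)
assert recover_secret(shares) == combination
print(shares)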

Abstract:

The present study focused on simulating a trajectory point towards the end of the first experimental heatshield of the FIRE II vehicle, at a total flight time of 1639.53 s. Scale replicas were sized according to binary scaling and instrumented with thermocouples for testing in the X1 expansion tube, located at The University of Queensland. Correlation of flight to experimental data was achieved through the separation and independent treatment of the heating modes. Preliminary investigation indicates that the absolute value of the radiant surface flux is conserved between two binary scaled models, whereas the convective heat transfer increases with the length scale. This difference in scaling behaviour results in the overall contribution of radiative heat transfer diminishing from a flight value of approximately 9-17% to less than 1% in expansion tubes. From empirical correlations it has been shown that, under special circumstances, the St·√Re number decreases in expansion tubes by the percentage of radiation present on the flight vehicle. Results obtained in this study give a strong indication that the relative radiative heat transfer contribution in the expansion tube tests is less than that in flight, supporting the analysis that the absolute value remains constant with binary scaling.

Abstract:

The planning of IMRT treatments requires a compromise between dose conformity (complexity) and deliverability. This study investigates established and novel treatment complexity metrics for 122 IMRT beams from prostate treatment plans. The Treatment and Dose Assessor software was used to extract the necessary data from exported treatment plan files and calculate the metrics. For most of the metrics, there was strong overlap between the calculated values for plans that passed and failed their quality assurance (QA) tests. However, statistically significant variation between plans that passed and failed QA measurements was found for the established modulation index and for a novel metric describing the proportion of small apertures in each beam. The ‘small aperture score’ provided threshold values which successfully distinguished deliverable treatment plans from plans that did not pass QA, with a low false negative rate.
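
To illustrate how a threshold on a complexity metric can be evaluated against QA outcomes, the sketch below computes a false negative rate, here taken as the fraction of QA-failing plans that the metric threshold would nonetheless class as deliverable. The metric values, threshold and pass/fail labels are invented for the example and are not data from the study.

def false_negative_rate(metric_values, qa_passed, threshold, higher_is_better=True):
    """Fraction of QA-failing beams/plans that the metric threshold misses.

    A plan is predicted to be deliverable when its metric is on the 'good'
    side of the threshold; a false negative is a predicted-deliverable plan
    that actually failed QA.
    """
    fails = [(m, ok) for m, ok in zip(metric_values, qa_passed) if not ok]
    if not fails:
        return 0.0
    missed = sum(
        (m >= threshold) if higher_is_better else (m <= threshold)
        for m, _ in fails
    )
    return missed / len(fails)

# Hypothetical small-aperture scores (lower = better) and QA outcomes.
scores = [0.10, 0.45, 0.20, 0.60, 0.15]
passed = [True, False, True, False, True]
print(false_negative_rate(scores, passed, threshold=0.40, higher_is_better=False))  # 0.0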

Abstract:

Introduction
This study aimed to examine the geometric and dosimetric results when radiotherapy treatment plans are designed for prostate cancer patients with hip prostheses.

Methods
Ten EBRT treatment plans for localised prostate cancer, in the presence of hip prostheses, were analysed and compared with a reference set of 196 treatment plans for localised prostate cancer in patients without prostheses. Crowe et al.’s TADA code [1] was used to extract treatment plan parameters and evaluate doses to target volumes and critical structures against recommended goals [2] and constraints [3, 4].

Results
The need to avoid transmitting the radiation beam through the hip prostheses limited the range of gantry angles available for use in both the rotational (VMAT) and the non-rotational (3DCRT and IMRT) radiotherapy treatments. This geometric limitation (exemplified in the VMAT data shown in Fig. 1) reduced the overall quality of the treatment plans for patients with prostheses compared to the reference plans. All plans with prostheses failed the PTV dose homogeneity requirement [2], whereas only 4 % of the plans without prostheses failed this test. Several treatment plans for patients with hip prostheses also failed the QUANTEC requirements that less than 50 % of the rectum receive 50 Gy and less than 35 % of the rectum receive 60 Gy, to keep the grade 3 toxicity rate below 10 % [3], or the Hansen and Roach requirement that less than 25 % of the bladder receive 75 Gy [4].

Discussion and conclusions
The results of this study exemplify the difficulty of designing prostate radiotherapy treatment plans, where beams must provide adequate doses to targeted tissues while avoiding nearby organs at risk, when the presence of hip prostheses limits the available treatment geometries. This work provides qualitative evidence of the compromised dose distributions that can result in such cases.
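
As an illustration of the dose-volume checks quoted above (the QUANTEC rectum limits of less than 50 % at 50 Gy and less than 35 % at 60 Gy, and the Hansen and Roach bladder limit of less than 25 % at 75 Gy), the sketch below evaluates such constraints against a cumulative dose-volume histogram supplied as arrays; the helper functions and the example DVH values are assumptions for the illustration, not output of the TADA code.

import numpy as np

def volume_at_dose(dose_gy, volume_pct, query_dose_gy):
    """Interpolate the % volume receiving at least query_dose_gy from a
    cumulative DVH given as monotonically increasing dose bins."""
    return float(np.interp(query_dose_gy, dose_gy, volume_pct))

def check_constraints(dose_gy, volume_pct, constraints):
    """constraints: list of (dose_gy, max_volume_pct) pairs, e.g. V50 < 50%."""
    return {
        f"V{d:g} < {vmax:g}%": volume_at_dose(dose_gy, volume_pct, d) < vmax
        for d, vmax in constraints
    }

# Hypothetical cumulative rectum DVH (volume % receiving >= dose).
dose = np.array([0, 20, 40, 50, 60, 70, 78])
volume = np.array([100, 80, 58, 48, 30, 12, 0])

print(check_constraints(dose, volume, [(50, 50), (60, 35)]))
# e.g. {'V50 < 50%': True, 'V60 < 35%': True}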

Abstract:

As a sequel to a paper that dealt with the analysis of two-way quantitative data in large germplasm collections, this paper presents analytical methods appropriate for two-way data matrices consisting of mixed data types, namely ordered multicategory and quantitative data. While various pattern analysis techniques have been identified as suitable for analysis of the mixed data types which occur in germplasm collections, the clustering and ordination methods used often cannot deal explicitly with the computational consequences of large data sets (i.e. more than 5000 accessions) with incomplete information. However, it is shown that the ordination technique of principal component analysis and the mixture maximum likelihood method of clustering can be employed to achieve such analyses. Germplasm evaluation data for 11436 accessions of groundnut (Arachis hypogaea L.) from the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT), Andhra Pradesh, India were examined. Data for nine quantitative descriptors measured in the post-rainy season and five ordered multicategory descriptors were used. Pattern analysis results generally indicated that the accessions could be separated into four regions along the continuum of growth habit (or plant erectness). Interpretation of accession membership in these regions was found to be consistent with taxonomic information, such as subspecies. Each growth habit region contained accessions from three of the most common groundnut botanical varieties, implying that within each of the habit types there is the full range of expression for the other descriptors used in the analysis. Using these types of insights, the patterns of variability in germplasm collections can provide scientists with valuable information for their plant improvement programs.
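
The abstract identifies principal component analysis (ordination) followed by mixture maximum likelihood clustering as the methods that scale to collections of this size. The sketch below illustrates that two-step pipeline on synthetic descriptor data using scikit-learn as an assumed stand-in for the authors' software; the descriptor matrix, the number of retained components and the four groups (echoing the growth-habit regions) are placeholders, and the handling of incomplete data discussed in the paper is omitted here.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for an accessions-by-descriptors matrix
# (quantitative plus ordered multicategory descriptors coded numerically).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 14))  # e.g. 9 quantitative + 5 ordered descriptors

# Ordination: standardise, then reduce to a few principal components.
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))

# Clustering: mixture maximum likelihood (Gaussian mixture) on the PC scores,
# with four groups echoing the four growth-habit regions in the abstract.
groups = GaussianMixture(n_components=4, random_state=0).fit_predict(scores)

print(np.bincount(groups))  # number of accessions assigned to each group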