373 results for size-fecundity variation


Relevance: 20.00%

Abstract:

This paper presents the findings of an analysis of the activities of rural nurses from a national audit of the role and function of the rural nurse (Hegney, Pearson and McCarthy 1997). The results suggest that the size of the health service (defined by the number of acute beds) influences the activities of rural nurses. Further, the study reports on differences in the context of practice between rural health services of different sizes and the impact these have on the scope of rural nursing practice. The paper concludes that the size of the health service is an outcome of rurality (small population densities, distance from larger health facilities, and lack of on-site medical and allied health staff). It also notes that the size of the health service is a major contextual determinant of patient acuity and staff skill mix in small rural hospitals, and therefore of the scope of rural nursing practice.

Relevance: 20.00%

Abstract:

This paper addresses one of the foundational components of beginning inference, namely variation, with five classes of Year 4 students undertaking a measurement activity using scaled instruments in two contexts: all students measuring one person's arm span and recording the values obtained, and each student having his or her own arm span measured and recorded. The results included documentation of students' explicit appreciation of the variety of ways in which variation can occur, including outliers, and their ability to create and describe valid representations of their data.

Relevance: 20.00%

Abstract:

Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS project incorporates the building's energy consumption and generation data into an interactive simulation that is both engaging to users and highly informative, and that invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and a negative impact on ecosystems, with both local and global consequences.

The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualisation with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that share an interest in sustainability. Having surveyed a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Similarly, Froehlich and his colleagues (2010) use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved.

The ECOS project presents a simplified interface to the very complex domain of thermodynamic and climate modelling. From a mathematical perspective, the simulation can be divided into two models which interact and compete for balance: the comfort of ECOS' virtual denizens, and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort, and provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (set by the user) allows the energy required to heat or cool the building to be estimated. Once the energy requirement is ascertained, it is balanced against the building's ability to produce enough power from green energy sources (solar, wind and gas) to cover that requirement. The relative amount of energy produced by wind and solar can be calculated by, in the case of solar for example, considering the size of the panel and the amount of solar radiation it receives at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The variables that can be changed are the budgets allocated to green energy sources such as the solar panels and wind generator, and the air-conditioning setting that controls the internal building temperature. These variables influence the energy input and output variables, which are modelled on the real energy usage statistics drawn from the SEC data provided by the building managers.
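As a rough illustration of the balance described above, the Python sketch below estimates heating/cooling demand from the gap between outdoor temperature and the thermostat setpoint and weighs it against assumed solar and wind generation. The function names, coefficients and example inputs are purely illustrative assumptions; this is not the ECOS implementation or its real data feed.

```python
# Minimal sketch of the two-sided balance described above: HVAC demand driven by
# the gap between outdoor temperature and the user-set thermostat, weighed against
# green generation from solar and wind. All coefficients are illustrative only.

def hvac_demand_kw(outdoor_c: float, setpoint_c: float,
                   kw_per_degree: float = 2.5) -> float:
    """Energy needed to heat or cool toward the setpoint (illustrative linear model)."""
    return abs(outdoor_c - setpoint_c) * kw_per_degree

def solar_output_kw(panel_area_m2: float, irradiance_w_m2: float,
                    efficiency: float = 0.18) -> float:
    """Solar generation from panel area and current irradiance."""
    return panel_area_m2 * irradiance_w_m2 * efficiency / 1000.0

def wind_output_kw(wind_speed_ms: float, rated_kw: float = 10.0,
                   rated_speed_ms: float = 12.0) -> float:
    """Very rough cubic power curve capped at the turbine's rated output."""
    return min(rated_kw, rated_kw * (wind_speed_ms / rated_speed_ms) ** 3)

def energy_balance_kw(outdoor_c, setpoint_c, panel_area_m2,
                      irradiance_w_m2, wind_speed_ms):
    """Positive result: generation covers demand; negative: the building draws grid power."""
    demand = hvac_demand_kw(outdoor_c, setpoint_c)
    supply = solar_output_kw(panel_area_m2, irradiance_w_m2) + wind_output_kw(wind_speed_ms)
    return supply - demand

# Example: a warm afternoon with the thermostat set to 22 °C.
print(energy_balance_kw(outdoor_c=31.0, setpoint_c=22.0,
                        panel_area_m2=120.0, irradiance_w_m2=650.0,
                        wind_speed_ms=6.0))
```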

Relevance: 20.00%

Abstract:

This research was a qualitative study that explored the experience of health information literacy. It used a research approach that emphasised identifying and describing variation in experience to investigate people's experience of using information to learn about health, and what they experienced as information for learning about health. The study's findings identified seven categories representing qualitatively different ways in which people experience health information literacy, and provide new knowledge about people's engagement with health information for learning in everyday life. The study contributes to consumer health information research and is significant to the disciplines of health and information science.

Relevance: 20.00%

Abstract:

The central thesis of the article is that the venture creation process differs for innovative versus imitative ventures. This holds up: the pace of the process differs by type of venture, as do, in line with theory-based hypotheses, the effects of certain human capital (HC) and social capital (SC) predictors. Importantly, and somewhat unexpectedly, the theoretically derived models using HC, SC and certain controls are relatively successful at explaining progress in the creation process for the minority of innovative ventures, but achieve very limited success for the imitative majority. This may be due to a rationalistic bias in conventional theorizing and suggests that considerable theoretical development is needed regarding the important phenomenon of new venture creation processes. Another important result is that the building up of instrumental social capital, which we assess comprehensively and as a time-variant construct, is important for making progress with both types of ventures, and increasingly so as the process progresses. With stronger operationalization and a more appropriate analysis method, this result corroborates what previously published research has only been able to hint at.

Relevance: 20.00%

Abstract:

This chapter describes an innovative method of curriculum design that combines phenomenographic research, and the associated variation theory of learning, with the notion of disciplinary threshold concepts to focus specialised design attention on the most significant and difficult parts of the curriculum. The method involves three primary stages: (i) identification of disciplinary concepts worthy of intensive curriculum design attention, using the criteria for threshold concepts; (ii) action research into variation in students' understandings and misunderstandings of those concepts, using phenomenography as the research approach; (iii) design of learning activities to address the poorer understandings identified in the second stage, using variation theory as a guiding framework. The curriculum design method is inherently theory- and evidence-based. It was developed and trialled during a two-year project funded by the Australian Learning and Teaching Council, using the physics and law disciplines as case studies. Disciplinary teachers' perceptions of the impact of the method on their teaching and understanding of student learning were profound. Attempts to measure the impact on student learning were less conclusive, partly because teachers often unintentionally deviated from the design when putting it into practice for the first time. Suggestions for improved implementation of the method are discussed.

Relevance: 20.00%

Abstract:

Discretization of a geographical region is quite common in spatial analysis, yet there have been few studies of the impact of different geographical scales on the outcomes of spatial models for different spatial patterns. This study aims to investigate the impact of spatial scale and spatial smoothing on the outcomes of modelling spatial point-based data. Given a spatial point-based dataset (such as occurrences of a disease), we study the geographical variation in residual disease risk using regular grid cells. Individual disease risk is modelled using a logistic model with the inclusion of spatially unstructured and/or spatially structured random effects. Three spatial smoothness priors for the spatially structured component are employed in modelling, namely an intrinsic Gaussian Markov random field, a second-order random walk on a lattice, and a Gaussian field with a Matérn correlation function. We investigate how changes in grid cell size affect model outcomes under different spatial structures and different smoothness priors for the spatial component. A realistic example (the Humberside data) is analyzed and a simulation study is described. Bayesian computation is carried out using integrated nested Laplace approximation (INLA). The results suggest that the performance and predictive capacity of the spatial models improve as the grid cell size decreases for certain spatial structures. It also appears that different spatial smoothness priors should be applied for different patterns of point data.
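To make the discretization step concrete, the minimal Python sketch below bins case/control points into regular grid cells so that the effect of changing the cell size on the spatial units can be seen. The synthetic point data and cell sizes are assumptions for illustration; this is neither the Humberside data nor the authors' Bayesian models, only the aggregation step whose resolution the study varies.

```python
# Sketch of the discretization step the study varies: point-based case/control
# data are binned into regular grid cells of a chosen size. Changing `cell_size`
# changes the spatial units on which a logistic model with structured and
# unstructured random effects would then be fitted (the model itself is not shown).
import numpy as np

rng = np.random.default_rng(0)
n = 500
xy = rng.uniform(0, 100, size=(n, 2))      # synthetic point locations (arbitrary km units)
case = rng.binomial(1, p=0.2, size=n)      # 1 = case, 0 = control

def grid_counts(xy, case, cell_size):
    """Return per-cell (case count, total count) for a regular grid of given cell size."""
    ix = (xy[:, 0] // cell_size).astype(int)
    iy = (xy[:, 1] // cell_size).astype(int)
    cells = {}
    for i, j, c in zip(ix, iy, case):
        cases, total = cells.get((i, j), (0, 0))
        cells[(i, j)] = (cases + c, total + 1)
    return cells

for cell_size in (25.0, 10.0, 5.0):        # coarser to finer grids
    cells = grid_counts(xy, case, cell_size)
    print(f"cell size {cell_size:>4} km: {len(cells)} occupied cells")
```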

Relevance: 20.00%

Abstract:

Background: The size of the carrier influences the aerosolization of drug from a dry powder inhaler (DPI) formulation. Currently, lactose monohydrate particles in a variety of sizes are the preferred carriers in carrier-based DPI formulations of various drugs; however, contradictory reports exist regarding the effect of carrier size on the dispersion of drug. In this study we examined the influence of the intrinsic particle size of a polymeric carrier on the aerosolization of a model drug, salbutamol sulphate (SS).

Methods: Four different sizes (20–150 µm) of polymer carriers were fabricated using a solvent evaporation technique, and the dispersion of SS particles from these carriers was measured with a Twin Stage Impinger (TSI). The size and morphological properties of the polymer carriers were characterised by laser diffraction and SEM, respectively.

Results: The fine particle fraction (FPF) from these carriers increased from 5.6% to 21.3% with increasing carrier size, and was greatest (21%) for the largest carrier size (150 µm).

Conclusions: The aerosolization of drug was dependent on the size of the polymer carriers. Smaller carriers resulted in a lower FPF, which increased with increasing carrier size. For a fixed mass of drug particles in a formulation, the mass of drug per unit area of carrier is higher in formulations containing the larger carriers, which leads to an increase in the dispersion of drug due to the increased mechanical forces occurring between the carriers and the device walls.
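The surface-loading argument in the conclusion can be checked with a back-of-envelope calculation: assuming monodisperse spherical carriers and illustrative masses and density (not values from the study), the drug mass per unit carrier surface area rises with carrier diameter, because larger particles provide less total surface area for the same carrier mass.

```python
# Back-of-envelope check of the surface-loading argument: for a fixed drug mass
# and fixed carrier mass, larger (spherical) carriers offer less total surface
# area, so drug mass per unit carrier area rises with carrier size. Densities and
# masses are illustrative assumptions, not values from the study.
import math

def drug_load_per_area(carrier_diameter_um, carrier_mass_g=1.0, drug_mass_mg=10.0,
                       carrier_density_g_cm3=1.2):
    """Drug mass (µg) per cm² of carrier surface, assuming monodisperse spheres."""
    d_cm = carrier_diameter_um * 1e-4
    volume_per_particle = math.pi * d_cm**3 / 6
    n_particles = carrier_mass_g / (carrier_density_g_cm3 * volume_per_particle)
    total_area_cm2 = n_particles * math.pi * d_cm**2
    return drug_mass_mg * 1000.0 / total_area_cm2   # µg per cm²

for d in (20, 50, 100, 150):                         # carrier sizes from the study, in µm
    print(f"{d:>3} µm carrier: {drug_load_per_area(d):6.2f} µg/cm² of drug")
```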

Relevance: 20.00%

Abstract:

Floods are among the most devastating events affecting primarily tropical, archipelagic countries such as the Philippines. With current climate change predictions including rising sea levels, intensification of typhoon strength and a general increase in mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased flood risk to the country does not translate into greater economic and human losses. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ), in cooperation with the Local Government Unit (LGU) of Ormoc City, Leyte, the Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems.

The project was developed using a 1D-2D coupled model in SOBEK (the Deltares hydrological modelling software package) and was also used as a case study to analyze and understand the influence of factors such as land use, schematization, time step size and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, including Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data. Different methods were used in the attempt to partially calibrate and validate the model, and finally to simulate and study two climate change scenarios based on scenario A1B predictions.

It was observed that large areas currently considered not prone to floods will become low flood-risk areas (0.1–1 m water depth). Furthermore, larger sections of the floodplains upstream of the Lilo-an's Bridge will become moderate flood-risk areas (1–2 m water depth). The flood hazard maps created during the project will be presented to the LGU, and the model will be used by GTZ's Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and monitoring of the basin section belonging to Ormoc City. Recommendations on further enhancement of the geo-hydro-meteorological data to improve the model's accuracy, mainly in areas of interest, will also be presented to the LGU.
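A minimal Python sketch of the hazard banding implied by the depth ranges quoted above (0.1–1 m low, 1–2 m moderate). The ">2 m high" band and the example depths are assumptions for illustration only and are not outputs of the SOBEK model.

```python
# Small sketch mapping simulated water depths to the hazard bands quoted in the
# abstract; the ">2 m" band and the example depths are assumed for illustration.
def hazard_class(depth_m: float) -> str:
    """Map a simulated maximum water depth to a hazard band."""
    if depth_m < 0.1:
        return "not flood prone"
    if depth_m < 1.0:
        return "low (0.1-1 m)"
    if depth_m < 2.0:
        return "moderate (1-2 m)"
    return "high (>2 m)"   # assumed upper band

# Example grid-cell depths (m) from a hypothetical scenario run.
for depth in (0.05, 0.4, 1.3, 2.6):
    print(f"{depth:4.2f} m -> {hazard_class(depth)}")
```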

Relevance: 20.00%

Abstract:

We explored the impact of neighborhood walkability on the walking of young adults, early-middle adults, middle-aged adults, and older adults across different neighborhood buffers. Participants completed the Western Australian Health and Wellbeing Surveillance System Survey (2003–2009) and were allocated a neighborhood walkability score at 200 m, 400 m, 800 m, and 1600 m around their home. We found little difference in the strength of associations across neighborhood buffer sizes for all life stages. We conclude that neighborhood walkability supports more walking regardless of adult life stage and is relevant for both small (e.g., 200 m) and larger (e.g., 1600 m) neighborhood buffers.

Relevance: 20.00%

Abstract:

This study investigates the variability in response of optically stimulated luminescence dosimeters (OSLDs). Examining the source of sensitivity variations in these dosimeters allows a more comprehensive understanding of the Landauer nanoDot and its potential for current and future applications. In this work, OSLDs were scanned with a MicroCT scanner to determine potential sources of the variation in relative sensitivity across a selection of Landauer nanoDot dosimeters. Specifically, the correlation between a dosimeter's relative sensitivity and the loading density of Al2O3:C powder was determined. When the sensitive volume's radiodensity was extrapolated from the CT data, a non-uniform distribution in crystal growth was found. It was calculated that a 0.05% change in the nominal volume of the chip produces a 1% change in the overall response. Additionally, the 'true' volume of an OSLD's sensitive material is, on average, 18% less than that reported in the literature, mainly due to the presence of air cavities in the material's structure. This work demonstrated that the amount of sensitive material is approximately linked to the total correction factor.
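The quoted figure (a 0.05% volume change producing a 1% response change) implies a simple linear scaling between sensitive-volume deviation and response, sketched below. The nominal and measured volumes in the example are hypothetical; in the study the volumes were derived from MicroCT radiodensity rather than assumed.

```python
# Sketch of the linear scaling quoted above: a 0.05 % change in sensitive volume
# corresponds to a 1 % change in response, i.e. roughly a factor of 20.
# The example volumes are hypothetical assumptions.
RESPONSE_PER_VOLUME = 1.0 / 0.05   # % response change per % volume change (from the abstract)

def expected_response_shift(nominal_volume_mm3: float, measured_volume_mm3: float) -> float:
    """Predicted % change in OSLD response for a measured deviation from nominal volume."""
    volume_change_pct = 100.0 * (measured_volume_mm3 - nominal_volume_mm3) / nominal_volume_mm3
    return volume_change_pct * RESPONSE_PER_VOLUME

# Hypothetical nanoDot whose sensitive volume is 0.1 % below nominal:
print(expected_response_shift(nominal_volume_mm3=1.0, measured_volume_mm3=0.999))  # ≈ -2 %
```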

Relevance: 20.00%

Abstract:

Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented.

Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position has on OPFs, and setting an acceptable OPF uncertainty of 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom and source occlusion; the dominant effect was established and formed the basis of the theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for square field sizes of side length 4 mm to 100 mm, at a nominal photon energy of 6 MV.

Results: According to the practical definition established in this project, field sizes < 15 mm were considered very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or the field size uncertainty was 0.5 mm, field sizes < 12 mm were considered very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. The theoretical definition of very small field size therefore coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect; this occurred at field sizes < 12 mm. Source occlusion also caused a large change in OPF for field sizes < 8 mm. Based on these results, field sizes < 12 mm were considered theoretically very small for 6 MV beams.

Conclusions: Extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as the output factor measurement for each field size setting, and very precise detector alignment, is required at field sizes at least < 12 mm and, more conservatively, < 15 mm for 6 MV beams. These recommendations apply in addition to all the usual considerations for small field dosimetry, including careful detector selection.
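The practical criterion can be sketched as follows: estimate the OPF change produced by a 1 mm field-size error from the local slope of an OPF-versus-field-size curve, and flag field sizes where that change exceeds the 1% tolerance. The OPF values in the sketch are illustrative placeholders, not the study's Monte Carlo results.

```python
# Sketch of the practical "very small field" criterion: how much does a 1 mm
# field-size error change the output factor (OPF)? The OPF table below is an
# illustrative placeholder for a 6 MV beam, not the study's data.
import numpy as np

field_mm = np.array([4, 6, 8, 10, 12, 15, 20, 30, 50, 100], dtype=float)
opf      = np.array([0.65, 0.76, 0.83, 0.87, 0.90, 0.925, 0.94, 0.96, 0.98, 1.00])

def opf_uncertainty(field_size_mm: float, field_error_mm: float = 1.0) -> float:
    """Relative OPF change (%) caused by a field-size error, from the local gradient."""
    slope = np.interp(field_size_mm, field_mm, np.gradient(opf, field_mm))
    central = np.interp(field_size_mm, field_mm, opf)
    return 100.0 * abs(slope) * field_error_mm / central

for f in (6, 10, 12, 15, 20):
    u = opf_uncertainty(f)
    flag = "very small" if u > 1.0 else "ok"
    print(f"{f:>3} mm field: {u:4.1f}% OPF change per 1 mm -> {flag}")
```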

Relevance: 20.00%

Abstract:

Stereotactic radiosurgery treatments involve the delivery of very high doses in a small number of fractions. To date, there are limited data on skin dose for the very small field sizes used in these treatments. In this work, we determine relative surface doses for the small circular collimators used in stereotactic radiosurgery treatments. Monte Carlo calculations were performed using the BEAMnrc code with a model of the Novalis 15 Trilogy linear accelerator and the BrainLab circular collimators. The surface doses were calculated at the ICRU skin dose depth of 70 µm using the 6 MV SRS x-ray beam. The calculated surface doses varied between 15% and 12%, decreasing as the field size increased from 4 to 30 mm. In comparison, surface doses were measured using Gafchromic EBT3 film positioned at the surface of a Virtual Water phantom. The absolute agreement between calculated and measured surface doses was better than 2.5%, which is well within the 2σ uncertainties of the Monte Carlo calculations and the film measurements. Based on these results, we have shown that Gafchromic EBT3 film is suitable for surface dose estimates in the very small fields used in SRS.

Relevance: 20.00%

Abstract:

This project develops and evaluates a model of curriculum design that aims to assist student learning of foundational disciplinary ‘Threshold Concepts’. The project uses phenomenographic action research, cross-institutional peer collaboration and the Variation Theory of Learning to develop and trial the model. Two contrasting disciplines (Physics and Law) and four institutions (two research-intensive and two universities of technology) were involved in the project, to ensure broad applicability of the model across different disciplines and contexts. The Threshold Concepts selected for curriculum design attention were measurement uncertainty in Physics and legal reasoning in Law. Threshold Concepts are key disciplinary concepts that are inherently troublesome, transformative and integrative in nature. Once understood, such concepts transform students’ views of the discipline because they enable students to coherently integrate what were previously seen as unrelated aspects of the subject, providing new ways of thinking about it (Meyer & Land 2003, 2005, 2006; Land et al. 2008). However, the integrative and transformative nature of such threshold concepts makes them inherently difficult for students to learn, with resulting misunderstandings of concepts being prevalent...

Relevance: 20.00%

Abstract:

Mycobacterium kansasii is a pulmonary pathogen that is readily grown from municipal water but has rarely been isolated from natural waters. A definitive link between water exposure and disease has not been demonstrated, and the environmental niche of this organism is poorly understood. Strain typing of clinical isolates has revealed seven subtypes, with Type 1 being highly clonal and responsible for most infections worldwide; the prevalence of the other subtypes varies geographically. In this study, 49 water isolates are compared with 72 patient isolates from the same geographical area (Brisbane, Australia) using automated repetitive unit PCR (DiversiLab) and ITS RFLP. The clonality of the dominant clinical strain type is again demonstrated, but with rep-PCR, strain variation within this group is evident, comparable with other reported methods. There is significant heterogeneity among the water isolates, and very few are similar or related to the clinical isolates. This suggests that if water or aerosol transmission is the mode of infection, then point-source contamination likely occurs from an alternative environmental source.