567 results for Number field


Relevance: 20.00%

Abstract:

The basic reproduction number of a pathogen, R0, determines whether a pathogen introduced into a fully susceptible population will spread (R0 > 1) or fade out (R0 < 1) because infected hosts do not, on average, replace themselves. In this paper we develop a simple mechanistic model of the basic reproduction number for a group of tick-borne pathogens that wholly, or almost wholly, depend on horizontal transmission to and from vertebrate hosts. This group includes the causative agent of Lyme disease, Borrelia burgdorferi, and the causative agent of human babesiosis, Babesia microti, for which transmission between co-feeding ticks and vertical transmission from adult female ticks are both negligible. The model has only 19 parameters, all of which have a clear biological interpretation and can be estimated from laboratory or field data. The model takes into account the transmission efficiency from the vertebrate host as a function of the days since infection, in part because of the potential for this dynamic to interact with tick phenology, which is also included in the model. This sets the model apart from previous, similar models of R0 for tick-borne pathogens. We then define ranges for the 19 parameters using estimates from the literature, as well as laboratory and field data, and perform a global sensitivity analysis of the model. This enables us to rank the parameters by their contribution to the observed variation in R0. We conclude that the transmission efficiency from the vertebrate host to Ixodes scapularis ticks, the survival rate of Ixodes scapularis from fed larva to feeding nymph, and the fraction of nymphs finding a competent host are the most influential factors for R0. This contrasts with other vector-borne pathogens, for which it is usually the abundance of the vector or host, or the vector-to-host ratio, that determines the conditions for emergence. These results are a step towards a better understanding of the geographical expansion of currently emerging horizontally transmitted tick-borne pathogens such as Babesia microti, and they provide a firmer scientific basis for the targeted use of acaricides or the application of wildlife vaccines currently in development.
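As a rough illustration of how such a sampling-based global sensitivity analysis can rank parameters, the sketch below draws random parameter sets for a toy R0 expression and ranks each input by its rank correlation with the simulated R0. It is not the paper's 19-parameter model: the parameter names, ranges and next-generation form here are hypothetical.

```python
# Toy sensitivity analysis for a hypothetical horizontally transmitted
# tick-borne pathogen (illustrative only, not the authors' model).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical parameters and plausible ranges:
# p_hv - transmission efficiency, infected vertebrate host -> larval tick
# p_vh - transmission efficiency, infected nymph -> vertebrate host
# s_ln - survival from fed larva to feeding nymph
# f_n  - fraction of nymphs feeding on a competent host
ranges = {"p_hv": (0.1, 0.9), "p_vh": (0.1, 0.9),
          "s_ln": (0.05, 0.5), "f_n": (0.1, 0.8)}

def r0(p_hv, p_vh, s_ln, f_n, larvae_per_host=100.0):
    # Toy next-generation form: host -> larvae -> surviving nymphs -> new hosts.
    return larvae_per_host * p_hv * s_ln * f_n * p_vh

n = 10_000
samples = {name: rng.uniform(lo, hi, n) for name, (lo, hi) in ranges.items()}
r0_vals = r0(**samples)

print(f"fraction of parameter draws with R0 > 1: {np.mean(r0_vals > 1):.2f}")
for name, values in samples.items():
    rho, _ = spearmanr(values, r0_vals)
    print(f"{name}: Spearman rank correlation with R0 = {rho:+.2f}")
```

In a full analysis, partial rank correlation coefficients or variance-based (Sobol) indices would typically replace the simple Spearman ranking used in this sketch.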

Relevance: 20.00%

Abstract:

The interaction of gold particles with few-layer graphene is of interest for the development of numerous optical nanodevices. The results of numerical studies of the coupling of gold nanoparticles with few-layer vertical graphene sheets are presented. The field strengths are computed and the optimum nanoparticle configurations for the formation of SERS hotspots are obtained. The nanoparticles are modeled as 8 nm diameter spheres atop a 1.5 nm (5-layer) graphene sheet. The vertical orientation is of particular interest because both sides of the graphene structure can be used, potentially doubling the number of particles in the system. Our results show that adding an opposing particle yields a much stronger signal, and that the particle separation can be controlled by the number of atomic carbon layers. These results provide further insight and contribute to the development of next-generation plasmonic devices based on nanostructures with hybrid dimensionality.

Relevance: 20.00%

Abstract:

Measurement of the moisture variation in soils is required for geotechnical design and research because soil properties and behavior can vary as moisture content changes. The neutron probe, developed more than 40 years ago, is commonly used to monitor soil moisture variation in the field. This study reports full-scale field monitoring of soil moisture using a neutron moisture probe for a period of more than 2 years in the Melbourne (Australia) region. On the basis of the soil types present in the Melbourne region, 23 sites were chosen for moisture monitoring down to a depth of 1500 mm. The field calibration method was used to develop correlations relating volumetric moisture content to neutron counts. The observed results showed that the deepest "wetting front" during the wet season was limited to the top 800 to 1000 mm of soil, whilst the top soil layer down to about 550 mm responded almost immediately to rainfall events. At greater depths (550 to 800 mm and below 800 mm), the moisture variations were relatively small and displayed predominantly periodic fluctuations. This periodic behaviour was captured with Fourier analysis to develop a cyclic moisture model based on an analytical solution of the one-dimensional moisture flow equation for homogeneous soils. It is argued that the model developed can be used to predict soil moisture variations relevant to buried structures such as pipes.
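A minimal sketch of the kind of Fourier-based cyclic model described above, using synthetic readings: the dominant annual harmonic of a surface moisture record is extracted with an FFT and then damped and phase-lagged with depth, following the classical damped-harmonic solution of a one-dimensional diffusion equation. The 900 mm damping depth and all data are illustrative assumptions, not the paper's calibrated values.

```python
# Fit an annual harmonic to a synthetic surface moisture record and
# propagate it to depth with an assumed damping depth.
import numpy as np

days = np.arange(0, 2 * 365)                        # two years of daily readings
true = 0.30 + 0.05 * np.cos(2 * np.pi * days / 365 - 1.0)
obs = true + 0.01 * np.random.default_rng(1).normal(size=days.size)

# Extract the annual component (1 cycle per 365 days) from the surface record.
coeffs = np.fft.rfft(obs - obs.mean())
freqs = np.fft.rfftfreq(days.size, d=1.0)           # cycles per day
k = int(np.argmin(np.abs(freqs - 1.0 / 365.0)))
amp = 2.0 * np.abs(coeffs[k]) / days.size
phase = float(np.angle(coeffs[k]))

def moisture(z_mm, t_days, damping_depth_mm=900.0):
    """Cyclic model: surface harmonic damped and phase-lagged with depth
    (the damping depth is an assumed value, not a calibrated one)."""
    w = 2.0 * np.pi / 365.0
    decay = np.exp(-z_mm / damping_depth_mm)
    return obs.mean() + amp * decay * np.cos(w * t_days + phase - z_mm / damping_depth_mm)

print(f"fitted surface amplitude: {amp:.3f}, phase: {phase:+.2f} rad")
print(f"predicted moisture at 800 mm depth on day 100: {moisture(800.0, 100.0):.3f}")
```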

Relevance: 20.00%

Abstract:

With increasing signs of climate change and the influence of national and international carbon-related laws and agreements, governments around the world are grappling with how to rapidly transition to low-carbon living. This includes adapting to the impacts of climate change that are very likely to be experienced at current emission levels (including extreme weather and sea level changes), and mitigating further growth in greenhouse gas emissions that would lead to further impacts. Internationally, the concept of ‘Biophilic Urbanism’, a term coined by Professors Tim Beatley and Peter Newman to refer to the use of natural elements as design features in urban landscapes, is emerging as a key component in addressing such climate change challenges in rapidly growing urban contexts. However, the economics of incorporating such options is not well understood and requires further attention to underpin the mainstreaming of biophilic urbanism. Indeed, there appears to be an ad hoc, reactive approach to creating economic arguments for or against the design, installation or maintenance of natural elements such as green walls, green roofs, streetscapes, and parklands. With this issue in mind, this paper overviews research, undertaken as part of an industry collaborative research project, that considers the potential for using a number of environmental economic valuation techniques, which have evolved over the last several decades in agricultural and resource economics, to systematically assess the economic value of biophilic elements in the urban context. Drawing on the existing literature on environmental economic valuation techniques, the paper highlights opportunities for creating a standardised language for valuing biophilic elements. The conclusions have implications for extending the field of environmental economic valuation to support the economic evaluation and planning of the greater use of natural elements in cities. Insights are also noted for the more mature fields of agricultural and resource economics.

Relevance: 20.00%

Abstract:

A pulsed wall jet has been used to simulate the gust front of a thunderstorm downburst. Flow visualization, wind speed and surface pressure measurements were obtained. The characteristics of the hypothesized ring vortex of a full-scale downburst were reproduced at a scale estimated to be 1:3000.

Relevance: 20.00%

Abstract:

Literature from around the world clearly suggests that engineering education has been relatively slow to incorporate significant knowledge and skill areas, including the rapidly emerging area of sustainable development. Within this context, this paper presents the findings of research that asked how engineering educators could consistently implement systematic and intentional curriculum renewal that is responsive to emerging engineering challenges and opportunities. The paper presents a number of elements of systematic and intentional curriculum renewal that have been empirically distilled from a qualitative, multiple-method, iterative research approach including literature review, narrative enquiry, pilot trials and peer-review workshops undertaken by the authors with engineering educators from around the world. The paper also presents new knowledge arising from the research, in the form of a model that provides a dynamic and deliberative mechanism for strategically accelerating curriculum renewal efforts. Specifically, the paper discusses the implications of this model for achieving education for sustainable development across all disciplines of engineering. It concludes with broader research and practice implications for the field of education research.

Relevance: 20.00%

Abstract:

High-power, high-frequency pulsed electric fields, known as pulsed power (PP), have recently been applied in biology and medicine. However, little attention has been paid to the application of pulsed power to the musculoskeletal system and its possible effects on the functional behavior and biomechanical properties of bone tissue. This paper presents the first study investigating whether PP can be applied safely to bone tissue as a stimulus, and what effect these signals have on the characteristics of cortical bone, by comparing the mechanical properties of the bone before and after exposure to PP and against control samples. A positive buck-boost converter was used to generate adjustable high-voltage, high-frequency pulses (up to 500 V and 10 kHz). The functional behavior of bone in response to pulsed power excitation was elucidated by applying compressive loading until failure. The stiffness, failure stress (strength) and total fracture energy (bone toughness) were determined as measures of the main bone characteristics. Furthermore, an ultrasonic technique was used to determine and compare bone elasticity before and after pulsed power stimulation. The elastic properties of the cortical bone samples appeared to remain unchanged following exposure to pulsed power excitation in all three orthogonal directions, as obtained from the ultrasonic technique and, similarly, from the compression test. Nevertheless, the compressive strength and toughness of bone samples exposed to 66 h of high-power pulsed electromagnetic field were increased compared to the control samples. Because the toughness and strength of cortical bone tissue are directly associated with the quality and integrity of the collagen matrix, whereas its stiffness is primarily related to the bone mineral content, these results suggest that although pulsed power stimulation can influence the arrangement or quality of the collagen network, causing the increase in bone strength and toughness, it apparently did not affect the mineral phase of the cortical bone material. The results also confirmed that the indirect application of a high-power pulsed electric field at 500 V and 10 kHz through a capacitive coupling method was safe and did not damage the bone tissue structure.
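For illustration only (not the authors' analysis code, and with invented stress-strain data), the sketch below shows how the three reported properties are typically derived from a compression test: stiffness as the slope of the initial linear region, strength as the peak stress, and toughness as the area under the curve up to failure.

```python
# Derive stiffness, strength and toughness from a synthetic compression test.
import numpy as np

# Synthetic compressive stress-strain data (strain [-], stress [MPa]).
strain = np.linspace(0.0, 0.02, 200)
stress = 20e3 * strain - 700e3 * strain**2          # toy curve: rises, peaks, softens

elastic = strain <= 0.005                            # assumed initial linear region
stiffness = np.polyfit(strain[elastic], stress[elastic], 1)[0]   # slope in MPa
i_fail = int(np.argmax(stress))                      # failure taken at peak stress
failure_stress = stress[i_fail]                      # strength, MPa
# Toughness = area under the curve up to failure (trapezoidal rule), MPa = MJ/m^3.
toughness = float(np.sum(0.5 * (stress[1:i_fail + 1] + stress[:i_fail])
                         * np.diff(strain[:i_fail + 1])))

print(f"stiffness (elastic modulus): {stiffness / 1e3:.1f} GPa")
print(f"failure stress (strength):   {failure_stress:.0f} MPa")
print(f"toughness:                   {toughness:.2f} MJ/m^3")
```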

Relevance: 20.00%

Abstract:

Analogy plays a central role in legal reasoning, yet how to analogize is poorly taught and poorly practiced. We all recognize when legal analogies are being made: when a law professor poses a difficult hypothetical in class and a student tentatively guesses at the answer based on the cases she read the night before, when an attorney advises a client to settle because a previous case goes against him, or when a judge adopts one precedent over another on the basis that it better fits the present case. However, when it comes to explaining why certain analogies are compelling, persuasive, or better than the alternatives, lawyers usually draw a blank. The purpose of this article is to provide a simple model that can be used to teach and to learn how analogy actually works, and what makes one analogy superior to a competing analogy. The model is drawn from a number of theories of analogy-making in cognitive science. Cognitive science is the “long-term enterprise to understand the mind scientifically.” The field studies the mechanisms involved in cognitive processes such as thinking, memory, learning, and recall, and one of its main foci has been how people construct analogies. The lessons from cognitive science theories of analogy can be applied to legal analogies to give students and lawyers a better understanding of this fundamental process in legal reasoning.

Relevance: 20.00%

Abstract:

This project was conducted at Lithgow Correctional Centre (LCC), NSW, Australia. Air quality field measurements were conducted on two occasions (23-27 May 2012 and 3-8 December 2012), just before and six months after the introduction of smoke-free building policies at the LCC (28 May 2012), respectively. The main aims of this project were to: (1) investigate the indoor air quality; (2) quantify the level of exposure to environmental tobacco smoke (ETS); (3) identify the main indoor particle sources; (4) distinguish PM2.5 and particle number concentrations arising from ETS from those arising from other sources; and (5) provide recommendations for improving indoor air quality and/or minimising exposure at the LCC. The measurements were conducted in Unit 5.2A, Unit 5.2B, Unit 1.1 and Unit 3.1, together with personal exposure measurements, and covered the following parameters:

- Indoor and outdoor particle number (PN) concentration in the size range 0.005-3 µm
- Indoor and outdoor PM2.5 particle mass concentration
- Indoor and outdoor VOC concentrations
- Personal particle number exposure levels (in the size range 0.01-0.3 µm)
- Indoor and outdoor CO and CO2 concentrations, temperature and relative humidity

In order to enhance the outcomes of this project, the indoor and outdoor particle number (PN) concentrations were also measured by two additional instruments (CPC 3787) which were not listed in the original proposal.

Relevance: 20.00%

Abstract:

Ethnographic methods have been widely used for requirements elicitation in systems design, especially when the focus is on understanding users' social, cultural and political contexts. Designing an online search engine for peer-reviewed papers can be a challenge given the diversity of its end users, who come from different educational and professional disciplines. This poster describes our exploration of academic research environments based on different in situ methods such as contextual interviews, diary-keeping and job-shadowing. The data generated from these methods are analysed using qualitative data analysis software and subsequently used to develop personas that can serve as a requirements specification tool.

Relevance: 20.00%

Abstract:

Introduction: Since 1992 there have been several articles published on research into plastic scintillators for use in radiotherapy. Plastic scintillators are said to be tissue equivalent, temperature independent and dose-rate independent [1]. Although their properties were found to be promising for measurements in megavoltage X-ray beams, there were some technical difficulties with regard to their commercialisation. Standard Imaging has produced the first commercial system, which is now available for use in a clinical setting. The Exradin W1 scintillator device uses a dual-fibre system in which one fibre is connected to the plastic scintillator and the other fibre measures only Cerenkov radiation [2]. This paper presents results obtained during commissioning of this dosimeter system.

Methods: All tests were performed on a Novalis Tx linear accelerator equipped with a 6 MV SRS photon beam and conventional 6 and 18 MV X-ray beams. The following measurements were performed in a Virtual Water phantom at the depth of dose maximum. Linearity: the dose delivered was varied between 0.2 and 3.0 Gy for the same field conditions. Dose rate dependence: the repetition rate of the linac was varied between 100 and 1,000 MU/min, and a nominal dose of 1.0 Gy was delivered at each rate. Reproducibility: a total of five irradiations for the same setup.

Results: The W1 detector gave a highly linear relationship between dose and the number of monitor units delivered for a 10 × 10 cm2 field size at an SSD of 100 cm. The linearity was within 1 % at the high-dose end and about 2 % at the very low-dose end. For the dose rate dependence, the dose measured as a function of repetition rate (100–1,000 MU/min) showed a maximum deviation of 0.9 %. The reproducibility was found to be better than 0.5 %.

Discussion and conclusions: The results for this system, a new dosimetry system available for clinical use, look promising so far. However, further investigation is needed to produce a full characterisation prior to its use in megavoltage X-ray beams.
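A minimal sketch, with hypothetical readings, of how the linearity and dose-rate checks described above can be quantified: a linear fit of reading versus delivered dose, and the spread of readings for a fixed nominal dose across repetition rates.

```python
# Quantify linearity and dose-rate dependence from made-up detector readings.
import numpy as np

# Linearity: delivered dose (Gy) vs. detector reading (arbitrary units).
dose = np.array([0.2, 0.5, 1.0, 2.0, 3.0])
reading = np.array([0.204, 0.503, 1.000, 1.996, 2.992])        # hypothetical

slope, intercept = np.polyfit(dose, reading, 1)
deviation = 100.0 * (reading - (slope * dose + intercept)) / reading
print(f"max linearity deviation: {np.max(np.abs(deviation)):.1f} %")

# Dose-rate dependence: reading for a nominal 1 Gy at each repetition rate.
rates = np.array([100, 200, 400, 600, 1000])                    # MU/min
readings_1gy = np.array([1.002, 0.999, 1.000, 0.997, 0.995])    # hypothetical
spread = 100.0 * (readings_1gy.max() - readings_1gy.min()) / readings_1gy.mean()
print(f"dose-rate dependence (max spread): {spread:.1f} %")
```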

Relevance: 20.00%

Abstract:

Introduction: The dose to the skin surface is an important factor for many radiotherapy treatment techniques. It is known that TPS-predicted surface doses can be significantly different from actual ICRP skin doses as defined at 70 µm depth. A number of methods have been implemented for the accurate determination of surface dose, including the use of specific dosimeters such as TLDs and radiochromic film, as well as Monte Carlo calculations. Stereotactic radiosurgery involves delivering very high doses per treatment fraction using small X-ray fields. To date, there has been limited data on surface doses for these very small field sizes. The purpose of this work is to evaluate surface doses by both measurements and Monte Carlo calculations for very small field sizes.

Methods: All measurements were performed on a Novalis Tx linear accelerator which has a 6 MV SRS X-ray beam mode that uses a special thin flattening filter. Beam collimation was achieved by circular cones with apertures giving field sizes ranging from 4 to 30 mm at the isocentre. The relative surface doses were measured using Gafchromic EBT3 film, which has its active layer at a depth similar to the ICRP skin dose depth. Monte Carlo calculations were performed using the BEAMnrc/EGSnrc Monte Carlo codes (V4 r225). The specifications of the linear accelerator, including the collimator, were provided by the manufacturer. Optimisation of the incident X-ray beam was achieved by iterative adjustment of the energy, spatial distribution and radial spread of the incident electron beam striking the target. The energy cutoff parameters were PCUT = 0.01 MeV and ECUT = 0.700 MeV. Directional bremsstrahlung splitting was switched on for all BEAMnrc calculations. Relative surface doses were determined in a layer defined in a water phantom with the same thickness and depth as the active layer in the film.

Results: Measured surface doses using the EBT3 film varied between 13 and 16 % for the different cones, with an uncertainty of 3 %. Monte Carlo calculated surface doses agreed with the measured doses to better than 2 % for all the treatment cones.

Discussion and conclusions: This work has shown the consistency of surface dose measurements using EBT3 film with Monte Carlo predicted values, within the uncertainty of the measurements. As such, EBT3 film is recommended for in vivo surface dose measurements.
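As a simple illustration of the film-versus-Monte-Carlo comparison reported above (all numbers invented), the following sketch tabulates a relative surface dose per cone and checks the percentage difference against the 2 % agreement level.

```python
# Compare hypothetical film and Monte Carlo relative surface doses per cone.
import numpy as np

cones_mm = np.array([4, 7.5, 10, 15, 20, 30])
film_pct = np.array([16.0, 15.2, 14.8, 14.1, 13.6, 13.0])   # hypothetical, % of Dmax
mc_pct = np.array([15.8, 15.0, 14.9, 13.9, 13.5, 13.2])     # hypothetical

diff = film_pct - mc_pct
for c, f, m, d in zip(cones_mm, film_pct, mc_pct, diff):
    print(f"cone {c:>4} mm: film {f:4.1f} %  MC {m:4.1f} %  diff {d:+4.1f} %")
print(f"max |difference|: {np.max(np.abs(diff)):.1f} % (agreement level: 2 %)")
```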

Relevance: 20.00%

Abstract:

Introduction: The consistency of measuring small field output factors is greatly increased by reporting the measured dosimetric field size of each factor, as opposed to simply stating the nominal field size [1], and therefore requires the measurement of cross-axis profiles in a water tank. However, this makes output factor measurements time consuming. This project establishes the field size at which the accuracy of output factors ceases to be affected by the use of potentially inaccurate nominal field sizes, which we believe provides a practical working definition of a ‘small’ field. The physical components of the radiation beam that contribute to the rapid change in output factor at small field sizes are examined in detail. The physical interaction that dominates the rapid dose reduction is quantified, leading to a theoretical definition of a ‘small’ field.

Methods: Current recommendations suggest that radiation collimation systems and isocentre-defining lasers should both be calibrated to a maximum positioning uncertainty of 1 mm [2]. The proposed practical definition of a small field is as follows: if the output factor changes by more than ±1.0 % given a change in either field size or detector position of up to ±1 mm, then the field should be considered small. Monte Carlo modelling was used to simulate output factors of a 6 MV photon beam for square fields with side lengths from 4.0 to 20.0 mm in 1.0 mm increments. The dose was scored in a 0.5 mm wide and 2.0 mm deep cylindrical volume of water within a cubic water phantom, at a depth of 5 cm and an SSD of 95 cm. The maximum difference due to a collimator error of ±1 mm was found by comparing the output factors of adjacent field sizes. The output factor simulations were repeated 1 mm off-axis to quantify the effect of detector misalignment. Further simulations separated the total output factor into the collimator scatter factor and the phantom scatter factor. The collimator scatter factor was further separated into primary source occlusion effects and ‘traditional’ effects (a combination of flattening filter and jaw scatter, etc.). The phantom scatter was separated into photon scatter and electronic disequilibrium. Each of these factors was plotted as a function of field size in order to quantify its contribution to the change in output factor at small field sizes.

Results: The use of our practical definition resulted in field sizes of 15 mm or less being characterised as ‘small’. The change in field size had a greater effect than detector misalignment. For field sizes of 12 mm or less, electronic disequilibrium was found to cause the largest change in dose on the central axis (d = 5 cm). Source occlusion also caused a large change in output factor for field sizes less than 8 mm.

Discussion and conclusions: The measurement of cross-axis profiles is only required for output factor measurements at field sizes of 15 mm or less (for a 6 MV beam on a Varian iX linear accelerator). This is expected to depend on the linear accelerator spot size and photon energy. While some electronic disequilibrium was shown to occur at field sizes as large as 30 mm (the ‘traditional’ definition of a small field [3]), it does not cause a greater change than photon scatter until a field size of 12 mm, at which point it becomes by far the most dominant effect.
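The practical criterion described above lends itself to a simple check. The sketch below applies it to invented output-factor data: a field size is flagged as ‘small’ when a ±1 mm change in field size alters the output factor by more than 1 % (the detector-misalignment test would follow the same pattern using off-axis data).

```python
# Flag 'small' fields: output factor changes by more than 1 % for a 1 mm
# field-size error (invented output-factor curve, not the paper's data).
import numpy as np

side_mm = np.arange(4, 21)                     # square field side lengths, 1 mm steps
# Hypothetical output factors that fall off steeply at small field sizes.
output_factor = 1.0 - 0.35 * np.exp(-(side_mm - 4.0) / 4.0)

for i in range(1, len(side_mm) - 1):
    # Largest relative change in output factor for a 1 mm error in either direction.
    up = abs(output_factor[i + 1] - output_factor[i]) / output_factor[i]
    down = abs(output_factor[i - 1] - output_factor[i]) / output_factor[i]
    tag = "small" if max(up, down) > 0.01 else "not small"
    print(f"{side_mm[i]:2d} mm: OF = {output_factor[i]:.3f}  ({tag})")
```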

Relevance: 20.00%

Abstract:

Introduction: Due to their high spatial resolution, diodes are often used for small field relative output factor measurements. However, a field-size-specific correction factor [1] is required to correct for diode detector over-response at small field sizes. A recent Monte Carlo based study has shown that it is possible to design a diode detector that produces measured relative output factors equivalent to those in water. This is accomplished by introducing an air gap at the upstream end of the diode [2]. The aim of this study was to physically construct this diode by placing an ‘air cap’ on the end of a commercially available diode (the PTW 60016 electron diode). The output factors subsequently measured with the new diode design were compared to current benchmark small field output factor measurements.

Methods: A water-tight ‘cap’ was constructed so that it could be placed over the upstream end of the diode. The cap could be offset from the end of the diode, thus creating an air gap. The air gap width was the same as the diode width (7 mm) and the thickness of the air gap could be varied. Output factor measurements were made for square field sizes of side length 5 to 50 mm, using a 6 MV photon beam. The set of output factor measurements was repeated with the air gap thickness set to 0, 0.5, 1.0 and 1.5 mm. The optimal air gap thickness was found in a similar manner to that proposed by Charles et al. [2]. An IBA stereotactic field diode, corrected using Monte Carlo calculated kQclin,Qmsr values [3], was used as the gold standard.

Results: The optimal air gap thickness required for the PTW 60016 electron diode was 1.0 mm. This was close to the Monte Carlo predicted value of 1.15 mm [2]. The sensitivity of the new diode design was independent of field size (kQclin,Qmsr = 1.000 at all field sizes) to within 1 %.

Discussion and conclusions: The design of Charles et al. [2] has been proven experimentally. An existing commercial diode has been converted into a correction-less small field diode by the simple addition of an ‘air cap’. The method of applying a cap to create the new diode makes it dual-purpose, since without the cap it remains an unmodified electron diode.
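A small sketch (with invented readings) of the test for a ‘correction-less’ diode: form the ratio of the reference output factors to the test diode's output factors at each field size and check that it stays within 1 % of unity, which is the role played by kQclin,Qmsr above.

```python
# Check field-size independence of a test diode against corrected reference
# output factors (all readings hypothetical).
import numpy as np

side_mm = np.array([5, 10, 20, 30, 50])
of_reference = np.array([0.674, 0.792, 0.861, 0.902, 0.957])   # hypothetical
of_test_diode = np.array([0.671, 0.790, 0.862, 0.904, 0.955])  # hypothetical

correction = of_reference / of_test_diode      # plays the role of kQclin,Qmsr
for s, k in zip(side_mm, correction):
    print(f"{s:2d} mm field: correction = {k:.3f}")
print("correction-less:", bool(np.all(np.abs(correction - 1.0) < 0.01)))
```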