376 results for: replicazione ottimistica, eventual consistency


Relevance:

10.00%

Publisher:

Abstract:

Introduction: The dose to the skin surface is an important factor for many radiotherapy treatment techniques. It is known that TPS-predicted surface doses can differ significantly from actual ICRP skin doses as defined at 70 μm. A number of methods have been implemented for the accurate determination of surface dose, including the use of specific dosimeters such as TLDs and radiochromic film, as well as Monte Carlo calculations. Stereotactic radiosurgery involves delivering very high doses per treatment fraction using small X-ray fields. To date, there have been limited data on surface doses for these very small field sizes. The purpose of this work is to evaluate surface doses by both measurements and Monte Carlo calculations for very small field sizes.

Methods: All measurements were performed on a Novalis Tx linear accelerator, which has a 6 MV SRS X-ray beam mode that uses a special thin flattening filter. Beam collimation was achieved by circular cones with apertures that gave field sizes ranging from 4 to 30 mm at the isocentre. The relative surface doses were measured using Gafchromic EBT3 film, which has its active layer at a depth similar to the ICRP skin dose depth. Monte Carlo calculations were performed using the BEAMnrc/EGSnrc Monte Carlo codes (V4 r225). The specifications of the linear accelerator, including the collimator, were provided by the manufacturer. Optimisation of the incident X-ray beam was achieved by iterative adjustment of the energy, spatial distribution and radial spread of the incident electron beam striking the target. The energy cutoff parameters were PCUT = 0.01 MeV and ECUT = 0.700 MeV. Directional bremsstrahlung splitting was switched on for all BEAMnrc calculations. Relative surface doses were determined in a layer defined in a water phantom with the same thickness and depth as the active layer in the film.

Results: Measured surface doses using the EBT3 film varied between 13 and 16 % for the different cones, with an uncertainty of 3 %. Monte Carlo calculated surface doses agreed with the measured doses to better than 2 % for all the treatment cones.

Discussion and conclusions: This work has shown the consistency of surface dose measurements using EBT3 film with Monte Carlo predicted values, within the uncertainty of the measurements. As such, EBT3 film is recommended for in vivo surface dose measurements.

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The consistency of measuring small field output factors is greatly increased by reporting the measured dosimetric field size of each factor, as opposed to simply stating the nominal field size [1]; this, however, requires the measurement of cross-axis profiles in a water tank, which makes output factor measurements time consuming. This project establishes the field sizes at which the accuracy of output factors is not affected by the use of potentially inaccurate nominal field sizes, which we believe establishes a practical working definition of a ‘small’ field. The physical components of the radiation beam that contribute to the rapid change in output factor at small field sizes are examined in detail. The physical interaction that dominates the rapid dose reduction is quantified, leading to a theoretical definition of a ‘small’ field.

Methods: Current recommendations suggest that radiation collimation systems and isocentre-defining lasers should both be calibrated to permit a maximum positioning uncertainty of 1 mm [2]. The proposed practical definition of a small field is as follows: if the output factor changes by ±1.0 % given a change in either field size or detector position of up to ±1 mm, then the field should be considered small. Monte Carlo modelling was used to simulate output factors of a 6 MV photon beam for square fields with side lengths from 4.0 to 20.0 mm in 1.0 mm increments. The dose was scored in a 0.5 mm wide and 2.0 mm deep cylindrical volume of water within a cubic water phantom, at a depth of 5 cm and an SSD of 95 cm. The maximum difference due to a collimator error of ±1 mm was found by comparing the output factors of adjacent field sizes. The output factor simulations were repeated 1 mm off-axis to quantify the effect of detector misalignment. Further simulations separated the total output factor into the collimator scatter factor and the phantom scatter factor. The collimator scatter factor was further separated into primary source occlusion effects and ‘traditional’ effects (a combination of flattening filter and jaw scatter, etc.). The phantom scatter was separated into photon scatter and electronic disequilibrium. Each of these factors was plotted as a function of field size in order to quantify how each affected the change in output factor at small field sizes.

Results: The use of our practical definition resulted in field sizes of 15 mm or less being characterised as ‘small’. The change in field size had a greater effect than detector misalignment. For field sizes of 12 mm or less, electronic disequilibrium was found to cause the largest change in dose on the central axis (d = 5 cm). Source occlusion also caused a large change in output factor for field sizes less than 8 mm.

Discussion and conclusions: The measurement of cross-axis profiles is only required for output factor measurements at field sizes of 15 mm or less (for a 6 MV beam on a Varian iX linear accelerator). This is expected to depend on the linear accelerator spot size and photon energy. While some electronic disequilibrium was shown to occur at field sizes as large as 30 mm (the ‘traditional’ definition of a small field [3]), it does not cause a greater change than photon scatter until the field size drops to 12 mm, at which point it becomes by far the dominant effect.
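The practical definition above, a field is ‘small’ if a ±1 mm change in field size shifts the output factor by more than ±1.0 %, can be sketched as a simple check over a table of output factors. The field sizes and output factor values below are invented for illustration and are not the paper's data; adjacent entries are assumed to be 1 mm apart.

```python
def small_fields(field_mm, output_factor, tol_pct=1.0):
    """Return the field sizes (mm) classified as 'small' under the
    +/-1 mm, +/-1.0 % rule. Adjacent list entries are assumed to be
    1 mm apart in field size."""
    small = []
    for i, f in enumerate(field_mm):
        # Compare against the neighbouring field sizes (+/-1 mm).
        neighbours = [output_factor[j] for j in (i - 1, i + 1)
                      if 0 <= j < len(field_mm)]
        change = max(abs(output_factor[i] - n) / output_factor[i] * 100
                     for n in neighbours)
        if change > tol_pct:
            small.append(f)
    return small

# Illustrative (made-up) output factors for a 6 MV beam.
fields = [12, 13, 14, 15, 16, 17, 18]
ofs = [0.880, 0.896, 0.910, 0.922, 0.931, 0.938, 0.944]
print(small_fields(fields, ofs))  # -> [12, 13, 14, 15]
```

With these invented numbers the rule classifies fields of 15 mm and below as small, mirroring the threshold reported in the abstract.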

Relevance:

10.00%

Publisher:

Abstract:

Aim. This paper is a report of the development and validation of a new job performance scale based on an established job performance model. Background. Previous measures of nursing quality are atheoretical and fail to incorporate the complete range of behaviours performed. Thus, an up-to-date measure of job performance is required for assessing nursing quality. Methods. Test construction involved systematic generation of test items using focus groups, a literature review, and an expert review of test items. A pilot study was conducted to determine the multidimensional nature of the taxonomy and its psychometric properties. All data were collected in 2005. Findings. The final version of the nursing performance taxonomy included 41 behaviours across eight dimensions of job performance. Results from preliminary psychometric investigations suggest that the nursing performance scale has good internal consistency, good convergent validity and good criterion validity. Conclusion. The findings give preliminary support for the new job performance scale as a reliable and valid tool for assessing nursing quality. However, further research using a larger sample and nurses from a broader geographical region is required to cross-validate the measure. The scale may be used to guide hospital managers regarding the quality of nursing care within units and to guide future research in the area.

Relevance:

10.00%

Publisher:

Abstract:

Smartphone technology provides free or inexpensive access to mental health and wellbeing resources. As a result, the use of mobile applications for these purposes has increased significantly in recent years. Yet there is currently no app quality assessment alternative to the popular ‘star’ ratings, which are often unreliable. This presentation describes the development of the Mobile Application Rating Scale (MARS), a new measure for classifying and rating the quality of mobile applications. A review of existing literature on app and web quality identified 25 published papers, conference proceedings, and online resources (published since 1999), which yielded 372 explicit quality criteria. Qualitative analysis identified five broad categories of app quality rating criteria: engagement, functionality, aesthetics, information quality, and overall satisfaction, which were refined into the 23-item MARS. Independent ratings of 50 randomly selected mental health and wellbeing mobile apps indicated that the MARS had excellent levels of internal consistency (α = 0.92) and inter-rater reliability (ICC = 0.85). The MARS provides practitioners and researchers with an easy-to-use, simple, objective and reliable tool for assessing mobile app quality. It also provides mHealth professionals with a checklist for the design and development of high-quality apps.
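The internal-consistency figure quoted above (α = 0.92) is a Cronbach's alpha. A minimal sketch of how such a coefficient is computed from an items-by-ratings matrix follows; the four-item ratings below are fabricated examples, not MARS data.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a 2-D array: rows = rated objects
    (e.g. apps), columns = scale items."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                      # number of items
    item_vars = ratings.var(axis=0, ddof=1)   # per-item variance
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Fabricated 1-5 ratings of four apps on a four-item scale.
example = [[4, 5, 4, 4],
           [3, 3, 4, 3],
           [5, 5, 5, 4],
           [2, 3, 2, 2]]
print(round(cronbach_alpha(example), 2))  # -> 0.96
```

Higher alpha indicates that the items vary together, i.e. they appear to measure the same underlying construct.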

Relevance:

10.00%

Publisher:

Abstract:

There is considerable interest internationally in developing product libraries to support the use of BIM. Product library initiatives are driven by national bodies, manufacturers and private companies who see their potential. A major issue with the production and distribution of product information for BIM is that separate library objects need to be produced for all of the different software systems that are going to use the library. This increases the cost of populating product libraries and also increases the difficulty in maintaining consistency between the representations for the different software over time. This paper describes a project which uses “software transformation” technology from the field of software engineering to support the definition of a single generic representation of a product which can then be automatically converted to the format required by receiving software. The paper covers the current state of implementation of the product library, the technology underlying the transformations for the currently supported software and the business model for creating a national library in Australia. This is placed within the context of other current product library systems to highlight the differences. The responsibilities of the various actors involved in supporting the product library are also discussed.

Relevance:

10.00%

Publisher:

Abstract:

There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation, through compliance checking and process certification, to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and from different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. On this basis, we identify specific research questions that guide the design of a framework for model alignment.

Relevance:

10.00%

Publisher:

Abstract:

Hospital disaster resilience can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one.” This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety, command, communication and cooperation system, disaster plan, resource stockpile, staff capability, disaster training and drills, emergency services and surge capability, and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely, emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
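The article's scoring model, F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4, is a straightforward weighted sum and can be expressed directly in code. The weights are taken from the abstract; the factor scores passed in below are illustrative values only.

```python
# Factor weights from the validated framework (F1..F4 as labelled
# in the abstract).
WEIGHTS = {"F1": 0.615,   # emergency medical response capability
           "F2": 0.202,   # disaster management mechanisms
           "F3": 0.103,   # hospital infrastructural safety
           "F4": 0.080}   # disaster resources

def resilience_score(factor_scores):
    """Overall hospital disaster resilience F as the weighted sum
    of the four factor scores."""
    return sum(WEIGHTS[name] * score
               for name, score in factor_scores.items())

# Hypothetical normalised factor scores for one hospital.
example = {"F1": 0.8, "F2": 0.7, "F3": 0.9, "F4": 0.6}
print(round(resilience_score(example), 3))  # -> 0.774
```

Note how heavily the model weights emergency medical response capability (F1): it contributes over 60 % of the overall score.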

Relevance:

10.00%

Publisher:

Abstract:

This paper develops a semiparametric estimation approach for mixed count regression models based on series expansion for the unknown density of the unobserved heterogeneity. We use the generalized Laguerre series expansion around a gamma baseline density to model unobserved heterogeneity in a Poisson mixture model. We establish the consistency of the estimator and present a computational strategy to implement the proposed estimation techniques in the standard count model as well as in truncated, censored, and zero-inflated count regression models. Monte Carlo evidence shows that the finite sample behavior of the estimator is quite good. The paper applies the method to a model of individual shopping behavior. © 1999 Elsevier Science S.A. All rights reserved.
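The gamma baseline density around which the paper's Laguerre expansion is built corresponds to the familiar Poisson-gamma mixture, whose marginal is negative binomial. The Monte Carlo sketch below, with arbitrary illustrative parameters, checks the closed-form zero probability of that baseline case; it is not the paper's semiparametric estimator.

```python
import numpy as np

# Baseline case: heterogeneity v ~ Gamma(shape, 1/shape) with
# E[v] = 1, and y | v ~ Poisson(mu * v). The marginal of y is then
# negative binomial with P(y = 0) = (shape / (shape + mu)) ** shape.
rng = np.random.default_rng(0)
mu, shape = 2.0, 1.5                        # illustrative values
v = rng.gamma(shape, 1.0 / shape, 200_000)  # unobserved heterogeneity
y = rng.poisson(mu * v)                     # mixed Poisson draws

p0_theory = (shape / (shape + mu)) ** shape
p0_mc = (y == 0).mean()
print(abs(p0_mc - p0_theory) < 0.01)  # -> True
```

The paper's estimator generalises this baseline by multiplying the gamma density with a generalized Laguerre series, so that the heterogeneity density need not be exactly gamma.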

Relevance:

10.00%

Publisher:

Abstract:

Charities' fundraising financial transactions should be reported in the interests of accountability, and the report should be publicly available. However, research shows that at present there is little consistency in how fundraising is defined or in how such transactions are reported, and little guidance from accounting standards. This report examines whether the current reporting of fundraising in annual financial statements by Australian charities is fit for the purpose of informing the donating public and other stakeholders, whether through the Australian Charities and Not-for-profits Commission's registry strategy or through other means such as private ratings agencies. Where it is not, the authors suggest a way forward.

Relevance:

10.00%

Publisher:

Abstract:

Carbon nanorods and graphene-like nanosheets are catalytically synthesized in a hot filament chemical vapor deposition system with and without plasma enhancement, with gold used as a catalyst. The morphological and structural properties of the carbon nanorods and nanosheets are investigated by field-emission scanning electron microscopy, transmission electron microscopy and micro-Raman spectroscopy. It is found that carbon nanorods are formed when a CH4 + H2 + N2 plasma is present, while carbon nanosheets are formed in a methane environment without a plasma. The formation of the carbon nanorods and carbon nanosheets is analyzed. The results suggest that the formation of carbon nanorods is primarily a precipitation process, while the formation of carbon nanosheets is a complex process involving surface catalysis, surface diffusion and precipitation influenced by the Gibbs–Thomson effect. The electron field emission properties of the carbon nanorods and graphene-like nanosheets are measured under high vacuum; it is found that the carbon nanosheets have a lower field emission turn-on field than the carbon nanorods. These results are important for improving the understanding of the formation mechanisms of carbon nanomaterials and contribute to eventual applications of these structures in nanodevices.

Relevance:

10.00%

Publisher:

Abstract:

A high level of control over quantum dot (QD) properties such as size and composition during fabrication is required to precisely tune the eventual electronic properties of the QD. Nanoscale synthesis efforts and theoretical studies of electronic properties are traditionally treated quite separately. In this paper, a combinatorial approach has been taken to relate the process synthesis parameters to the electron confinement properties of the QDs. First, hybrid numerical calculations with different influx parameters for Si1−xCx QDs were carried out to simulate the changes in carbon content x and size. Second, the ionization energy theory was applied to understand the electronic properties of Si1−xCx QDs. Third, stoichiometric (x = 0.5) silicon carbide QDs were grown by means of inductively coupled plasma-assisted rf magnetron sputtering. Finally, the effects of QD size and elemental composition were incorporated into the ionization energy theory to explain the evolution of the Si1−xCx photoluminescence spectra. These results are important for the development of deterministic synthesis approaches for self-assembled nanoscale quantum confinement structures.

Relevance:

10.00%

Publisher:

Abstract:

Recently, a variety of high-aspect-ratio nanostructures have been grown and profiled for various applications ranging from field emission transistors to gene/drug delivery devices. However, fabricating and processing arrays of these structures, and determining how changing certain physical parameters affects the final outcome, is quite challenging. We have developed several modules that can be used to simulate the processes of various physical vapour deposition systems, from precursor interaction in the gas phase to gas-surface interactions and surface processes. In this paper, multi-scale hybrid numerical simulations are used to study how low-temperature non-equilibrium plasmas can be employed in the processing of high-aspect-ratio structures such that the resulting nanostructures have properties suitable for their eventual device application. We show that whilst using plasma techniques is beneficial in many nanofabrication processes, it is especially useful in making dense arrays of high-aspect-ratio nanostructures.

Relevance:

10.00%

Publisher:

Abstract:

Texture information in the iris image is not uniform in discriminatory information content for biometric identity verification. The bits in an iris code obtained from the image differ in their consistency from one sample to another for the same identity. In this work, errors in bit strings are systematically analysed in order to investigate the effect of light-induced and drug-induced pupil dilation and constriction on the consistency of iris texture information. The statistics of bit errors are computed for client and impostor distributions as functions of radius and angle. Under normal conditions, a V-shaped radial trend of decreasing bit errors towards the central region of the iris is obtained for client matching, and it is observed that the distribution of errors as a function of angle is uniform. When iris images are affected by pupil dilation or constriction the radial distribution of bit errors is altered. A decreasing trend from the pupil outwards is observed for constriction, whereas a more uniform trend is observed for dilation. The main increase in bit errors occurs closer to the pupil in both cases.
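The radial bit-error analysis described above can be sketched as follows: given two iris-code bit arrays laid out over (radius, angle), count the fraction of disagreeing bits in each radial band. The codes below are random stand-ins with a fixed flip rate, not real iris data or the study's method of computing client/impostor statistics.

```python
import numpy as np

rng = np.random.default_rng(1)
n_radii, n_angles = 8, 256

# Enrolment code: random bits arranged as (radius, angle).
enrol = rng.integers(0, 2, (n_radii, n_angles))

# Probe code: copy of the enrolment with 10 % of bits flipped at
# random, standing in for a genuine (client) comparison.
flips = rng.random((n_radii, n_angles)) < 0.10
probe = np.where(flips, 1 - enrol, enrol)

# Fraction of disagreeing bits per radial band (pupil -> sclera).
radial_error = (enrol != probe).mean(axis=1)
print(radial_error.shape)  # -> (8,)
```

With real iris codes, plotting `radial_error` against radius would reveal the V-shaped trend (or its alteration under pupil dilation/constriction) that the study reports; the same counting applied along `axis=0` gives the angular distribution.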

Relevance:

10.00%

Publisher:

Abstract:

The proliferation of news reports published on online websites and news information sharing among social media users necessitates effective techniques for analysing the image, text and video data related to news topics. This paper presents the first study to classify affective facial images on emerging news topics. The proposed system dynamically monitors and selects the current hot (of great interest) news topics with strong affective interestingness, using textual keywords in news articles and social media discussions. Images from the selected hot topics are extracted and classified into three emotion categories, positive, neutral and negative, based on the facial expressions of subjects in the images. Performance evaluations on two facial image datasets collected from real-world resources demonstrate the applicability and effectiveness of the proposed system for affective classification of facial images in news reports. Facial expression shows high consistency with the affective textual content in news reports for positive emotion, while only low correlation has been observed for the neutral and negative categories. The system can be used directly in applications such as assisting editors in choosing photos with the proper affective semantics for a certain topic during news report preparation.

Relevance:

10.00%

Publisher:

Abstract:

Wireless sensor networks generally rely on a many-to-one communication approach for data gathering. This approach is extremely susceptible to sinkhole attacks, in which an intruder attracts surrounding nodes with false routing information and subsequently performs selective forwarding or alters the data passing through it. A sinkhole attack poses a serious threat to sensor networks, particularly since sensor nodes are mostly deployed in open areas and have limited computational and battery power. In order to detect the intruder in a sinkhole attack, this paper proposes an algorithm that first finds a group of suspected nodes by analysing the consistency of their data, and then efficiently identifies the intruder within the group by checking the network flow information. The proposed algorithm's performance has been evaluated using numerical analysis and simulations, verifying its accuracy and efficiency.
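The first stage of the algorithm, selecting a group of suspected nodes by analysing the consistency of their data, might be sketched as a simple outlier test over the readings reported to the base station. The node ids, readings and threshold below are hypothetical, and the abstract does not specify the exact consistency test used.

```python
def suspected_nodes(readings, threshold=1.8):
    """readings: dict of node_id -> reported sensor value.
    Returns the ids whose value lies more than `threshold` standard
    deviations from the mean of all reports, as sinkhole suspects."""
    values = list(readings.values())
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5
    return sorted(n for n, v in readings.items()
                  if std > 0 and abs(v - mean) / std > threshold)

# Hypothetical readings: node n4 reports data inconsistent with its
# neighbourhood, as tampering by a sinkhole node might produce.
reports = {"n1": 20.1, "n2": 19.8, "n3": 20.3, "n4": 35.0, "n5": 20.0}
print(suspected_nodes(reports))  # -> ['n4']
```

The paper's second stage would then examine the network flow information around the suspect group to pinpoint the actual intruder, which this data-only sketch does not attempt.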