883 results for WARFARIN DOSE REQUIREMENTS
Abstract:
The aim of this paper is to report the sensitization of the TL peak appearing at 270 degrees C in the glow curve of natural quartz by the combined effect of heat treatments and irradiation with high gamma doses. For this, thirty discs of 6 x 1 mm were prepared from plates cut parallel to a rhombohedral crystal face. The specimens were separated into four lots according to their TL readout between 160 and 320 degrees C. One lot was submitted to Co-60 gamma doses starting at 2 kGy and increasing to a cumulative dose of 25 kGy. The other three lots were initially heat-treated at 500, 800 and 1000 degrees C and then irradiated with a single dose of 25 kGy. The TL response of each lot was determined as a function of test doses ranging from 0.1 to 30 mGy. As a result, it was observed that heat treatments by themselves did not produce the strong peak at 270 degrees C that was observed after the administration of high gamma doses. This peak is associated with the optical absorption band appearing at 470 nm, which is due to the formation of [AlO4](0) centers acting as electron-hole recombination centers. The formation of the 270 degrees C peak was preliminarily analyzed in relation to aluminum- and oxygen-vacancy-related centers found in crystalline quartz. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
A survey of pediatric radiological examinations was carried out in a reference pediatric hospital in the city of Sao Paulo, in order to investigate the doses to children undergoing conventional X-ray examinations. The results showed that the majority of pediatric patients are below 4 years of age, and that about 80% of the examinations correspond to chest projections. Doses in typical radiological examinations were measured in vivo with thermoluminescent dosimeters (LiF:Mg,Ti and LiF:Mg,Cu,P) attached to the skin of the children to determine the entrance surface dose (ESD). Homogeneous phantoms were also used to obtain the ESD for younger children, because the technique uses such a low kVp that the dosimeters would produce an artifact image in the patient radiograph. Four kinds of pediatric examinations were investigated: three conventional examinations (chest, skull and abdomen) and a fluoroscopic procedure (barium swallow). Relevant information about the kVp and mAs values used in the examinations was collected, and we discuss how these parameters can affect the ESD. The ESD values measured in this work are compared to reference levels published by the European Commission for pediatric patients. The results obtained (third quartile of the ESD distribution) for chest AP examinations in three age groups were: 0.056 mGy (2-4 years old); 0.068 mGy (5-9 years old); 0.069 mGy (10-15 years old). All of them are below the European reference level (0.100 mGy). The ESD values measured for the oldest age group in skull and abdomen AP radiographs (mean values 3.44 and 1.20 mGy, respectively) are above the European reference levels (1.5 mGy for skull and 1.0 mGy for abdomen). ESD values measured in the barium swallow examination reached 10 mGy in skin regions corresponding to the thyroid and esophagus. It was noticed during this survey that some technicians improperly use X-ray fluoroscopy in conventional examinations to help them position the patient.
The results presented here are a preliminary survey of doses in pediatric radiological examinations, and they show that it is necessary to investigate the technical parameters used to perform the radiographs, to introduce practices to control pediatric patients' doses, and to improve personnel training for pediatric examinations. (c) 2007 Elsevier Ltd. All rights reserved.
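The third-quartile comparison reported in the abstract can be sketched as follows; the ESD sample values below are hypothetical, and only the 0.100 mGy chest AP reference level comes from the text:

```python
# Illustrative sketch (hypothetical ESD values): compute the third quartile
# of an ESD distribution and compare it with a diagnostic reference level,
# as done in the survey for chest AP examinations.
def third_quartile(values):
    """Third quartile (75th percentile) by linear interpolation."""
    s = sorted(values)
    idx = 0.75 * (len(s) - 1)
    lo = int(idx)
    frac = idx - lo
    if lo + 1 < len(s):
        return s[lo] + frac * (s[lo + 1] - s[lo])
    return s[lo]

# Hypothetical ESD measurements (mGy) for one age group
esd_chest_ap = [0.031, 0.044, 0.052, 0.048, 0.060, 0.055, 0.039, 0.057]
q3 = third_quartile(esd_chest_ap)
european_reference_level = 0.100  # mGy, chest AP (from the abstract)
print(f"Q3 = {q3:.3f} mGy, below reference: {q3 < european_reference_level}")
```

Using the third quartile rather than the mean keeps one unusually high exposure from dominating the comparison, which is why reference levels are stated that way.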
Abstract:
In order to validate the Geant4 toolkit for dosimetry applications, simulations were performed to calculate conversion coefficients h(10, alpha) from air kerma free-in-air to personal dose equivalent Hp(10, alpha). The simulations consisted of two parts: the production of X-rays with radiation qualities of narrow and wide spectra, and the interaction of radiation with ICRU tissue-equivalent and ISO water slab phantoms. The half-value layers of the X-ray spectra obtained by simulation were compared with experimental results. Mean energy, spectral resolution, half-value layers and conversion coefficients were compared with ISO reference values. The good agreement between simulation results and reference data shows that Geant4 is suitable for dosimetry applications involving photons with energies in the range from tens to a few hundred keV. (C) 2008 Elsevier Ltd. All rights reserved.
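A conversion coefficient of this kind is just the ratio Hp(10)/Ka for a given beam quality, so applying one is a multiplication. A minimal sketch follows; the beam-quality labels use the ISO narrow-spectrum naming, but the coefficient values are hypothetical, not ISO reference data:

```python
# Minimal sketch (hypothetical coefficients): convert air kerma free-in-air
# to personal dose equivalent Hp(10) with a beam-quality-specific coefficient.
h10 = {  # Sv/Gy, illustrative values only
    "N-40": 1.18,
    "N-60": 1.65,
    "N-80": 1.88,
}

def personal_dose_equivalent(air_kerma_gy, quality):
    """Hp(10) in Sv from air kerma free-in-air in Gy for one beam quality."""
    return h10[quality] * air_kerma_gy

print(personal_dose_equivalent(2.0e-3, "N-60"))  # Hp(10) in Sv
```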
Abstract:
The protective shielding design of a mammography facility requires knowledge of the radiation scattered by the patient and image receptor components. The shape and intensity of secondary x-ray beams depend on the kVp applied to the x-ray tube, the target/filter combination, the primary x-ray field size, and the scattering angle. Currently, shielding calculations for mammography facilities are performed based on scatter fraction data for a Mo/Mo target/filter, even though modern mammography equipment is designed with different anode/filter combinations. In this work we present scatter fraction data evaluated from the x-ray spectra produced by Mo/Mo, Mo/Rh and W/Rh target/filter combinations, for 25, 30 and 35 kV tube voltages and scattering angles between 30 and 165 degrees. Three mammography phantoms were irradiated and the scattered radiation was measured with a CdZnTe detector. The primary x-ray spectra were computed with a semiempirical model based on the air kerma and HVL measured with an ionization chamber. The results point out that the scatter fraction values are higher for W/Rh than for Mo/Mo and Mo/Rh, although the primary and scattered air kerma are lower for W/Rh than for the Mo/Mo and Mo/Rh target/filter combinations. The scatter fractions computed in this work were applied in a shielding design calculation in order to evaluate the shielding requirements for each of these target/filter combinations. In addition, shielding requirements were evaluated by converting the scattered air kerma from mGy/week to mSv/week, adopting first a conversion coefficient from air kerma to effective dose of 1 Sv/Gy and then a mean conversion coefficient specific to the x-ray beam considered. Results show that the thickest barrier should be provided for the Mo/Mo target/filter combination. They also point out that the 1 Sv/Gy conversion coefficient from air kerma to effective dose is conservatively high in the mammography energy range and overestimates the barrier thickness.
(c) 2008 American Association of Physicists in Medicine.
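The unit conversion the abstract describes is a single multiplication per beam; the sketch below contrasts the conservative 1 Sv/Gy assumption with a beam-specific mean coefficient. All numeric inputs here are hypothetical, not values from the study:

```python
# Hedged sketch of the conversion step: weekly scattered air kerma (mGy/week)
# is turned into effective dose (mSv/week) with a chosen Sv/Gy coefficient.
def weekly_effective_dose(scattered_kerma_mgy_per_week, coeff_sv_per_gy):
    """Effective dose in mSv/week from scattered air kerma in mGy/week."""
    return scattered_kerma_mgy_per_week * coeff_sv_per_gy

scatter = 4.0                                         # mGy/week, hypothetical
conservative = weekly_effective_dose(scatter, 1.0)    # 1 Sv/Gy assumption
beam_specific = weekly_effective_dose(scatter, 0.25)  # hypothetical mean coeff.
print(conservative, beam_specific)
```

Because the barrier must attenuate the weekly dose below a design goal, a conversion coefficient several times too large translates directly into extra, unneeded barrier thickness, which is the overestimation the abstract notes.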
Abstract:
The aim of the present study was to evaluate the effects of low-dose therapeutic ionizing radiation on different aesthetic dental materials. Forty-five specimens (n = 45) of three different aesthetic restorative materials were prepared and randomly divided into five groups: G1 (control group); G2, G3, G4 and G5 (experimental groups) irradiated respectively with 0.25, 0.50, 0.75, and 1.00 Gy of gamma radiation from a (60)Co teletherapy machine. Chemical analyses were performed using a FT-IR Nicolet 520 spectrophotometer with the diffuse reflectance technique. Even a minimal exposure to ionizing radiation at therapeutic doses can produce chemical changes in light-cured composite resins. The three restorative materials studied showed changes after exposure to gamma radiation; however, increasing the radiation dose did not increase this effect.
Abstract:
The aim of task scheduling is to minimize the makespan of applications, exploiting shared resources in the best possible way. Applications have requirements that call for customized environments for their execution. One way to provide such environments is to use virtualization on demand. This paper presents two schedulers based on integer linear programming which schedule virtual machines (VMs) on grid resources and tasks on these VMs. The schedulers differ from previous work by jointly scheduling tasks and VMs and by considering the impact of the available bandwidth on the quality of the schedule. Experiments show the efficacy of the schedulers in scenarios with different network configurations.
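The objective being minimized can be illustrated at toy scale. The sketch below is not the paper's integer linear program: it swaps in a brute-force search over task-to-VM assignments and ignores bandwidth, and the task lengths and VM speeds are hypothetical. It only shows what "minimize the makespan" means for a joint assignment:

```python
from itertools import product

# Toy stand-in for joint task/VM scheduling: enumerate every assignment of
# tasks to VMs and keep the one whose most-loaded VM finishes earliest.
tasks = [4.0, 3.0, 2.0, 2.0]   # task lengths (work units), hypothetical
vm_speeds = [2.0, 1.0]         # VM processing speeds (units/time), hypothetical

def makespan(assignment):
    """Finish time of the most loaded VM under a given assignment."""
    load = [0.0] * len(vm_speeds)
    for task_len, vm in zip(tasks, assignment):
        load[vm] += task_len / vm_speeds[vm]
    return max(load)

best = min(product(range(len(vm_speeds)), repeat=len(tasks)), key=makespan)
print(best, makespan(best))
```

An ILP formulation replaces this exponential enumeration with binary assignment variables and linear constraints, which is what makes the approach scale beyond toy instances.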
Abstract:
Two-dimensional and 3D quantitative structure-activity relationship studies were performed on a series of diarylpyridines that act as cannabinoid receptor ligands, by means of hologram quantitative structure-activity relationship and comparative molecular field analysis methods. The quantitative structure-activity relationship models were built using a data set of 52 CB1 ligands that can be used as anti-obesity agents. Significant correlation coefficients (hologram quantitative structure-activity relationship: r² = 0.91, q² = 0.78; comparative molecular field analysis: r² = 0.98, q² = 0.77) were obtained, indicating the predictive potential of these 2D and 3D models for untested compounds. The models were then used to predict the potency of an external test set, and the predicted (calculated) values are in good agreement with the experimental results. The final quantitative structure-activity relationship models, along with the information obtained from the 2D contribution maps and 3D contour maps in this study, are useful tools for the design of novel CB1 ligands with improved anti-obesity potency.
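The two statistics quoted above can be sketched on hypothetical data. This is not HQSAR or CoMFA: a one-descriptor ordinary least-squares fit stands in for the QSAR model, and the descriptor/activity values are invented. It shows only how r² (fit) and leave-one-out q² (cross-validated predictivity, q² = 1 - PRESS/SS) relate:

```python
# Minimal sketch: r2 of a fitted model vs. leave-one-out cross-validated q2.
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def r2(xs, ys):
    a, b = fit_line(xs, ys)
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - sum(ys) / len(ys)) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

def q2_loo(xs, ys):
    """Leave-one-out q2: refit without each point, predict it, pool errors."""
    press = 0.0
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a * xs[i] + b)) ** 2
    ss_tot = sum((y - sum(ys) / len(ys)) ** 2 for y in ys)
    return 1 - press / ss_tot

# Hypothetical descriptor values and measured activities
x = [1.0, 2.1, 2.9, 4.2, 5.1, 6.0]
y = [5.2, 5.9, 6.4, 7.3, 7.8, 8.5]
print(r2(x, y), q2_loo(x, y))  # q2 is at most r2, as in the abstract's figures
```

The gap between r² and q² (0.91 vs 0.78 for the HQSAR model above) is the usual sign of how much of the fit survives cross-validation.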
Abstract:
The aim of this work was to design a set of rules for levodopa infusion dose adjustment in Parkinson's disease based on simulation experiments. Using this simulator, optimal infusion doses under different conditions were calculated. There are seven conditions (-3 to +3) in a rating scale for Parkinson's disease patients. By finding the mean of the differences between conditions and optimal doses, two sets of rules were designed. The sets of rules were optimized through repeated testing, and their usefulness for optimizing the titration procedure for new infusion patients by rule-based reasoning was investigated. Results show that the new rules reduced both the number of steps and the errors in finding the optimal dose. Finally, the new rules predicted the dose well on each single occasion for the majority of patients in the simulation experiments.
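The rule-based idea can be sketched as a lookup from rating-scale condition to a dose adjustment. The adjustment fractions below are hypothetical placeholders, not the rules derived in the study; the sketch only shows the shape of such a titration step:

```python
# Hedged sketch: map each rating-scale condition (-3 undermedicated ... +3
# overmedicated) to a fractional change of the current infusion dose.
RULES = {  # hypothetical adjustment fractions
    -3: +0.30, -2: +0.20, -1: +0.10,   # raise dose when undermedicated
     0:  0.00,                          # on target: keep dose
     1: -0.10,  2: -0.20,  3: -0.30,   # lower dose when overmedicated
}

def adjust_dose(current_dose, condition):
    """Next infusion dose after one titration step."""
    return current_dose * (1.0 + RULES[condition])

dose = 100.0  # hypothetical starting infusion dose
for observed in [-2, -1, 0]:  # patient moves toward the target over steps
    dose = adjust_dose(dose, observed)
print(round(dose, 2))
```

Fewer titration steps to convergence is exactly the metric the abstract reports the new rules improving.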
Abstract:
During the development of system requirements, software system specifications are often inconsistent. Inconsistencies may arise for different reasons, for example, when multiple conflicting viewpoints are embodied in the specification, or when the specification itself is at a transient stage of evolution. These inconsistencies cannot always be resolved immediately. As a result, we argue that a formal framework for the analysis of evolving specifications should be able to tolerate inconsistency by allowing reasoning in the presence of inconsistency without trivialisation, and circumvent inconsistency by enabling impact analyses of potential changes to be carried out. This paper shows how clustered belief revision can help in this process. Clustered belief revision allows for the grouping of requirements with similar functionality into clusters and the assignment of priorities between them. By analysing the result of a cluster, an engineer can either choose to rectify problems in the specification or to postpone the changes until more information becomes available.
Abstract:
In e-Science experiments, it is vital to record the experimental process for later use, such as interpreting results, verifying that the correct process took place, or tracing where data came from. The process that led to some data is called the provenance of that data, and a provenance architecture is the software architecture for a system that provides the functionality needed to record, store and use process documentation. However, there has been little principled analysis of what is actually required of a provenance architecture, so it is impossible to determine the functionality such architectures should ideally support. In this paper, we present use cases for a provenance architecture from current experiments in biology, chemistry, physics and computer science, and analyse the use cases to determine the technical requirements of a generic, technology- and application-independent architecture. We propose an architecture that meets these requirements and evaluate a preliminary implementation by attempting to realise two of the use cases.
Abstract:
From where did this tweet originate? Was this quote from the New York Times modified? Daily, we rely on data from the Web but often it is difficult or impossible to determine where it came from or how it was produced. This lack of provenance is particularly evident when people and systems deal with Web information or with any environment where information comes from sources of varying quality. Provenance is not captured pervasively in information systems. There are major technical, social, and economic impediments that stand in the way of using provenance effectively. This paper synthesizes requirements for provenance on the Web for a number of dimensions focusing on three key aspects of provenance: the content of provenance, the management of provenance records, and the uses of provenance information. To illustrate these requirements, we use three synthesized scenarios that encompass provenance problems faced by Web users today.