826 results for "quality assurance"
Abstract:
Quality management provides companies with a framework to improve quality across their systems, reduce costs, reallocate resources efficiently, plan strategies correctly, prevent or correct errors at the right time, and increase performance. In this text, we discuss the different theories in this field, their mandatory or voluntary compliance, the importance of quality management for exporting companies, and a case study of a Colombian firm whose main objective is to manage quality. In conclusion, we find that there are different types of quality management systems, such as Juran's trilogy, Deming's 14 points, Six Sigma, HACCP, and so on; that companies have to manage suppliers; and that quality has a positive influence on export volumes. In the case of Colombian small and medium enterprises, the majority have implemented quality management tools, but this is not enough.
Abstract:
The occurrence and levels of airborne polycyclic aromatic hydrocarbons and volatile organic compounds in selected non-industrial environments in Brisbane have been investigated as part of an integrated indoor air quality assessment program. The most abundant and most frequently encountered compounds include nonanal, decanal, texanol, phenol, 2-ethyl-1-hexanol, ethanal, naphthalene, 2,6-di-tert-butyl-4-methylphenol (BHT), salicylaldehyde, toluene, hexanal, benzaldehyde, styrene, ethyl benzene, o-, m- and p-xylenes, benzene, n-butanol, 1,2-propanediol, and n-butyl acetate. Many of the 64 compounds usually included in the European Collaborative Action method of TVOC analysis were below detection limits in the samples analysed. In order to extract the maximum amount of information from the data collected, multivariate data projection methods have been employed. The implications of the extracted information for source identification and exposure control are discussed.
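As an illustration of the multivariate projection methods mentioned above, the sketch below applies a basic SVD-based principal component analysis to a hypothetical sample-by-compound concentration matrix; the compound assignments and values are invented for illustration and do not come from the study.

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project samples onto their principal components (SVD-based PCA)."""
    Xc = X - X.mean(axis=0)          # centre each compound's concentrations
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # score of each sample on each component

# Illustrative matrix: rows = air samples, columns = compound concentrations
# (e.g. nonanal, decanal, toluene -- hypothetical values only)
X = np.array([
    [12.0, 3.1, 0.8],
    [11.5, 2.9, 0.9],
    [ 2.0, 0.4, 5.6],
    [ 2.3, 0.5, 5.2],
])
scores = pca_project(X)
print(scores.shape)  # (4, 2): samples projected onto two components
```

Samples with similar compound profiles fall close together in the projected space, which is what makes such projections useful for grouping samples and hinting at common sources.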
Abstract:
Some polycyclic aromatic hydrocarbons (PAHs) are ubiquitous in air and have been implicated as carcinogenic materials. The literature is therefore replete with studies focused on their occurrence and profiles in indoor and outdoor air samples. However, because the relative potency of individual PAHs varies widely, the health risks associated with the presence of PAHs in a particular environment cannot be extrapolated directly from the concentrations of individual PAHs in that environment. In addition, the database on the potency of PAH mixtures is currently limited. In this paper, we have utilised multi-criteria decision making methods (MCDMs) to simultaneously correlate PAH-related health risk in some microenvironments to the concentration levels, ethoxyresorufin-O-deethylase (EROD) activity induction equivalency factors and toxic equivalency factors (TEFs) of the PAHs found in those microenvironments. The results showed that the relative risk associated with PAHs in different air samples depends on the index used. Nevertheless, this approach offers a promising tool that could help identify microenvironments of concern and assist in the prioritisation of control strategies.
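To illustrate the kind of index-based weighting the paper combines, the sketch below computes a toxic-equivalent concentration (TEQ) as the TEF-weighted sum of individual PAH concentrations; the TEF values and concentrations shown are illustrative placeholders, not data from the study.

```python
# Toxic-equivalency weighting: each PAH concentration is scaled by its TEF
# so mixtures can be compared on a benzo[a]pyrene-equivalent basis.
# TEF values and concentrations below are illustrative only.
tef = {
    "benzo[a]pyrene": 1.0,
    "benz[a]anthracene": 0.1,
    "naphthalene": 0.001,
}

def teq(concentrations_ng_m3, tefs):
    """Sum of concentration x TEF over the PAHs measured in a sample."""
    return sum(c * tefs[name] for name, c in concentrations_ng_m3.items())

sample = {"benzo[a]pyrene": 0.5, "benz[a]anthracene": 2.0, "naphthalene": 300.0}
print(teq(sample, tef))  # 0.5*1.0 + 2.0*0.1 + 300*0.001 = 1.0
```

Because a different equivalency scheme (e.g. EROD-based factors) supplies different weights, the same samples can rank differently depending on the index used, which is the point the abstract makes.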
Abstract:
This article was written in 1997. After a 2009 review the content was left mostly unchanged, apart from this re-written abstract, restructured headings and a table of contents. The article deals directly with professional registration of surveyors, but it also relates to government procurement of professional services. The issues include public service and professional ethics; setting of professional fees; quality assurance; official corruption; and professional recruitment, education and training. Debate on the Land Surveyors Act 1908 (Qld) and its amendments to 1916 occurred at a time when the industrial unrest of the 1890s and the common market principles of the new Commonwealth were fresh in people's minds. Industrial issues led to a constitutional crisis in Queensland's then bicameral legislature and frustrated a first attempt to pass a Surveyors Bill in 1907. The Bill was re-introduced in 1908 after fresh elections and Kidston's return as state premier. Co-ordinated immigration and land settlement policies of the colonies were discontinued when the Commonwealth gained power over immigration in 1901. Concerns shifted to protecting jobs from foreign competition. Debate on the 1974 amendments to the Act reflected concerns about skill shortages and professional accreditation. However, in times of economic downturn, a so-called ‘chronic shortage of surveyors’ could rapidly degenerate into oversupply and unemployment. Theorists championed a naïve ‘capture theory’ in which the professions captured governments to create legislative barriers to entry. Supposedly, this allowed rent-seeking and monopoly profits through lack of competition. However, historical evidence suggests that governments have been capable of capturing and exploiting surveyors. More enlightened institutional arrangements are needed if the community is to receive benefits commensurate with the sizeable co-investments of public and private resources in developing human capital.
Abstract:
This paper looks at the challenges presented for the Australian Library and Information Association by its role as the professional association responsible for ensuring the quality of Australian library technician graduates. There is a particular focus on the issue of course recognition, where the Association's role is complicated by the need to work alongside the national quality assurance processes that have been established by the relevant technical education authorities. The paper describes the history of course recognition in Australia; examines the relationship between course recognition and other quality measures; and describes the process the Association has undertaken recently to ensure appropriate professional scrutiny in a changing environment of accountability.
Abstract:
Objective: To summarise the extent to which narrative text fields in administrative health data are used to gather information about the event resulting in presentation to a health care provider for treatment of an injury, and to highlight best practice approaches to conducting narrative text interrogation for injury surveillance purposes.----- Design: Systematic review.----- Data sources: Electronic databases searched included CINAHL, Google Scholar, Medline, Proquest, PubMed and PubMed Central. Snowballing strategies were employed by searching the bibliographies of retrieved references to identify relevant associated articles.----- Selection criteria: Papers were selected if the study used a health-related database and if the study objectives were to a) use text fields to identify injury cases or to extract additional information on injury circumstances not available from coded data, b) use text fields to assess the accuracy of coded data fields for injury-related cases, or c) describe methods/approaches for extracting injury information from text fields.----- Methods: The papers identified through the search were independently screened by two authors for inclusion, resulting in 41 papers selected for review. Due to heterogeneity between studies, meta-analysis was not performed.----- Results: The majority of papers reviewed focused on describing injury epidemiology trends using coded data and text fields to supplement coded data (28 papers), with these studies demonstrating the value of text data for providing more specific information beyond what had been coded, to enable case selection or provide circumstantial information. Caveats were expressed in terms of the consistency and completeness of recording of text information, resulting in underestimates when using these data. Four coding validation papers were reviewed, with these studies showing the utility of text data for validating and checking the accuracy of coded data.
Seven studies (9 papers) described methods for interrogating injury text fields for systematic extraction of information, with a combination of manual and semi-automated methods used to refine and develop algorithms for the extraction and classification of coded data from text. Quality assurance approaches to assessing the robustness of the text extraction methods were discussed in only 8 of the epidemiology papers and 1 of the coding validation papers. All of the text interrogation methodology papers described systematic approaches to ensuring the quality of the approach.----- Conclusions: Manual review and coding approaches, text search methods, and statistical tools have been utilised to extract data from narrative text and translate it into usable, detailed injury event information. These techniques can be, and have been, applied to administrative datasets to identify specific injury types and add value to previously coded injury datasets. Only a few studies thoroughly described the methods used for text mining, and fewer than half of the reviewed studies used or described quality assurance methods for ensuring the robustness of the approach. New techniques utilising semi-automated computerised approaches and Bayesian/clustering statistical methods offer the potential to further develop and standardise the analysis of narrative text for injury surveillance.
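As a minimal illustration of the text search methods described in the review, the sketch below flags injury categories in free-text narratives using regular-expression keyword rules; the categories, patterns and example narratives are hypothetical, not taken from any of the reviewed studies.

```python
import re

# Hypothetical keyword rules of the kind used to flag injury mechanisms
# in free-text narrative fields; patterns and categories are illustrative.
RULES = {
    "fall": re.compile(r"\b(fell|fall|slipped|tripped)\b", re.IGNORECASE),
    "burn": re.compile(r"\b(burn|scald|boiling)\w*", re.IGNORECASE),
    "dog_bite": re.compile(r"\bdog\b.*\bbit", re.IGNORECASE),
}

def classify(narrative):
    """Return the set of injury categories whose pattern matches the text."""
    return {cat for cat, pat in RULES.items() if pat.search(narrative)}

print(classify("Pt slipped on wet floor and fell down stairs"))  # {'fall'}
print(classify("scalded hand with boiling water"))               # {'burn'}
```

Rule sets like this are typically developed iteratively against manually coded cases, which is where the quality assurance step the review calls for (measuring sensitivity and positive predictive value against a gold standard) comes in.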
Abstract:
Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and tumour volume definition based on the PET image data, for radiotherapy of lung cancer patients, and then to test these protocols with respect to levels of accuracy and reproducibility. Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of the phantom image based quality assurance study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques; and (3) GTV definition using the PET image data. The developed clinical protocols were tested using retrospective clinical trials to assess levels of inter-user variability which may be attributed to the use of these protocols. A Siemens Somatom Open Sensation 20-slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images. Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols for manipulating the PET images in the treatment planning system, particularly for quantifying uptake in volumes of interest and window levels for accurate geometric visualisation, were determined.
The automatic registration algorithms were found to have sub-voxel levels of accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, while inadequate pre-registration overlap of the CT and PET images was found to result in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value is dependent on target-to-background ratios and the presence of respiratory motion. The results from the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques were found to reduce inter-user variation compared to manual techniques, there was no significant difference in the registration outcomes for transmission or emission scan-based registration of the patient images using the protocol. Tumour volumes contoured on registered patient CT-PET images, using the tested threshold values and viewing windows determined from the phantom study, demonstrated less inter-user variation for the primary tumour volume contours than those contoured using only the patient's planning CT scans. Conclusions: The developed clinical protocols allow a patient's whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer.
Image registration protocols which factor in potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
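The percentage-of-maximum threshold contouring described above can be sketched as follows; the 40% threshold and the synthetic one-dimensional uptake profile are illustrative values, not parameters from the study.

```python
import numpy as np

# Sketch of percentage-of-maximum threshold contouring on a PET uptake
# array; real use would apply this to a 3D volume of interest.
def threshold_contour(uptake, fraction=0.40):
    """Return a boolean mask of voxels at or above fraction * max uptake."""
    return uptake >= fraction * uptake.max()

# Synthetic 1D uptake profile: background around 1, lesion peaking at 10
uptake = np.array([1.0, 1.2, 4.0, 9.5, 10.0, 8.8, 3.5, 1.1])
mask = threshold_contour(uptake)
print(mask.astype(int))  # [0 0 1 1 1 1 0 0]
```

Because the contour depends on the chosen fraction and on the target-to-background ratio, an adaptive scheme (as in the abstract) would tune the fraction per volume of interest rather than fix it globally.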
Abstract:
This dissertation develops the model of a prototype system for the digital lodgement of spatial data sets with statutory bodies responsible for the registration and approval of land related actions under the Torrens Title system. Spatial data pertain to the location of geographical entities together with their spatial dimensions and are classified as point, line, area or surface. This dissertation deals with a sub-set of spatial data, land boundary data, that result from the activities performed by surveying and mapping organisations for the development of land parcels. The prototype system has been developed, utilising an event-driven paradigm for the user-interface, to exploit the potential of digital spatial data being generated from the utilisation of electronic techniques. The system provides for the creation of a digital model of the cadastral network and dependent data sets for an area of interest from hard copy records. This initial model is calibrated on registered control and updated by field survey to produce an amended model. The field-calibrated model is then electronically validated to ensure it complies with standards of format and content. The prototype system was designed specifically to create a database of land boundary data for subsequent retrieval by land professionals for surveying, mapping and related activities. Data extracted from this database are utilised for subsequent field survey operations without the need to create an initial digital model of an area of interest. Statistical reporting of differences resulting when subsequent initial and calibrated models are compared replaces the traditional checking operations of spatial data performed by a land registry office. Digital lodgement of survey data is fundamental to the creation of the database of accurate land boundary data.
This creation of the database is also fundamental to the efficient integration of accurate spatial data about land being generated by modern technology, such as global positioning systems and remote sensing and imaging, with land boundary information and other information held in Government databases. The prototype system developed provides for the delivery of accurate, digital land boundary data for the land registration process to ensure the continued maintenance of the integrity of the cadastre. Such data should also meet the more general and encompassing requirements of, and prove to be of tangible, longer-term benefit to, the developing electronic land information industry.
Abstract:
A Geant4-based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian™ Clinac iX linac. The computer-aided design interface of Geant4 was used to accurately model the LINAC components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic™ film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of radiotherapy treatment plans. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the chair test.
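As a toy illustration of the Monte Carlo particle transport that Geant4 performs in far greater detail, the sketch below estimates photon transmission through a uniform absorber by sampling exponential free paths; the attenuation coefficient and thickness are arbitrary illustrative values, not LINAC commissioning data.

```python
import random

# Minimal Monte Carlo sketch: a photon traverses an absorber if its
# sampled free path exceeds the absorber thickness. The expected
# transmitted fraction is exp(-mu * t), which the estimate converges to.
def transmitted_fraction(mu_per_cm, thickness_cm, n=100_000, seed=1):
    random.seed(seed)
    survived = 0
    for _ in range(n):
        path = random.expovariate(mu_per_cm)  # exponential free path
        if path > thickness_cm:
            survived += 1
    return survived / n

frac = transmitted_fraction(mu_per_cm=0.2, thickness_cm=5.0)
print(round(frac, 2))  # close to exp(-1) ~ 0.37
```

A full Geant4 simulation replaces this single exponential-attenuation step with detailed interaction physics, geometry navigation through the modelled LINAC head and MLC leaves, and scoring of dose in the phantom, but the underlying sampling idea is the same.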