993 results for SANITARY QUALITY
Abstract:
Several authors stress that data provides a crucial foundation for operational, tactical and strategic decisions (e.g., Redman 1998, Tee et al. 2007). Data provides the basis for decision making, as data collection and processing are typically associated with reducing uncertainty in order to make more effective decisions (Daft and Lengel 1986). While the first series of investments in Information Systems/Information Technology (IS/IT) in organizations improved data collection, restricted computational capacity and limited processing power created challenges (Simon 1960). Fifty years on, capacity and processing problems are increasingly less relevant; in fact, the opposite problem exists. Determining data relevance and usefulness is complicated by increased data capture and storage capacity, as well as continual improvements in information processing capability. As the IT landscape changes, businesses are inundated with ever-increasing volumes of data from both internal and external sources, available on both an ad-hoc and real-time basis. More data, however, does not necessarily translate into more effective and efficient organizations, nor does it increase the likelihood of better or timelier decisions. This raises questions about what data managers require to assist their decision-making processes.
Abstract:
[Quality Management in Construction Projects by Abdul Razzak Rumane, CRC Press, Boca Raton, FL, 2011, 434 pp, ISBN 9781439838716] Issues of quality management, quality control and performance against specification have long been the focus of various business sectors. Recently there has been an additional drive to achieve the continuous improvement and customer satisfaction promised by the 20th-century ‘gurus’ some six or seven decades ago. The engineering and construction industries have generally taken somewhat longer than their counterparts in the manufacturing, service and production sectors to achieve these espoused levels of quality. The construction and engineering sectors stand to realize major rewards from better managing quality in projects. More effort is being put into instructing future participants in the industry as well as assisting existing professionals. This book comes at an opportune time.
Abstract:
Collaborative question answering (cQA) portals such as Yahoo! Answers allow users, as askers or answer authors, to communicate and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it is advantageous to have users' answers ranked according to their quality. This paper proposes a novel approach to evaluating and ranking users' answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its author's reputation level in the network. The proposed approach is evaluated on a dataset collected from a live cQA, namely Yahoo! Answers. To compare the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content and content-based scores were also used in comparing results. Empirical analysis shows that the proposed method is able to rank users' answers and recommend the top-n answers with good accuracy. The results of the proposed method outperform the content-based methods, their various combinations, and the results obtained by the popular link analysis method, HITS.
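The reputation-based ranking described above can be sketched roughly as follows. The use of a smoothed best-answer fraction as the reputation score, and all function and field names, are illustrative assumptions for this sketch, not the paper's actual formulation:

```python
from collections import defaultdict

def reputation_scores(answers, smoothing=1.0):
    """Estimate each author's reputation as the smoothed fraction of their
    answers that were selected as 'best' (a hypothetical proxy for the
    paper's reputation measure)."""
    best = defaultdict(float)
    total = defaultdict(float)
    for a in answers:
        total[a["author"]] += 1
        if a.get("is_best"):
            best[a["author"]] += 1
    # Laplace-style smoothing avoids zero scores for authors with few answers.
    return {u: (best[u] + smoothing) / (total[u] + 2 * smoothing) for u in total}

def rank_answers(question_answers, rep, n=3):
    """Rank a question's answers by their author's reputation; return top-n."""
    ranked = sorted(question_answers,
                    key=lambda a: rep.get(a["author"], 0.0),
                    reverse=True)
    return ranked[:n]
```

In the paper this non-content signal is also combined with content-based quality scores; only the reputation side is shown here.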
Abstract:
This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was undertaken based on selected urban catchments in Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated included average rainfall intensity and rainfall duration, whilst catchment characteristics included land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis was obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data, and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated based on residential catchments and then extended to other land uses. Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three different types, namely, high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3). This provides an innovative alternative to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics.
Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much lower than for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for generating a major fraction of the annual pollutant load compared to the other rainfall event types. Additionally, rainfall events of less than 1-year ARI, such as 6-month ARI, should be considered for treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutant load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in terms of cost-effectiveness, treatment performance and possible savings in the land area needed. This also suggests that the simulation of long-term continuous rainfall events for stormwater treatment design may not be needed and that event-based simulations would be adequate. The investigations into the relationship between catchment characteristics and urban stormwater quality found that, in addition to conventional catchment characteristics such as land use and impervious area percentage, other catchment characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage, would be inadequate. It was also noted that small, uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volume, with the opposite being the case for large mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach.
The investigation of pollutant build-up on different land uses showed that pollutant build-up characteristics vary even within the same land use. Therefore, the conventional approach in stormwater quality modelling, which is based solely on land use, may prove to be inappropriate. Industrial land use showed relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses. However, commercial and residential land uses had relatively higher variations in nutrient and organic carbon build-up. Additionally, it was found that particle size distribution had a relatively higher variability for all three land uses compared to the other build-up parameters. The high variability in particle size distribution for all land uses illustrates the dissimilarities associated with the fine and coarse particle size fractions even within the same land use, and hence the variations in stormwater quality in relation to pollutants adsorbing to different particle sizes.
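The three-way rainfall event typology above can be illustrated with a minimal classifier. The intensity and duration thresholds below are placeholder values, since the thesis derives its cut-offs from the Gold Coast data rather than from fixed constants:

```python
def classify_event(avg_intensity_mm_h, duration_min,
                   intensity_threshold=40.0, duration_threshold=60.0):
    """Classify a rainfall event into the three types described in the thesis.
    Thresholds are illustrative placeholders, not the study's actual cut-offs."""
    high_intensity = avg_intensity_mm_h >= intensity_threshold
    long_duration = duration_min >= duration_threshold
    if high_intensity and not long_duration:
        return "Type 1"  # high average intensity, short duration
    if high_intensity and long_duration:
        return "Type 2"  # high average intensity, long duration
    if not high_intensity and long_duration:
        return "Type 3"  # low average intensity, long duration
    return "unclassified"  # low intensity, short duration: outside the typology
```

Type 1 events matter most under this typology, as they cumulatively generate a major fraction of the annual pollutant load.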
Abstract:
Prevailing video adaptation solutions change the quality of the video uniformly throughout the whole frame in the bitrate adjustment process, whereas region-of-interest (ROI)-based solutions selectively retain quality in the areas of the frame to which viewers are more likely to pay attention. ROI-based coding can improve perceptual quality and viewer satisfaction while trading off some bandwidth. However, there has so far been no comprehensive study measuring the bitrate vs. perceptual quality trade-off. This paper proposes an ROI detection scheme for videos, characterized by low computational complexity and robustness, and measures the bitrate vs. quality trade-off for ROI-based encoding using a state-of-the-art H.264/AVC encoder to justify the viability of this type of encoding method. The results from the subjective quality test reveal that ROI-based encoding achieves a significant perceptual quality improvement over encoding with uniform quality at the cost of slightly more bits. Based on the bitrate measurements and subjective quality assessments, bitrate and perceptual quality estimation models for non-scalable ROI-based video coding (AVC) are developed, which are found to be similar to the models for scalable video coding (SVC).
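ROI-based encoding of the kind evaluated here is commonly realised by varying the quantisation parameter (QP) per macroblock: a lower QP inside the ROI yields better quality there at the cost of extra bits. A minimal sketch, assuming a single rectangular ROI and a fixed QP offset (both hypothetical simplifications; the paper's scheme detects the ROI automatically):

```python
def qp_map(width_mb, height_mb, roi, base_qp=32, roi_qp_offset=-6):
    """Build a per-macroblock QP map: lower QP (finer quantisation, better
    quality) inside the ROI rectangle, base QP elsewhere. Illustrative only;
    not the paper's actual encoder configuration."""
    x0, y0, x1, y1 = roi  # ROI in macroblock coordinates, half-open ranges
    return [[base_qp + (roi_qp_offset if x0 <= x < x1 and y0 <= y < y1 else 0)
             for x in range(width_mb)]
            for y in range(height_mb)]
```

Such a map could then be fed to an H.264/AVC encoder that supports per-macroblock QP control.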
Abstract:
To evaluate the effect of soft contact lens type on the in vivo tear film surface quality (TFSQ) on daily disposable lenses and to establish whether two recently developed techniques for noninvasive measurement of TFSQ can distinguish between different contact lens types.
Abstract:
In the past fifteen years, increasing attention has been given to the role of Vocational Education and Training (VET) in attracting large numbers of international students and its contribution to the economic development of Australia. This trend has given rise to many challenges in vocational education, especially with regard to providing quality education that ensures international students' stay in Australia is a satisfactory experience. Teachers are key stakeholders in international education and share responsibility for ensuring international students gain quality learning experiences and positive outcomes. However, the challenges and needs of these teachers are generally not well understood. Therefore, this paper draws on the dilemmas faced by teachers of international students, associated with professional, personal, ethical and educational aspects. It reports on a Masters research project designed to investigate the dilemmas that teachers of international students face in VET in Australia, particularly in Brisbane. The study uses a qualitative approach within the interpretive constructivist paradigm to gain real-life insights through responsive interviewing and inductive data analysis. While data collection has been completed, data analysis is in progress. Responsive interviews with VET teachers of different academic and national backgrounds, ages and industry experience have identified particular understandings, ideologies and representations of what it means to be a teacher in today's multicultural VET environment, provoking both resistance and new pedagogical understandings of teacher dilemmas and their work environment through the eyes of teachers of international students.
The paper considers the challenges for VET practitioners within the VET system while reflecting on the theme of the 2011 AVETRA conference, "Research in VET: Janus: Reflecting Back, Projecting Forward", by focusing particularly on "Rethinking pedagogies and pathways in VET work through the voice of VET workers".
Abstract:
Background: Although physical activity is associated with health-related quality of life (HRQL), the nature of the dose-response relationship remains unclear. This study examined the concurrent and prospective dose-response relationships of total physical activity (TPA), and of walking only, with HRQL in two age cohorts of women. Methods: Participants were 10,698 women born in 1946-1951 and 7,646 born in 1921-1926, who completed three mailed surveys for the Australian Longitudinal Study on Women's Health. They reported weekly TPA minutes (the sum of walking, moderate and vigorous minutes). HRQL was measured with the Medical Outcomes Study Short-Form 36 Health Status Survey (SF-36). Linear mixed models, adjusted for socio-demographic and health-related variables, were used to examine associations between TPA level (none, very low, low, intermediate, sufficient, high and very high) and SF-36 scores. For women who reported walking as their only physical activity, associations between walking and SF-36 scores were also examined. Results: Curvilinear trends were observed between both TPA and walking and SF-36 scores. Concurrently, HRQL scores increased significantly with increasing TPA and walking in both cohorts, with increases less marked above sufficient activity levels. Prospectively, associations were attenuated, although significant and meaningful improvements in physical functioning and vitality were observed across most TPA and walking categories above the low category. Conclusion: For women in their 50s-80s without clinical depression, greater amounts of TPA are associated with better current and future HRQL, particularly physical functioning and vitality. Even when walking is their only activity, women, particularly those in their 70s-80s, have better health-related quality of life with greater amounts of walking.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program's object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program's security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool's capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
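As a flavour of such design-level metrics, the sketch below computes one simple, hypothetical measure: the fraction of security-classified attributes that are not private, and so potentially expose information outside their class. It is written in the spirit of the thesis's encapsulation-based metrics, not as its exact definitions:

```python
def attribute_exposure(classes):
    """Design-level metric sketch: proportion of classified (security-
    critical) attributes that are non-private, i.e. potentially readable
    from outside their declaring class. Lower is better. Illustrative
    only; not the thesis's actual metric set."""
    critical = exposed = 0
    for cls in classes:
        for attr in cls["attributes"]:
            if attr.get("classified"):
                critical += 1
                if attr.get("visibility", "private") != "private":
                    exposed += 1
    return exposed / critical if critical else 0.0
```

Because it needs only class diagrams (attribute names, classification and visibility), a metric of this shape can be computed from UML design artifacts before any code exists, which is the early-detection point the abstract emphasises.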
Abstract:
Until recently, standards to guide nursing education and practice in Vietnam were nonexistent. This paper describes the development and implementation of a clinical teaching capacity building project piloted in Hanoi, Vietnam. The project was part of a multi-component capacity building program designed to improve nurse education in Vietnam. The objectives of the project were to develop a collaborative, clinically-based teaching model that encourages evidence-based, student-centred clinical learning. The model incorporated strategies to promote the development of nursing practice to meet national competency standards. Thirty nurse teachers from two organisations in Hanoi participated in the program. These participants attended three workshops and completed applied assessments in which they implemented concepts from each workshop. The assessment tasks were planning, implementing and evaluating clinical teaching. On completion of the workshops, twenty participants undertook a study tour in Australia to refine the teaching model and develop an action plan for model implementation in both organisations, with the aim of disseminating the model across Vietnam. Significant changes attributed to this project have been noted at both individual and organisational levels. Dissemination of this clinical teaching model has commenced in Ho Chi Minh City, with further plans for more in-depth dissemination throughout the country.
Abstract:
Urban stormwater quality is multifaceted, and the use of a limited number of factors to represent catchment characteristics may not be adequate to explain the complexity of water quality response to a rainfall event or site-to-site differences in stormwater quality modelling. This paper presents the outcomes of a research study which investigated the adequacy of using land use and impervious area fraction only to represent catchment characteristics in urban stormwater quality modelling. The research outcomes confirmed the inadequacy of these two parameters alone for representing urban catchment characteristics in stormwater quality prediction. Urban form also needs to be taken into consideration, as it was found to have an important impact on stormwater quality by influencing pollutant generation, build-up and wash-off. Urban form refers to characteristics of an urban development such as road layout, spatial distribution of urban areas and urban design features.
Abstract:
It would be a rare thing to visit an early years setting or classroom in Australia that does not display examples of young children’s artworks. This practice serves to give schools a particular ‘look’, but is no guarantee of quality art education. The Australian National Review of Visual Arts Education (NRVE) (2009) has called for changes to visual art education in schools. The planned new National Curriculum includes the arts (music, dance, drama, media and visual arts) as one of the five learning areas. Research shows that it is the classroom teacher that makes the difference, and teacher education has a large part to play in reforms to art education. This paper provides an account of one foundation unit of study (Unit 1) for first year university students enrolled in a 4-year Bachelor degree program who are preparing to teach in the early years (0–8 years). To prepare pre-service teachers to meet the needs of children in the 21st century, Unit 1 blends old and new ways of seeing art, child and pedagogy. Claims for the effectiveness of this model are supported with evidence-based research, conducted over the six years of iterations and ongoing development of Unit 1.
Abstract:
Objectives: To measure tear film surface quality (TFSQ) using dynamic high-speed videokeratoscopy during short-term (8 hours) use of rigid and soft contact lenses. Methods: A group of fourteen subjects wore three different types of contact lenses on three different non-consecutive days (order randomized) in one eye only. Subjects were screened to exclude those with dry eye. The lenses included a PMMA hard lens, an RGP lens (Boston XO) and a soft silicone hydrogel lens. Three 30-second high-speed videokeratoscopy recordings were taken with contact lenses in situ, in the morning and again after 8 hours of contact lens wear, in both normal and suppressed blinking conditions. Recordings were also made on a baseline day with no contact lens wear. Results: The presence of a contact lens in the eye had a significant effect on the mean TFSQ in both natural and suppressed blinking conditions (p=0.001 and p=0.01 respectively, repeated measures ANOVA). TFSQ was worse with all the lenses compared to no lens in the eye in the afternoon, during both normal and suppressed blinking conditions (all p<0.05). In natural blinking conditions, the mean TFSQ for the PMMA and RGP lenses was significantly worse than on the baseline day (no lens) for both morning and afternoon measures (p<0.05). Conclusions: This study shows that both rigid and soft contact lenses adversely affect TFSQ in both natural and suppressed blinking conditions. No significant differences were found between the lens types and materials. Keywords: tear film surface quality, rigid contact lens, soft contact lens, dynamic high-speed videokeratoscopy
Abstract:
The National Road Safety Strategy 2011-2020 outlines plans to reduce the burden of road trauma via improvements and interventions relating to safe roads, safe speeds, safe vehicles and safe people. It also highlights that a key aspect of achieving these goals is the availability of comprehensive data on the issue. The use of data is essential so that more in-depth epidemiologic studies of risk can be conducted and so that road safety interventions and programs can be effectively evaluated. Before data are used to evaluate the efficacy of prevention programs, a systematic evaluation of the quality of the underlying data sources should be undertaken to ensure that any identified trends reflect true estimates rather than spurious data effects. However, there has been little scientific work specifically focused on establishing core data quality characteristics pertinent to the road safety field, and limited work undertaken to develop methods for evaluating data sources according to these core characteristics. There are a variety of data sources in which traffic-related incidents and resulting injuries are recorded, each collected for a defined purpose. These include police reports, transport safety databases, emergency department data, hospital morbidity data and mortality data, to name a few. However, as these data are collected for specific purposes, each of these sources suffers from limitations when seeking to gain a complete picture of the problem. Limitations of current data sources include delays in data becoming available, a lack of accurate and/or specific location information, and the underreporting of crashes involving particular road user groups such as cyclists. This paper proposes core data quality characteristics that could be used to systematically assess road crash data sources, providing a standardised approach for evaluating data quality in the road safety field.
The potential for data linkage to qualitatively and quantitatively improve the quality and comprehensiveness of road crash data is also discussed.
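A standardised assessment of the kind proposed could take the form of a weighted score across the core characteristics. The characteristic names and weights below are illustrative only, since the paper defines its own set:

```python
def assess_source(source_ratings, characteristics):
    """Score a road-crash data source against a set of core data quality
    characteristics, each rated 0-1, with per-characteristic weights.
    Returns a weighted average in [0, 1]. Hypothetical sketch; the paper's
    characteristics and weighting scheme may differ."""
    total_weight = sum(characteristics.values())
    if not total_weight:
        return 0.0
    score = sum(source_ratings.get(name, 0.0) * weight
                for name, weight in characteristics.items())
    return score / total_weight
```

Scoring each candidate source (police reports, hospital morbidity data, and so on) against the same characteristics would allow them, and any linked combinations of them, to be compared on a common scale.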