887 results for Quality models
Abstract:
After the recent prolonged drought conditions in many parts of Australia it is increasingly recognised that many groundwater systems are under stress. Although this is obvious for systems that are utilised for intensive irrigation, many other groundwater systems are also impacted. Management strategies are highly variable to non-existent. Policy and regulation are also often inadequate, and are reactive or politically driven. In addition, there is a wide range of opinion among water users and other stakeholders as to what is “reasonable” management practice. These differences are often related to the “value” that is placed on the groundwater resource. Opinions vary from “our right to free water” to an awareness that without effective management the resource will be degraded. There is also often misunderstanding of surface water-groundwater linkages, recharge processes, and baseflow to drainage systems.
Abstract:
We evaluate the performance of several specification tests for Markov regime-switching time-series models. We consider the Lagrange multiplier (LM) and dynamic specification tests of Hamilton (1996) and Ljung–Box tests based on both the generalized residual and a standard-normal residual constructed using the Rosenblatt transformation. The size and power of the tests are studied using Monte Carlo experiments. We find that the LM tests have the best size and power properties. The Ljung–Box tests exhibit slight size distortions, though tests based on the Rosenblatt transformation perform better than the generalized residual-based tests. The tests exhibit impressive power to detect both autocorrelation and autoregressive conditional heteroscedasticity (ARCH). The tests are illustrated with a Markov-switching generalized ARCH (GARCH) model fitted to the US dollar–British pound exchange rate, with the finding that both autocorrelation and GARCH effects are needed to adequately fit the data.
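As an illustration of the Ljung–Box portmanteau test referred to in the abstract above, here is a minimal sketch in Python (NumPy/SciPy). It applies the test to synthetic residuals rather than to a fitted regime-switching model's generalized residuals; the data, lag choice and AR(1) coefficient are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(resid, lags=10):
    """Ljung-Box portmanteau statistic Q and its chi-squared p-value."""
    r = np.asarray(resid, dtype=float)
    n = r.size
    r = r - r.mean()
    denom = np.sum(r ** 2)
    ks = np.arange(1, lags + 1)
    # sample autocorrelations rho_k for k = 1..lags
    rho = np.array([np.sum(r[k:] * r[:-k]) / denom for k in ks])
    q = n * (n + 2) * np.sum(rho ** 2 / (n - ks))
    return q, chi2.sf(q, df=lags)   # large Q, small p => serial correlation

rng = np.random.default_rng(0)
white = rng.standard_normal(500)            # well-specified residuals
ar = np.zeros(500)
for t in range(1, 500):                     # AR(1) residuals: misspecification
    ar[t] = 0.8 * ar[t - 1] + rng.standard_normal()
q_w, p_w = ljung_box(white)
q_a, p_a = ljung_box(ar)
```

The white-noise residuals should give an unremarkable p-value, while the autocorrelated residuals produce a very large Q and a p-value near zero, which is the pattern a specification test exploits.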
Abstract:
Effective management of groundwater requires stakeholders to have a realistic conceptual understanding of the groundwater systems and hydrological processes. However, groundwater data can be complex, confusing and often difficult for people to comprehend. A powerful way to communicate understanding of groundwater processes, complex subsurface geology and their relationships is through the use of visualisation techniques to create 3D conceptual groundwater models. In addition, the ability to animate, interrogate and interact with 3D models can encourage a higher level of understanding than static images alone. While there are increasing numbers of software tools available for developing and visualising groundwater conceptual models, these packages are often very expensive and, due to their complexity, are not readily accessible to the majority of people. The Groundwater Visualisation System (GVS) is a software framework that can be used to develop groundwater visualisation tools aimed specifically at non-technical computer users and those who are not groundwater domain experts. A primary aim of GVS is to provide management support for agencies, and to enhance community understanding.
Abstract:
Introduction The purpose of this study was to develop, implement and evaluate the impact of an educational intervention, comprising an innovative model of clinical decision-making and an educational delivery strategy for facilitating nursing students' learning and development of competence in paediatric physical assessment practices. Background of the study Nursing students have an undergraduate education that aims to produce graduates of a generalist nature who demonstrate entry-level competence for providing nursing care in a variety of health settings. Consistent with population morbidity and health care roles, paediatric nursing concepts typically form a comparatively small part of undergraduate curricula and students' exposure to paediatric physical assessment concepts and principles is brief. However, the nursing shortage has changed traditional nursing employment patterns and new graduates form the majority of the recruitment pool for paediatric nursing speciality staff. Paediatric nursing is a popular career choice for graduates and anecdotal evidence suggests that nursing students who select a clinical placement in their final year intend to seek employment in paediatrics upon graduation. Although concepts of paediatric nursing are included within the undergraduate curriculum, students' ability to develop the required habits of mind to practise in what is still regarded as a speciality area of practice is somewhat limited. One of the areas of practice where this particularly impacts is paediatric nursing physical assessment. Physical assessment is a fundamental component of nursing practice and competence in this area is central to nursing students' development of clinical capability for practice as a registered nurse. Timely recognition of physiologic deterioration of patients is a key outcome of nurses' competent use of physical assessment strategies, regardless of the practice context.
In paediatric nursing contexts children's physical assessment practices must specifically accommodate the child's different physiological composition, function and pattern of clinical deterioration (Hockenberry & Barrera, 2007). Thus, to effectively manage physical assessment of patients within the paediatric practice setting nursing students need to integrate paediatric nursing theory into their practice. This requires significant information processing and it is in this process where students are frequently challenged. The provision of rules or models can guide practice and assist novice-level nurses to develop their capabilities (Benner, 1984; Benner, Hooper-Kyriakidis & Stannard, 1999). Nursing practice models are cognitive tools that represent simplified patterns of expert analysis employing concepts that suit the limited reasoning of the inexperienced, and can represent the 'rules' referred to by Benner (1984). Without a practice model of physical assessment students are likely to be uncertain about how to proceed with data collection, the interpretation of paediatric clinical findings and the appraisal of findings. These circumstances can result in ad hoc and unreliable nursing physical assessment that forms a poor basis for nursing decisions. The educational intervention developed as part of this study sought to resolve this problem and support nursing students' development of competence in paediatric physical assessment. Methods This study utilised the Context Input Process Product (CIPP) Model by Stufflebeam (2004) as the theoretical framework that underpinned the research design and evaluation methodology. Each of the four elements in the CIPP model was utilised to guide a discrete stage of this study. The Context element informed design of the clinical decision-making process, the Paediatric Nursing Physical Assessment (PNPA) model.
The Input element was utilised in appraising relevant literature, identifying an appropriate instructional methodology to facilitate learning and educational intervention delivery to undergraduate nursing students, and development of program content (the CD-ROM kit). Study One employed the Process element and used expert panel approaches to review and refine instructional methods, identifying potential barriers to obtaining an effective evaluation outcome. The Product element guided design and implementation of Study Two, which was conducted in two phases. Phase One employed a quasi-experimental between-subjects methodology to evaluate the impact of the educational intervention on nursing students' clinical performance and self-appraisal of practices in paediatric physical assessment. Phase Two employed a thematic analysis and explored the experiences and perspectives of a sample subgroup of nursing students who used the PNPA CD-ROM kit as preparation for paediatric clinical placement. Results Results from the Process review in Study One indicated that the prototype CD-ROM kit containing the PNPA model met the predetermined benchmarks for face validity and the impact evaluation instrumentation had adequate content validity in comparison with predetermined benchmarks. In the first phase of Study Two the educational intervention did not result in statistically significant differences in measures of student performance or self-appraisal of practice. However, in Phase Two qualitative commentary from students, and from the expert panel who reviewed the prototype CD-ROM kit (Study One, Phase One), strongly endorsed the quality of the intervention and its potential for supporting learning. This raises questions regarding transfer of learning, and it is likely that, within this study, several factors influenced students' transfer of learning from the educational intervention to the clinical practice environment, where outcomes were measured.
Conclusion In summary, the educational intervention employed in this study provides insights into the potential that e-learning approaches offer for delivering authentic learning experiences to undergraduate nursing students. Findings in this study raise important questions regarding possible pedagogical influences on learning outcomes, issues within the transfer of theory to practice, and factors that may have influenced findings within the context of this study. This study makes a unique contribution to nursing education, specifically with respect to progressing an understanding of the challenges faced in employing instructive methods to impact upon nursing students' development of competence. The important contribution that transfer of learning processes make to students' transition into the professional practice context and to their development of competence within the context of speciality practice is also highlighted. This study contributes to a greater awareness of the complexity of translating theoretical learning at undergraduate level into clinical practice, particularly within speciality contexts.
Abstract:
Interferometry is a sensitive technique for recording tear film surface irregularities in a noninvasive manner. At the same time, the technique is hindered by natural eye movements resulting in measurement noise. Estimating tear film surface quality from interferograms can be reduced to a spatial-average-localized weighted estimate of the first harmonic of the interference fringes. However, previously reported estimation techniques proved to perform poorly in cases where the pattern fringes were significantly disturbed. This can occur when measuring tear film surface quality on a contact lens on the eye or in a dry eye. We present a new estimation technique for extracting the first harmonic from the interference fringes that combines traditional spectral estimation techniques with morphological image processing techniques. The proposed technique proves to be more robust to changes in interference fringes caused by natural eye movements and the degree of dryness of the contact lens and corneal surfaces than its predecessors, resulting in tear film surface quality estimates that are less noisy.
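The abstract above reduces tear film quality estimation to the strength of the first harmonic of the interference fringes. A toy sketch of that idea (not the paper's combined spectral/morphological estimator): take the magnitude of the dominant non-DC Fourier component of a fringe profile, normalised by total non-DC spectral energy, as a simple fringe-regularity proxy. The synthetic signals, fringe frequency and noise level below are all assumptions for illustration.

```python
import numpy as np

def first_harmonic_strength(signal):
    """Magnitude of the dominant non-DC Fourier component, normalised by
    total non-DC spectral energy: close to 1 for clean regular fringes."""
    spec = np.abs(np.fft.rfft(signal - np.mean(signal)))
    spec[0] = 0.0                     # discard the DC term
    return spec.max() / spec.sum()

x = np.linspace(0.0, 1.0, 512, endpoint=False)
clean = np.cos(2 * np.pi * 20 * x)                 # regular fringe profile
rng = np.random.default_rng(1)
noisy = clean + 0.8 * rng.standard_normal(x.size)  # disturbed fringes
```

A clean fringe profile concentrates almost all spectral energy in one bin, so the ratio is near 1; disturbed fringes spread energy across the spectrum and the ratio drops, mimicking the quality degradation the abstract describes.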
Abstract:
Mainstream business process modelling techniques promote a design paradigm wherein the activities to be performed within a case, together with their usual execution order, form the backbone of a process model, on top of which other aspects are anchored. This paradigm, while effective in standardised and production-oriented domains, shows some limitations when confronted with processes where case-by-case variations and exceptions are the norm. In this thesis we develop the idea that the effective design of flexible process models calls for an alternative modelling paradigm, one in which process models are modularised along key business objects, rather than along activity decompositions. The research follows a design science method, starting from the formulation of a research problem expressed in terms of requirements, and culminating in a set of artifacts that have been devised to satisfy these requirements. The main contributions of the thesis are: (i) a meta-model for object-centric process modelling incorporating constructs for capturing flexible processes; (ii) a transformation from this meta-model to an existing activity-centric process modelling language, namely YAWL, showing the relation between object-centric and activity-centric process modelling approaches; and (iii) a Coloured Petri Net that captures the semantics of the proposed meta-model. The meta-model has been evaluated using a framework consisting of a set of workflow patterns. Moreover, the meta-model has been embodied in a modelling tool that has been used to capture two industrial scenarios.
Abstract:
Quality has been an important factor for shopping centers in competitive conditions. However, there is no standard for measuring quality. In Surabaya, only two regional shopping centers were measured in this research. The objective is to assess the quality of shopping center buildings using the Analytical Hierarchy Process (AHP) method and to calculate a Building Quality Index (BQI). An overall ranking of hierarchy priorities of the quality criteria was obtained from the AHP analysis. Access and Circulation was the highest priority affecting the quality of shopping center buildings according to respondents' perception of quality. The weighted values resulting from the comparison between the two shopping centers were as follows: Tunjungan Plaza scored 0.732 points and Surabaya Plaza scored 0.268 points, so the first shopping center received a higher weight than the second. The BQI for Tunjungan Plaza is 66% and for Surabaya Plaza is 64%.
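The AHP step described above derives priority weights for quality criteria from pairwise comparisons. A minimal sketch, assuming a hypothetical 3×3 Saaty-scale comparison matrix (the judgements are invented for illustration, not the survey data used in the study): the principal eigenvector of the matrix gives the priority weights, and the consistency ratio checks that the judgements are coherent.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons of three criteria
# (e.g. Access & Circulation dominating two others); values are invented.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue lambda_max
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                        # priority weights, summing to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # consistency ratio (RI = 0.58 for n = 3)
```

For a perfectly consistent matrix lambda_max equals n and the consistency ratio is 0; AHP practice typically accepts judgements with a ratio below 0.1.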
Abstract:
This paper shows how the power quality can be improved in a microgrid that is supplying a nonlinear and unbalanced load. The microgrid contains a hybrid combination of inertial and converter-interfaced distributed generation units where a decentralized power sharing algorithm is used to control its power management. One of the distributed generators in the microgrid is used as a power quality compensator for the unbalanced and harmonic load. The current reference generation for power quality improvement takes into account the active and reactive power to be supplied by the micro source which is connected to the compensator. Depending on the power requirement of the nonlinear load, the proposed control scheme can change modes of operation without any external communication interfaces. The compensator can operate in two modes depending on the entire power demand of the unbalanced nonlinear load. The proposed control scheme can even compensate system unbalance caused by the single-phase micro sources and load changes. The efficacy of the proposed power quality improvement control method in such a microgrid is validated through extensive simulation studies using PSCAD/EMTDC software with detailed dynamic models of the micro sources and power electronic converters.
Abstract:
This paper describes control methods for proper load sharing between parallel converters connected in a microgrid and supplied by distributed generators (DGs). It is assumed that the microgrid spans a large area and supplies loads in both grid-connected and islanded modes. A control strategy is proposed to improve power quality and ensure proper load sharing in both islanded and grid-connected modes. It is assumed that each of the DGs has a local load connected to it which can be unbalanced and/or nonlinear. The DGs compensate the effects of unbalance and nonlinearity of the local loads. Common loads are also connected to the microgrid, which are supplied by the utility grid under normal conditions. However, during islanding, each of the DGs supplies its local load and shares the common load through droop characteristics. Both impedance and motor loads are considered to verify the system response. The efficacy of the controller has been validated through simulation for various operating conditions using PSCAD. It has been found through simulation that the total harmonic distortion (THD) of the microgrid voltage is about 10% and the negative and zero sequence components are around 20% of the positive sequence component before compensation. After compensation, the THD remains below 0.5%, whereas the negative and zero sequence components of the voltages remain below 0.02% of the positive sequence component.
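The THD figures quoted above follow from the standard definition: the RMS of the harmonics above the fundamental, divided by the fundamental RMS. A minimal sketch with hypothetical per-harmonic voltages chosen to mimic the reported ~10% before and <0.5% after compensation (the numeric values themselves are invented, not taken from the paper's simulations):

```python
import numpy as np

def thd(v_rms):
    """Total harmonic distortion: RMS of harmonics 2..N over the fundamental.
    v_rms[0] is the fundamental; v_rms[1:] are the higher harmonics."""
    v = np.asarray(v_rms, dtype=float)
    return np.sqrt(np.sum(v[1:] ** 2)) / v[0]

# invented per-harmonic RMS voltages mimicking the reported figures
before = [230.0, 14.0, 18.0, 6.0]   # roughly 10% THD before compensation
after = [230.0, 0.7, 0.8, 0.3]      # below 0.5% THD after compensation
```

In practice the harmonic magnitudes would come from an FFT of the simulated voltage waveform; the ratio itself is independent of how the spectrum is obtained.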
Abstract:
Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers only regard certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal. Firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. 
This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
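One concrete consequence of the importance-based progressive rendering described above is budgeting more samples for visually important regions. A minimal sketch of such proportional allocation, assuming per-region importance scores have already been computed (e.g. by a fuzzy model like the one the thesis presents); the largest-remainder rounding rule is a generic choice for illustration, not the thesis's specific refinement strategy.

```python
import numpy as np

def allocate_samples(importance, budget):
    """Split an integer ray budget across regions in proportion to their
    visual importance (largest-remainder rounding keeps the total exact)."""
    w = np.asarray(importance, dtype=float)
    raw = w / w.sum() * budget
    n = np.floor(raw).astype(int)
    # hand leftover samples to the regions with the largest fractional parts
    for i in np.argsort(raw - n)[::-1][: budget - n.sum()]:
        n[i] += 1
    return n
```

A progressive renderer would call this per refinement pass, so high-importance regions converge first while peripheral regions receive only coarse sampling.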
Abstract:
This article presents a survey of authorisation models and considers their ‘fitness-for-purpose’ in facilitating information sharing. Network-supported information sharing is an important technical capability that underpins collaboration in support of dynamic and unpredictable activities such as emergency response, national security, infrastructure protection, supply chain integration and emerging business models based on the concept of a ‘virtual organisation’. The article argues that present authorisation models are inflexible and poorly scalable in such dynamic environments due to their assumption that the future needs of the system can be predicted, which in turn justifies the use of persistent authorisation policies. The article outlines the motivation and requirement for a new flexible authorisation model that addresses the needs of information sharing. It proposes that a flexible and scalable authorisation model must allow an explicit specification of the objectives of the system and access decisions must be made based on a late trade-off analysis between these explicit objectives. A research agenda for the proposed Objective-based Access Control concept is presented.
Abstract:
The type and quality of youth identities ascribed to young people living in residual housing areas present opportunities for action as well as structural constraints. In this book three ethnographies, based on a youth work practitioner's observations, interviews and participation in local networks, identify young people's resistant identities. Through an analysis of social exclusion, youth policies and interviews with young people, youth workers and their managers, the book outlines a contingent network of relationships that hinder informal learning. Globalisation, individualisation, welfare/education reform and the rise of cultural social movements act upon youth identities and steer youth policies to subordinate the notion of informal group learning. Drawing on Castells' and Touraine's sociological models of identity, the book explores youth as a category of time and residual housing areas as a category of space, as they pertain to local dynamics of social exclusion.
Abstract:
Games and related virtual environments have been a much-hyped area of the entertainment industry. The classic quote is that games are now approaching the size of Hollywood box office sales [1]. Books are now appearing that talk up the influence of games on business [2], and it is one of the key drivers of present hardware development. Some of this 3D technology is now embedded right down at the operating system level via the Windows Presentation Foundation – hit Windows/Tab on your Vista box to find out... In addition to this continued growth in the area of games, there are a number of factors that impact its development in the business community. Firstly, the average age of gamers is approaching the mid thirties. Therefore, a number of people who are in management positions in large enterprises are experienced in using 3D entertainment environments. Secondly, due to the pressure of demand for more computational power in both CPUs and Graphics Processing Units (GPUs), your average desktop, or any decent laptop, can run a game or virtual environment. In fact, the demonstrations at the end of this paper were developed at the Queensland University of Technology (QUT) on a standard Software Operating Environment, with an Intel Dual Core CPU and basic Intel graphics option. What this means is that the potential exists for the easy uptake of such technology because 1. a broad range of workers is regularly exposed to 3D virtual environment software via games; and 2. present desktop computing power is now strong enough to potentially roll out a virtual environment solution across an entire enterprise. We believe such visual simulation environments can have a great impact in the area of business process modeling.
Accordingly, in this article we will outline the communication capabilities of such environments, giving fantastic possibilities for business process modeling applications, where enterprises need to create, manage, and improve their business processes, and then communicate their processes to stakeholders, both process and non-process cognizant. The article then concludes with a demonstration of the work we are doing in this area at QUT.
Abstract:
The performance of iris recognition systems is significantly affected by segmentation accuracy, especially in non-ideal iris images. This paper proposes an improved method to localise non-circular iris images quickly and accurately. Shrinking and expanding active contour methods are consolidated when localising the inner and outer iris boundaries. First, the pupil region is roughly estimated based on histogram thresholding and morphological operations. Thereafter, a shrinking active contour model is used to precisely locate the inner iris boundary. Finally, the estimated inner iris boundary is used as an initial contour for an expanding active contour scheme to find the outer iris boundary. The proposed scheme is robust in finding the exact iris boundaries of non-circular and off-angle irises. In addition, occlusions of the iris images from eyelids and eyelashes are automatically excluded from the detected iris region. Experimental results on the CASIA v3.0 iris database indicate the accuracy of the proposed technique.
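The "rough pupil estimation" step above (histogram thresholding plus morphological operations) can be sketched as follows in Python with NumPy/SciPy, run here on a synthetic eye image. The dark-pixel quantile and 3×3 structuring element are assumptions, and this illustrates the general idea rather than the paper's implementation; in the paper the resulting region seeds a shrinking active contour.

```python
import numpy as np
from scipy import ndimage

def estimate_pupil(img, dark_frac=0.03):
    """Rough pupil localisation: threshold the darkest pixels, clean the
    mask with a morphological opening, return the largest blob's centroid."""
    mask = img <= np.quantile(img, dark_frac)         # histogram threshold
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(mask)                   # connected components
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    biggest = int(np.argmax(sizes)) + 1
    return ndimage.center_of_mass(labels == biggest)  # (row, col)

# synthetic "eye": bright background with a dark pupil disc centred at (60, 80)
yy, xx = np.mgrid[0:128, 0:160]
img = np.full((128, 160), 200.0)
img[(yy - 60) ** 2 + (xx - 80) ** 2 <= 15 ** 2] = 20.0
cy, cx = estimate_pupil(img)
```

On the synthetic image the recovered centroid lands on the disc centre; on real iris images the opening step is what suppresses dark eyelash pixels before the largest blob is taken as the pupil candidate.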