444 results for Graphic consistency


Relevance:

10.00%

Publisher:

Abstract:

In this paper we describe the design of DNA Jewellery, which is a wearable tangible data representation of personal DNA profile data. An iterative design process was followed to develop a 3D form-language that could be mapped to standard DNA profile data, with the aim of retaining readability of data while also producing an aesthetically pleasing and unique result in the area of personalized design. The work explores design issues with the production of data tangibles, contributes to a growing body of research exploring tangible representations of data and highlights the importance of approaches that move between technology, art and design.
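
As a concrete illustration of a data-to-form mapping of this kind, the following Python sketch maps a standard DNA profile (STR loci with allele repeat counts) to simple bead geometry. The loci, values and mapping rules are invented for illustration and are not the form-language developed in the paper.

# A hypothetical mapping from standard DNA profile data (STR locus -> allele
# repeat counts) to printable bead geometry. Keeping the mapping monotonic
# preserves readability: more repeats always yields a taller bead.
profile = {
    "D3S1358": (15, 17),
    "vWA": (14, 16),
    "FGA": (21, 24),
    "TH01": (6, 9),
}

def locus_to_bead(locus, alleles, base_radius_mm=2.0, height_per_repeat_mm=0.4):
    """Map one locus to the parameters of a 3D-printable bead."""
    a1, a2 = alleles
    return {
        "locus": locus,
        "height_mm": (a1 + a2) * height_per_repeat_mm,     # total repeats -> height
        "radius_mm": base_radius_mm + abs(a1 - a2) * 0.2,  # allele spread -> girth
    }

for locus, alleles in profile.items():
    print(locus_to_bead(locus, alleles))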

Relevance:

10.00%

Publisher:

Abstract:

Reframe is changing our approach to the evaluation of courses, units, teaching and student experience at QUT. This graphic image represents the evaluation framework and its purpose in a single page.

Relevance:

10.00%

Publisher:

Abstract:

Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as system specification and a corresponding workflow model used as implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions take exponential time to compute and yield only a Boolean result. In many cases, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition of sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
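
For illustration, here is a minimal Python sketch of the profile idea, assuming behaviour is given as sets of traces: it derives order, interleaving and exclusiveness relations between activity pairs and quantifies consistency as the fraction of shared pairs on which two profiles agree. The relation symbols and the consistency measure are simplified stand-ins; the article's causal profiles (which add causality/co-occurrence) and the efficient structural-decomposition computation are not reproduced here.

from itertools import product

def behavioural_profile(traces):
    """Derive behavioural-profile-style relations between activity pairs
    from a set of traces (each trace is a sequence of activity labels).

    Relations:
      '->' strict order   (a before b in some trace, never b before a)
      '<-' reverse strict order
      '||' interleaving   (both orders observed)
      '+'  exclusiveness  (a and b never co-occur in a trace)
    """
    activities = {a for t in traces for a in t}
    # Record which ordered pairs (a, b) occur with a before b in some trace.
    before = set()
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                before.add((a, b))
    profile = {}
    for a, b in product(activities, repeat=2):
        if a == b:
            continue
        ab, ba = (a, b) in before, (b, a) in before
        profile[(a, b)] = "||" if (ab and ba) else "->" if ab else "<-" if ba else "+"
    return profile

def consistency_degree(profile_1, profile_2):
    """Fraction of shared activity pairs on which two profiles agree --
    a simple quantification of behavioural deviation (0..1)."""
    shared = profile_1.keys() & profile_2.keys()
    if not shared:
        return 1.0
    return sum(profile_1[p] == profile_2[p] for p in shared) / len(shared)

log = [["a", "b", "c"], ["a", "c", "b"]]   # observed executions
spec = [["a", "b", "c"]]                    # normative model behaviour
print(consistency_degree(behavioural_profile(log), behavioural_profile(spec)))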

Relevance:

10.00%

Publisher:

Abstract:

Identification of behavioural contradictions is an important aspect of software engineering, in particular for checking the consistency between a business process model used as system specification and a corresponding workflow model used as implementation. In this paper, we propose causal behavioural profiles as the basis for a consistency notion; these profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities. Existing notions of behavioural equivalence, such as bisimulation and trace equivalence, might also be applied as consistency notions, but they take exponential time to compute. Our novel concept of causal behavioural profiles provides a weaker behavioural consistency notion that can be computed efficiently using structural decomposition techniques for sound free-choice workflow systems, provided that unstructured net fragments are acyclic or can be traced back to S- or T-nets.

Relevance:

10.00%

Publisher:

Abstract:

Multiculturalism in design advocates that people's beliefs and cultures should be placed at the centre of design processes, and that designers should be capable of addressing cultural diversity through multiple ways of thinking. However, contemporary design discourses seem to be neither philosophically inclusive nor practically applicable in multicultural contexts. This paper theoretically reviews three predominant metaphysical thinking frameworks found in many multicultural societies: Dualism, Monism and Holism, and argues for the possibility of culturally inclusive design.

Relevance:

10.00%

Publisher:

Abstract:

The Child Feeding Questionnaire (CFQ) developed by Birch and colleagues (2001) is a widely used tool for measuring parental feeding beliefs, attitudes and practices. However, the appropriateness of the CFQ for use with Chinese populations is unknown. This study tested the construct validity of a novel Chinese version of the CFQ using confirmatory factor analysis (CFA). Participants included a convenience sample of 254 Chinese-Australian mothers of children aged 1–4 years. Prior to testing, the questionnaire was translated into Chinese using a translation-back-translation method, one item was re-worded to be culturally appropriate, a new item was added (monitoring), and five items that were not age-appropriate for the sample were removed. Based on previous literature, both a 7-factor and an 8-factor model were assessed via CFA. Results showed that the 8-factor model, which separated restriction and use of food rewards, improved the conceptual clarity of the constructs and provided a good fit to the data. Internal consistency of all eight factors was acceptable (Cronbach’s α: 0.60–0.93). This modified 8-factor CFQ appears to be a linguistically and culturally appropriate instrument for assessing feeding beliefs and practices in Chinese-Australian mothers of young children.
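
For reference, the internal consistency statistic quoted above is Cronbach's α. A minimal sketch of the computation follows; the item scores are invented, not CFQ data.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented example: 6 mothers answering a 4-item factor on a 5-point scale.
scores = [
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 4, 4],
    [3, 2, 2, 3],
    [4, 4, 5, 4],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(scores), 2))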

Relevance:

10.00%

Publisher:

Abstract:

Background: Quality of life (QOL) measures are an important patient-relevant outcome measure for clinical studies. Currently there is no fully validated cough-specific QOL measure for paediatrics. The objective of this study was to validate a cough-specific QOL questionnaire for paediatric use. Method: 43 children (28 males, 15 females; median age 29 months, IQR 20–41 months) newly referred for chronic cough participated. One parent of each child completed the 27-item Parent Cough-Specific QOL questionnaire (PC-QOL), the generic child (Pediatric QOL Inventory 4.0 (PedsQL)) and parent (SF-12) QOL questionnaires, and two cough-related measures (visual analogue score and verbal category descriptive score) on two occasions separated by 2–3 weeks. Cough counts were also objectively measured on both occasions. Results: Internal consistency for both the domains and total PC-QOL at both test times was excellent (Cronbach alpha range 0.70–0.97). Evidence for repeatability and criterion validity was established, with significant correlations over time and significant relationships with the cough measures. The PC-QOL was sensitive to change across the test times, and these changes were significantly related to changes in cough measures (PC-QOL with: verbal category descriptive score, rs=−0.37, p=0.016; visual analogue score, rs=−0.47, p=0.003). Significant correlations of the difference scores for the social domain of the PC-QOL and the domain and total scores of the PedsQL were also noted (rs=0.46, p=0.034). Conclusion: The PC-QOL is a reliable and valid outcome measure that assesses QOL related to childhood cough at a given time point and measures changes in cough-specific QOL over time.
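
A minimal sketch of the kind of criterion-validity check reported above, correlating PC-QOL change scores with changes in a cough measure via Spearman's rank correlation (SciPy assumed); all values are invented, not study data.

from scipy.stats import spearmanr

# Invented change scores (visit 2 minus visit 1) for 8 children.
# Higher PC-QOL = better quality of life; higher VAS = worse cough,
# so a negative correlation is the expected direction.
pcqol_change = [12, 5, 20, -3, 8, 15, 0, 10]
vas_change = [-20, -5, -35, 10, -10, -25, 5, -15]

rs, p = spearmanr(pcqol_change, vas_change)
print(f"rs = {rs:.2f}, p = {p:.3f}")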

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The dose to the skin surface is an important factor for many radiotherapy treatment techniques. It is known that TPS-predicted surface doses can differ significantly from actual ICRP skin doses as defined at 70 μm depth. A number of methods have been implemented for the accurate determination of surface dose, including specific dosimeters such as TLDs and radiochromic film as well as Monte Carlo calculations. Stereotactic radiosurgery involves delivering very high doses per treatment fraction using small X-ray fields. To date, there has been limited data on surface doses for these very small field sizes. The purpose of this work is to evaluate surface doses by both measurement and Monte Carlo calculation for very small field sizes. Methods: All measurements were performed on a Novalis Tx linear accelerator, which has a 6 MV SRS X-ray beam mode that uses a special thin flattening filter. Beam collimation was achieved by circular cones with apertures giving field sizes ranging from 4 to 30 mm at the isocentre. The relative surface doses were measured using Gafchromic EBT3 film, which has its active layer at a depth similar to the ICRP skin dose depth. Monte Carlo calculations were performed using the BEAMnrc/EGSnrc Monte Carlo codes (V4 r225). The specifications of the linear accelerator, including the collimator, were provided by the manufacturer. Optimisation of the incident X-ray beam was achieved by an iterative adjustment of the energy, spatial distribution and radial spread of the incident electron beam striking the target. The energy cutoff parameters were PCUT = 0.01 MeV and ECUT = 0.700 MeV. Directional bremsstrahlung splitting was switched on for all BEAMnrc calculations. Relative surface doses were determined in a layer defined in a water phantom with the same thickness and depth as the active layer in the film. Results: Measured surface doses using the EBT3 film varied between 13 and 16 % for the different cones, with an uncertainty of 3 %. Monte Carlo calculated surface doses agreed with the measured doses to better than 2 % for all the treatment cones. Discussion and conclusions: This work has shown the consistency of surface dose measurements using EBT3 film with Monte Carlo predicted values, within the uncertainty of the measurements. As such, EBT3 film is recommended for in vivo surface dose measurements.
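
A minimal sketch of the measurement-versus-calculation comparison described above; all dose values are invented placeholders (the abstract reports only ranges), and the check simply flags differences beyond the quoted film uncertainty.

# Compare film-measured and Monte Carlo-calculated relative surface doses
# per cone, flagging any disagreement beyond the measurement uncertainty.
measured = {4: 13.2, 10: 14.1, 20: 15.0, 30: 15.8}    # cone diameter (mm) -> % dose
calculated = {4: 13.5, 10: 14.5, 20: 15.3, 30: 16.1}  # Monte Carlo values
film_uncertainty = 3.0  # percentage points, as quoted for the EBT3 film

for cone, meas in measured.items():
    diff = abs(meas - calculated[cone])
    status = "agrees" if diff <= film_uncertainty else "DISAGREES"
    print(f"{cone:>2} mm cone: measured {meas}%, MC {calculated[cone]}%, "
          f"|diff| = {diff:.1f} pts -> {status}")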

Relevance:

10.00%

Publisher:

Abstract:

Introduction: The consistency of measuring small field output factors is greatly increased by reporting the measured dosimetric field size of each factor, as opposed to simply stating the nominal field size [1], and therefore requires the measurement of cross-axis profiles in a water tank. However, this makes output factor measurements time consuming. This project establishes at which field sizes the accuracy of output factors is unaffected by the use of potentially inaccurate nominal field sizes, which we believe yields a practical working definition of a ‘small’ field. The physical components of the radiation beam that contribute to the rapid change in output factor at small field sizes are examined in detail. The physical interaction that dominates the rapid dose reduction is quantified, leading to a theoretical definition of a ‘small’ field. Methods: Current recommendations suggest that radiation collimation systems and isocentre-defining lasers should both be calibrated to permit a maximum positioning uncertainty of 1 mm [2]. The proposed practical definition of a small field is as follows: if the output factor changes by ±1.0 % given a change in either field size or detector position of up to ±1 mm, then the field should be considered small. Monte Carlo modelling was used to simulate output factors of a 6 MV photon beam for square fields with side lengths from 4.0 to 20.0 mm in 1.0 mm increments. The dose was scored in a 0.5 mm wide and 2.0 mm deep cylindrical volume of water within a cubic water phantom, at a depth of 5 cm and an SSD of 95 cm. The maximum difference due to a collimator error of ±1 mm was found by comparing the output factors of adjacent field sizes. The output factor simulations were repeated 1 mm off-axis to quantify the effect of detector misalignment. Further simulations separated the total output factor into a collimator scatter factor and a phantom scatter factor. The collimator scatter factor was further separated into primary source occlusion effects and ‘traditional’ effects (a combination of flattening filter and jaw scatter, etc.). The phantom scatter was separated into photon scatter and electronic disequilibrium. Each of these factors was plotted as a function of field size in order to quantify how each contributed to the change in output factor at small field sizes. Results: Under our practical definition, field sizes of 15 mm or less were characterised as ‘small’. The change in field size had a greater effect than detector misalignment. For field sizes of 12 mm or less, electronic disequilibrium was found to cause the largest change in dose to the central axis (d = 5 cm). Source occlusion also caused a large change in output factor for field sizes less than 8 mm. Discussion and conclusions: The measurement of cross-axis profiles is only required for output factor measurements at field sizes of 15 mm or less (for a 6 MV beam on a Varian iX linear accelerator). This is expected to depend on linear accelerator spot size and photon energy. While some electronic disequilibrium was shown to occur at field sizes as large as 30 mm (the ‘traditional’ definition of a small field [3]), it does not cause a greater change than photon scatter until a field size of 12 mm, at which point it becomes by far the dominant effect.
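
The proposed practical definition translates directly into a threshold test. A minimal sketch follows, using invented output factors (not the study's simulated values) chosen so that the reported 15 mm threshold is reproduced.

# A field is 'small' if a +/-1 mm change in field size shifts the output
# factor by more than 1.0 % of the field's own output factor.
output_factors = {  # square field side (mm) -> output factor (invented values)
    4: 0.600, 5: 0.650, 6: 0.695, 7: 0.733, 8: 0.765, 9: 0.792,
    10: 0.814, 11: 0.832, 12: 0.847, 13: 0.859, 14: 0.869, 15: 0.878,
    16: 0.884, 17: 0.889, 18: 0.893, 19: 0.896, 20: 0.898,
}

def is_small(size_mm, tolerance_percent=1.0):
    """True if a 1 mm collimation error changes the output factor by
    more than `tolerance_percent` of this field's output factor."""
    of = output_factors[size_mm]
    neighbours = [output_factors.get(size_mm + d) for d in (-1, +1)]
    return any(n is not None and abs(n - of) / of * 100 > tolerance_percent
               for n in neighbours)

small = [s for s in sorted(output_factors) if is_small(s)]
print("fields behaving as 'small':", small)  # -> 4..15 mm with these values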

Relevance:

10.00%

Publisher:

Abstract:

Aim. This paper is a report of the development and validation of a new job performance scale based on an established job performance model. Background. Previous measures of nursing quality are atheoretical and fail to incorporate the complete range of behaviours performed. Thus, an up-to-date measure of job performance is required for assessing nursing quality. Methods. Test construction involved systematic generation of test items using focus groups, a literature review, and an expert review of test items. A pilot study was conducted to determine the multidimensional nature of the taxonomy and its psychometric properties. All data were collected in 2005. Findings. The final version of the nursing performance taxonomy included 41 behaviours across eight dimensions of job performance. Results from preliminary psychometric investigations suggest that the nursing performance scale has good internal consistency, good convergent validity and good criterion validity. Conclusion. The findings give preliminary support for the new job performance scale as a reliable and valid tool for assessing nursing quality. However, further research using a larger sample and nurses from a broader geographical region is required to cross-validate the measure. The scale may be used to inform hospital managers about the quality of nursing care within units and to guide future research in the area.

Relevance:

10.00%

Publisher:

Abstract:

Smartphone technology provides free or inexpensive access to mental health and wellbeing resources, and the use of mobile applications for these purposes has increased significantly in recent years. Yet there is currently no app quality assessment alternative to the popular ‘star’ ratings, which are often unreliable. This presentation describes the development of the Mobile Application Rating Scale (MARS), a new measure for classifying and rating the quality of mobile applications. A review of existing literature on app and web quality identified 25 published papers, conference proceedings, and online resources (published since 1999), which together contained 372 explicit quality criteria. Qualitative analysis identified five broad categories of app quality rating criteria: engagement, functionality, aesthetics, information quality, and overall satisfaction, which were refined into the 23-item MARS. Independent ratings of 50 randomly selected mental health and wellbeing mobile apps indicated that the MARS had excellent internal consistency (α = 0.92) and inter-rater reliability (ICC = 0.85). The MARS provides practitioners and researchers with a simple, objective and reliable tool for assessing mobile app quality, and provides mHealth professionals with a checklist for the design and development of high-quality apps.
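
For reference, a minimal sketch of one common inter-rater reliability statistic, ICC(2,1): two-way random effects, absolute agreement, single rater (Shrout and Fleiss, 1979). The abstract does not state which ICC form was used, and the ratings below are invented.

import numpy as np

def icc_2_1(ratings):
    """ICC(2,1) for an (n_targets x k_raters) ratings matrix."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between targets
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                         # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented example: 5 apps rated by 2 raters on overall quality (1-5 scale).
ratings = [
    [4.2, 4.0],
    [2.8, 3.1],
    [3.9, 4.1],
    [1.8, 2.0],
    [4.8, 4.6],
]
print(round(icc_2_1(ratings), 2))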

Relevance:

10.00%

Publisher:

Abstract:

There is considerable interest internationally in developing product libraries to support the use of BIM. Product library initiatives are driven by national bodies, manufacturers and private companies who see their potential. A major issue with the production and distribution of product information for BIM is that separate library objects need to be produced for each of the different software systems that will use the library. This increases the cost of populating product libraries and also makes it harder to maintain consistency between the representations for the different software systems over time. This paper describes a project which uses “software transformation” technology from the field of software engineering to support the definition of a single generic representation of a product, which can then be automatically converted to the format required by the receiving software. The paper covers the current state of implementation of the product library, the technology underlying the transformations for the currently supported software, and the business model for creating a national library in Australia. This is placed within the context of other current product library systems to highlight the differences. The responsibilities of the various actors involved in supporting the product library are also discussed.
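
A minimal sketch of the single-generic-representation idea: one generic product record plus a registry of transformations, so only the generic form has to be maintained. The schema, target tools and field names are invented for illustration; the project's actual transformation technology is not shown.

# One generic product record, converted on demand to per-tool formats by
# registered transformations.
from typing import Callable, Dict

generic_product = {
    "name": "FireDoor-60",
    "category": "Door",
    "width_mm": 920,
    "height_mm": 2040,
    "fire_rating_min": 60,
}

TRANSFORMS: Dict[str, Callable[[dict], dict]] = {}

def transform(target):
    """Register a converter from the generic representation to `target`."""
    def register(fn):
        TRANSFORMS[target] = fn
        return fn
    return register

@transform("tool_a")  # hypothetical BIM tool expecting imperial units
def to_tool_a(p):
    return {"Name": p["name"],
            "WidthInches": round(p["width_mm"] / 25.4, 2),
            "HeightInches": round(p["height_mm"] / 25.4, 2),
            "FireRating": f"{p['fire_rating_min']} min"}

@transform("tool_b")  # hypothetical tool keeping metric fields
def to_tool_b(p):
    return {"label": p["name"],
            "size_mm": (p["width_mm"], p["height_mm"]),
            "class": p["category"].upper()}

for target, fn in TRANSFORMS.items():
    print(target, "->", fn(generic_product))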

Relevance:

10.00%

Publisher:

Abstract:

This paper provides a contextual reflection for understanding best-practice teaching of first year design students. The outcome (job) focussed approach to higher education has led to some unanticipated collateral damage for students and, in the case we discuss, has altered students’ expectations of course delivery, with specific implications and challenges for design educators. This tendency in educational delivery systems is further compounded by the distinct characteristics of Generation Y students within a classroom context. It is our belief that foundational design education must focus more on process than on outcomes. Through this research with first year design students we analyse and raise questions about the curriculum of a Design and Creative Thinking course, in which students not only benefit from learning the theories and processes of design thinking, conceptualisation and creativity, but are also encouraged to see these as essential tools for their education and development as designers. This study considers the challenges within a design environment; specifically, we address the need for process-based learning in contrast to the outcome-focused approach taken by most students. With this approach, students simultaneously learn to be designers and rethink their approach to “doing design”.

Relevance:

10.00%

Publisher:

Abstract:

Creative occupations exist across the entire economy. The creative worker’s habitus cannot be discovered by looking only in film studios, games companies or artists’ garrets. Work practices that evolved through the traditions of the creative and performing arts are now deployed to create new services and products across all sectors, to develop process innovations, and to change how these are distributed. Yet the bulk of academic study of creative work (both functionalist and critical), as well as the content of higher/further professional education programs and everyday understandings of creative workers, focuses on one subset of the Creative Industries: those involved in the production of cultural goods or services (film, television, music etc.) for consumption by the general public. Further, the bulk of existing academic work focuses on those creative workers employed in cultural production industries. However, as recent work has shown, this focus misses both the large (and increasing) number of creative workers embedded in industries beyond the core Creative Industries (for example, manufacturing, banking, mining) and those creative workers and firms that supply services to business as well as to the general public, such as architects, technical writers, and graphic designers (see Cunningham 2013; Potts and Cunningham 2008; Potts, Cunningham, Hartley and Ormerod 2008). This book focuses on this subset of very important, yet under-recognized creative workers: embedded creative workers and providers of creative services into other sectors of the economy, as indicated in the following taxonomy (Figure 1.1), which juxtaposes occupation and industry sector...

Relevance:

10.00%

Publisher:

Abstract:

There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation, through compliance checking and process certification, to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and assume different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. On this basis, we identify specific research questions that guide the design of a framework for model alignment.