969 results for Composite Measurement Scales
Abstract:
Since users have become the focus of product/service design in the last decade, the term User eXperience (UX) has been used frequently in the field of Human-Computer Interaction (HCI). Research on UX facilitates a better understanding of the various aspects of the user's interaction with a product or service. Mobile video, as a new and promising service and research field, has attracted great attention. Because UX is significant to the success of mobile video (Jordan, 2002), many researchers have focused on this area, examining users' expectations, motivations, requirements, and usage contexts. As a result, many influencing factors have been identified (Buchinger, Kriglstein, Brandt & Hlavacs, 2011; Buchinger, Kriglstein & Hlavacs, 2009). However, a general framework for structuring this large number of factors for a specific mobile video service is still lacking. To measure the user experience of multimedia services such as mobile video, quality of experience (QoE) has recently become a prominent concept. In contrast to the traditional concept of quality of service (QoS), QoE not only involves objectively measuring the delivered service but also takes into account the user's needs and desires when using the service, emphasizing the user's overall acceptance of the service. Many QoE metrics can estimate the user-perceived quality or acceptability of mobile video, but they may not be accurate enough for predicting overall UX because of the complexity of UX. Only a few QoE frameworks address the broader aspects of UX for mobile multimedia applications, and these still need to be transformed into practical measures. The challenge of optimizing UX remains one of adapting to resource constraints (e.g., network conditions, mobile device capabilities, and heterogeneous usage contexts) while meeting complex user requirements (e.g., usage purposes and personal preferences). In this chapter, we investigate the existing important UX frameworks, compare their similarities and discuss some important features that fit the mobile video service. Based on previous research, we propose a simple UX framework for mobile video applications by mapping a variety of influencing factors of UX onto a typical mobile video delivery system. Each component and its factors are explored through comprehensive literature reviews. The proposed framework may benefit user-centred design of mobile video by taking a complete account of UX influences, and may help improve mobile video service quality by adjusting the values of certain factors to produce a positive user experience. It may also facilitate related research by locating important issues to study, clarifying research scopes, and setting up proper study procedures. We then review a great deal of research on UX measurement, including QoE metrics and QoE frameworks for mobile multimedia. Finally, we discuss how to achieve an optimal quality of user experience by focusing on the various aspects of UX of mobile video. In the conclusion, we suggest some open issues for future study.
Abstract:
Purpose: The Cobb technique is the universally accepted method for measuring the severity of spinal deformities. Traditionally, Cobb angles have been measured using protractor and pencil on hardcopy radiographic films. The new generation of mobile phones makes accurate angle measurement possible using an integrated accelerometer, providing a potentially useful clinical tool for assessing Cobb angles. The purpose of this study was to compare Cobb angle measurements performed using an Apple iPhone and a traditional protractor in a series of twenty Adolescent Idiopathic Scoliosis patients. Methods: Seven observers measured major Cobb angles on twenty pre-operative postero-anterior radiographs of Adolescent Idiopathic Scoliosis patients with both a standard protractor and an Apple iPhone. Five of the observers repeated the measurements at least a week after the original measurements. Results: The mean absolute difference between pairs of iPhone/protractor measurements was 2.1°, with a small (1°) bias toward lower Cobb angles with the iPhone. 95% confidence intervals for intra-observer variability were ±3.3° for the protractor and ±3.9° for the iPhone. 95% confidence intervals for inter-observer variability were ±8.3° for the iPhone and ±7.1° for the protractor. Both of these confidence intervals were within the range of previously published Cobb measurement studies. Conclusions: We conclude that the iPhone is an equivalent Cobb measurement tool to the manual protractor, and measurement times are about 15% less. The widespread availability of inclinometer-equipped mobile phones and the ability to store measurements in later versions of the angle measurement software may make these new technologies attractive for clinical measurement applications.
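As an illustration of the agreement statistics reported above, a minimal sketch of mean absolute difference, bias, and a ±1.96 s.d. repeatability interval; the numbers below are illustrative placeholders, not the study data.

```python
# Hypothetical sketch: agreement statistics for paired Cobb angle measurements.
# iphone_deg and protractor_deg are illustrative values, not the study data.
import numpy as np

iphone_deg = np.array([52.0, 61.5, 48.0, 70.5, 55.0])       # assumed example values (deg)
protractor_deg = np.array([53.5, 62.0, 50.5, 71.0, 57.5])   # assumed example values (deg)

diff = iphone_deg - protractor_deg
bias = diff.mean()                 # mean signed difference (bias)
mad = np.abs(diff).mean()          # mean absolute difference
print(f"bias = {bias:.1f} deg, mean absolute difference = {mad:.1f} deg")

# 95% repeatability interval for test-retest differences by one observer,
# using the usual +/- 1.96 * s.d. convention.
retest_diff = np.array([1.0, -2.0, 0.5, 1.5, -1.0])          # assumed example values (deg)
ci95 = 1.96 * retest_diff.std(ddof=1)
print(f"intra-observer 95% interval: +/- {ci95:.1f} deg")
```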
Study of the effectiveness of outrigger system for high-rise composite buildings for cyclonic region
Abstract:
The demand for taller structures is becoming imperative almost everywhere in the world, in addition to the challenges of material and labor costs, project timelines, etc. This paper presents a study undertaken in view of the challenging nature of high-rise construction, for which there are no generic rules for deflection minimization and frequency control. The effects of cyclonic wind and of the provision of outriggers on 28-storey, 42-storey and 57-storey buildings are examined, and conclusions are drawn that would pave the way for researchers to conduct further study in this particular area of civil engineering. The results show that plan dimensions have a vital impact on achievable structural heights: increasing the height while keeping the plan dimensions the same leads to a reduction in lateral rigidity. To achieve the required stiffness, larger bracing sizes as well as additional lateral load-resisting systems such as belt trusses and outriggers are required.
Abstract:
Purpose: To assess the accuracy of intraocular pressure (IOP) measurements using rebound tonometry over disposable hydrogel (etafilcon A) and silicone hydrogel (senofilcon A) contact lenses (CLs) of different powers. Methods: The experimental group comprised 36 subjects (19 male, 17 female). IOP measurements were taken on the subjects' right eyes in random order using a rebound tonometer (ICare). The CLs had powers of +2.00 D, −2.00 D and −6.00 D. Six measurements were taken over each contact lens, as well as before and after the CLs had been worn. Results: A good correlation was found between IOP measurements with and without CLs (all r ≥ 0.80; p < 0.05). Bland-Altman plots did not show any significant trend in the difference between IOP readings with and without CLs as a function of IOP value. A two-way ANOVA revealed significant effects of material and power (p < 0.01) but no interaction. All comparisons between measurements without CLs and with hydrogel CLs were significant (p < 0.01); the comparisons with silicone hydrogel CLs were not significant. Conclusions: Rebound tonometry can be reliably performed over silicone hydrogel CLs. With hydrogel CLs, the measurements were lower than those without CLs; however, although these differences were statistically significant, their clinical significance was minimal.
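For readers unfamiliar with the Bland-Altman approach mentioned above, a minimal sketch with illustrative IOP values (not the study data):

```python
# Hypothetical sketch of a Bland-Altman comparison of IOP with and without a
# contact lens. The values below are illustrative, not the study measurements.
import numpy as np
import matplotlib.pyplot as plt

iop_no_cl = np.array([14.2, 16.1, 12.8, 18.4, 15.0, 13.6])    # mmHg, assumed
iop_with_cl = np.array([13.5, 15.8, 12.1, 17.9, 14.2, 13.0])  # mmHg, assumed

mean_pair = (iop_no_cl + iop_with_cl) / 2
diff_pair = iop_with_cl - iop_no_cl
bias = diff_pair.mean()
loa = 1.96 * diff_pair.std(ddof=1)       # 95% limits of agreement

plt.scatter(mean_pair, diff_pair)
plt.axhline(bias, linestyle="--")
plt.axhline(bias + loa, linestyle=":")
plt.axhline(bias - loa, linestyle=":")
plt.xlabel("Mean IOP (mmHg)")
plt.ylabel("Difference, with CL minus without CL (mmHg)")
plt.title("Bland-Altman sketch (illustrative data)")
plt.show()
```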
Abstract:
This paper reports on a mathematics project conducted with six Torres Strait Islander schools and communities by the research team at the YuMi Deadly Centre at QUT. Data were collected from a small focus group of six teachers and two teacher aides. We investigated how measurement is taught and learned by students, their teachers and teacher aides in the community schools. A key focus of the project was that the teaching and learning of measurement be contextualised to the students' culture, community and home languages. A significant finding from the project was that the teachers had differing levels of knowledge and understanding about how to contextualise measurement to support student learning. For example, an Indigenous teacher identified that mathematics and the environment are relational: they are not discrete and isolated from one another but mesh together, affording articulation and interchange between mathematics and Torres Strait Islander culture.
Abstract:
Purpose. To devise and validate artist-rendered grading scales for contact lens complications. Methods. Each of eight tissue complications of contact lens wear (listed under 'Results') was painted by a skilled ophthalmic artist (Terry R. Tarrant) in five grades of severity: 0 (normal), 1 (trace), 2 (mild), 3 (moderate) and 4 (severe). A representative slit lamp photograph of a tissue response for each of the eight complications was shown to 404 contact lens practitioners who had never before used clinical grading scales. The practitioners were asked to grade each tissue response to the nearest 0.1 grade unit by interpolation. Results. The standard deviation (±s.d.) of the 404 responses for each tissue complication was: corneal staining 0.5; endothelial polymegethism 0.7; epithelial microcysts 0.5; endothelial blebs 0.4; stromal edema; conjunctival hyperemia 0.4; stromal neovascularization 0.4; papillary conjunctivitis 0.5. The frequency distributions and best-fit normal curves were also plotted. The precision of grading (s.d. × 2) ranged from 0.8 to 1.4, with a mean precision of 1.0. Conclusions. Grading scales provide contact lens practitioners with a method of quantifying the severity of adverse tissue responses to contact lens wear. It is noteworthy that the statistically verified precision of grading (1.0 scale unit) concurs precisely with the essential design feature of the grading scales, namely that each grading step of 1.0 corresponds to a clinically significant difference in severity. Thus, as a general rule, a difference or change in grade of > 1.0 can be taken to be both clinically and statistically significant when using these grading scales. Trained observers are likely to achieve even greater grading precision. Supported by Hydron Limited.
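A minimal sketch of how the tabulated precision relates to the observer standard deviation; the grades below are illustrative, not the 404 practitioner responses.

```python
# Minimal sketch: grading precision defined as 2 x the standard deviation of
# interpolated grade estimates (illustrative values, not the study responses).
import numpy as np

grades = np.array([2.1, 2.4, 1.8, 2.6, 2.0, 2.3])  # assumed 0.1-unit interpolated grades
sd = grades.std(ddof=1)
precision = 2 * sd                                  # "precision (s.d. x 2)" as in the abstract
print(f"s.d. = {sd:.1f}, precision = {precision:.1f} grading units")
```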
Abstract:
This paper discusses the research carried out towards the development of a hybrid-composite floor plate system (HCFPS) using polyurethane (PU), glass-fibre reinforced cement (GRC) and thin perforated steel laminate. The HCFPS is configured in such a way that the positive inherent properties of the individual component materials are combined to offset any weaknesses and achieve optimum performance. Finite element modelling of the HCFPS with ABAQUS 6.9-1, comparative studies of the HCFPS against the steel deck composite system, and the experimental investigations to be carried out are briefly described in the paper.
Abstract:
Performance based planning is a form of planning regulation that is not well understood, and the theoretical advantages of this type of planning are rarely achieved in practice. Normatively, this type of regulation relies on quantifiable, technically based performance standards designed to manage the effects of development: the standards provide certainty about the level of performance required, while the means of achieving it remain flexible. Few empirical studies have attempted to examine how performance based planning has been conceptualised and implemented in practice. The existing literature is predominantly anecdotal and consultant based (Baker et al. 2006) and has not sought to examine quantitatively how land use has been managed or to determine how context influences implementation. The Integrated Planning Act 1997 (IPA) operated as Queensland's principal planning legislation between March 1998 and December 2009. The IPA prevented local governments from prohibiting development or use, and the term 'zone' was absent from the legislation. While the IPA did not use the term performance based planning, the system is widely considered to be performance based in practice (e.g. Baker et al. 2006; Steele 2009a, 2009b). However, the degree to which the IPA and the planning system in Queensland are performance based is debated (e.g. Yearbury 1998; England 2004). Four research questions guided the research framework, using Queensland as the case study. The questions sought to: determine whether there is a common understanding of performance based planning; identify how performance based planning was expressed under the IPA; understand how performance based planning was implemented in plans; and explore the experiences of participants in the planning system. The research developed a performance adoption spectrum, which describes how performance based planning is implemented, ranging between pure and hybrid interpretations. An ex-post evaluation of seventeen IPA plans sought to determine plan performativity within the conceptual spectrum. Land use was examined from the procedural dimension of performance (Assessment Tables) and the substantive dimension of performance (Codes). A documentary analysis and forty-one interviews supplemented the research. The analytical framework considered how context influenced performance based planning, including whether: the location of the local government affected land use management techniques; temporal variation in implementation existed; plan-making guidelines affected implementation; different perceptions of the concept existed; and this type of planning applies to a range of spatial scales. Outcomes were viewed as the medium for determining the acceptability of development in Queensland, a significant departure from the pure approaches found in the United States. Interviews highlighted the absence of plan-making direction in the IPA, which contributed to confusion about the intended direction of the planning system and to the myth that the IPA would guarantee a performance based system. A hybridised form of performance based planning evolved in Queensland that was dependent on prescriptive land use zones and specification of land use type, with some local governments going to extreme lengths to discourage certain activities in a predetermined manner. Context had varying degrees of influence on plan-making methods.
Decision-making was found to be inconsistent, and the system created a range of unforeseen consequences, including difficulties associated with land valuation and increased development speculation; the role of planners in court was also found to be less critical than in the previous planning system.
Abstract:
The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010–2011 to give a finite measure of the time required for the transient solution of a reaction–diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb's initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection–diffusion–reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
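For readers unfamiliar with MAT, a minimal sketch of the usual construction, with notation assumed here rather than taken from the paper:

```latex
% Sketch of the mean action time (MAT) construction (notation assumed):
% c(x,t) is the transient solution, c_0(x) the initial condition,
% and c_\infty(x) the steady-state solution.
\[
F(x,t) \;=\; 1 \;-\; \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)},
\qquad F(x,0) = 0, \quad \lim_{t\to\infty} F(x,t) = 1,
\]
\[
T(x) \;=\; \int_0^\infty t\,\frac{\partial F(x,t)}{\partial t}\,\mathrm{d}t
\;=\; \int_0^\infty \bigl[\,1 - F(x,t)\,\bigr]\,\mathrm{d}t .
\]
% F behaves like a cumulative distribution in time, so T(x) is its mean,
% giving a finite timescale for the transient to approach the steady state.
```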
Abstract:
Commonwealth Scientific and Industrial Research Organization (CSIRO) has recently conducted a technology demonstration of a novel fixed wireless broadband access system in rural Australia. The system is based on multi-user multiple-input multiple-output orthogonal frequency division multiplexing (MU-MIMO-OFDM). It demonstrated an uplink of six simultaneous users at distances ranging from 10 m to 8.5 km from a central tower, achieving a spectral efficiency of 20 bit/s/Hz. This paper reports on the analysis of channel capacity and bit error probability simulations based on the measured MU-MIMO-OFDM channels obtained during the demonstration, and their comparison with results based on channels simulated by a novel geometric-optics-based channel model suitable for MU-MIMO-OFDM in rural areas. Despite its simplicity, the model was found to predict channel capacity and bit error probability accurately for a typical MU-MIMO-OFDM deployment scenario.
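The capacity analysis itself is not reproduced here, but as a rough illustration of the standard equal-power MIMO capacity formula that such analyses typically build on, a hedged sketch follows; the i.i.d. Rayleigh placeholder channel and antenna counts are assumptions, not the measured rural channels.

```python
# Hedged sketch: MIMO channel capacity C = log2 det(I + (SNR/Nt) * H H^H)
# under an equal-power assumption, averaged over random placeholder channels.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_rx = 6, 8        # assumed: 6 single-antenna users, 8 receive antennas
snr_db = 20.0
snr = 10 ** (snr_db / 10)

def capacity_bits_per_hz(H, snr):
    """Shannon capacity (bit/s/Hz) of channel H with power split equally across transmitters."""
    n_tx = H.shape[1]
    gram = np.eye(H.shape[0]) + (snr / n_tx) * (H @ H.conj().T)
    det_val = np.linalg.det(gram).real      # Hermitian positive definite: det is real
    return float(np.log2(det_val))

# Average over i.i.d. Rayleigh realizations (placeholder for measured channels).
caps = []
for _ in range(1000):
    H = (rng.standard_normal((n_rx, n_users))
         + 1j * rng.standard_normal((n_rx, n_users))) / np.sqrt(2)
    caps.append(capacity_bits_per_hz(H, snr))
print(f"mean capacity ~ {np.mean(caps):.1f} bit/s/Hz")
```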
Abstract:
Study Design. A sheep study designed to compare the accuracy of static radiographs, dynamic radiographs, and computed tomographic (CT) scans for the assessment of thoracolumbar facet joint fusion as determined by micro-CT scanning. Objective. To determine the accuracy and reliability of conventional imaging techniques in identifying the status of thoracolumbar (T13-L1) facet joint fusion in a sheep model. Summary of Background Data. Plain radiographs are commonly used to determine the integrity of surgical arthrodesis of the thoracolumbar spine. Many previous studies of fusion success have relied solely on postoperative assessment of plain radiographs, a technique lacking sensitivity for pseudarthrosis. CT may be a more reliable technique, but is less well characterized. Methods. Eleven adult sheep were randomized to either attempted arthrodesis using autogenous bone graft and internal fixation (n = 3) or intentional pseudarthrosis (IP) using oxidized cellulose and internal fixation (n = 8). After 6 months, facet joint fusion was assessed by independent observers using (1) plain static radiography alone, (2) additional dynamic radiographs, and (3) additional reconstructed spiral CT imaging. These assessments were correlated with high-resolution micro-CT imaging to predict the utility of the conventional imaging techniques in the estimation of fusion success. Results. The capacity of plain radiography alone to correctly predict fusion or pseudarthrosis was 43%, and this was not improved by adding dynamic radiography (also 43% accuracy). Adding assessment by reformatted CT imaging to the plain radiography techniques increased the capacity to predict fusion outcome correctly to 86%. The sensitivity, specificity, and accuracy of static radiography were 0.33, 0.55, and 0.43, respectively; those of dynamic radiography were 0.46, 0.40, and 0.43, respectively; and those of radiography plus CT were 0.88, 0.85, and 0.86, respectively. Conclusion. CT-based evaluation correlated most closely with high-resolution micro-CT imaging. Neither plain static nor dynamic radiographs were able to predict fusion outcome accurately. © 2012 Lippincott Williams & Wilkins.
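As a reminder of how the reported sensitivity, specificity and accuracy follow from a 2x2 comparison against the micro-CT reference, a minimal sketch with hypothetical counts (not the study's data):

```python
# Minimal sketch: sensitivity, specificity and accuracy of an imaging test
# against a reference standard. The counts below are illustrative only.
def diagnostic_stats(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Example: hypothetical counts of fused / not-fused facet joints vs micro-CT.
sens, spec, acc = diagnostic_stats(tp=7, fp=1, tn=6, fn=1)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, accuracy={acc:.2f}")
```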
Abstract:
Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated through quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
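The paper's specific risk model is not reproduced here; as one generic example of a proximity-based measure often used to flag critical vessel interactions, a hedged sketch of the distance and time at the closest point of approach (DCPA/TCPA) between two vessels:

```python
# Hedged sketch of a generic navigational-safety proxy, not the paper's model:
# DCPA/TCPA between two vessels with constant velocity (positions in m, velocities in m/s).
import numpy as np

def cpa(p_own, v_own, p_target, v_target):
    """Return (DCPA, TCPA) assuming both vessels hold course and speed."""
    r = np.asarray(p_target, float) - np.asarray(p_own, float)   # relative position
    v = np.asarray(v_target, float) - np.asarray(v_own, float)   # relative velocity
    v2 = v @ v
    tcpa = 0.0 if v2 == 0 else max(0.0, -(r @ v) / v2)           # time of closest approach (s)
    dcpa = np.linalg.norm(r + v * tcpa)                          # separation at that time (m)
    return dcpa, tcpa

dcpa, tcpa = cpa(p_own=(0, 0), v_own=(10, 0), p_target=(2000, 500), v_target=(-8, 0))
print(f"DCPA = {dcpa:.0f} m, TCPA = {tcpa:.0f} s")
```

Small DCPA reached within a short TCPA is one simple way to define an "interaction" worth counting when collision records are too sparse to analyse directly.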
Abstract:
Enterprise architecture (EA) management has become an intensively discussed approach to manage enterprise transformations. Despite the popularity and potential of EA, both researchers and practitioners lament a lack of knowledge about the realization of benefits from EA. To determine the benefits from EA, we explore the various dimensions of EA benefit realization and report on the development of a validated and robust measurement instrument. In this paper, we test the reliability and construct validity of the EA benefit realization model (EABRM), which we have designed based on the DeLone & McLean IS success model and findings from exploratory interviews. A confirmatory factor analysis confirms the existence of an impact of five distinct and individually important dimensions on the benefits derived from EA: EA artefact quality, EA infrastructure quality, EA service quality, EA culture, and EA use. The analysis presented in this paper shows that the EA benefit realization model is an instrument that demonstrates strong reliability and validity.
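The full confirmatory factor analysis is not reproduced here; as a minimal illustration of the kind of reliability check applied to such an instrument, a sketch of Cronbach's alpha for one hypothetical benefit dimension, with assumed Likert scores:

```python
# Minimal sketch: internal-consistency reliability (Cronbach's alpha) for the
# items of one hypothetical EA benefit dimension. Scores are illustrative; the
# paper's actual instrument and CFA are not reproduced here.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])  # assumed Likert scores
print(f"alpha = {cronbach_alpha(scores):.2f}")
```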
Abstract:
Qualitative and quantitative measurements of biomass components dissolved in the phosphonium ionic liquids (ILs), trihexyltetradecylphosphonium chloride ([P66614]Cl) and tributylmethylphosphonium methylsulphate ([P4441]MeSO4), are obtained using attenuated total reflectance-FTIR. Absorption bands related to cellulose, hemicelluloses, and lignin dissolution monitored in situ in biomass-IL mixtures indicate lignin dissolution in both ILs and some holocellulose dissolution in the hydrophilic [P4441]MeSO4. The kinetics of lignin dissolution reported here indicate that while dissolution in the hydrophobic IL [P66614]Cl appears to follow an accepted mechanism of acid-catalyzed β-aryl ether cleavage, dissolution in the hydrophilic IL [P4441]MeSO4 does not appear to follow this mechanism and may not be followed by condensation reactions (initiated by reactive ketones). The measurement of lignin dissolution in phosphonium ILs based on absorbance at 1510 cm−1 has demonstrated utility. When coupled with the gravimetric Klason lignin method, ATR-FTIR study of reaction mixtures can lead to a better understanding of the delignification process. © 2012 Copyright Taylor and Francis Group, LLC.
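The reported kinetics are not reproduced here; as a generic illustration of how a dissolution rate constant can be extracted from an absorbance band such as the 1510 cm−1 lignin band, a hedged sketch with assumed values:

```python
# Hedged sketch: fitting a first-order dissolution curve to ATR-FTIR absorbance.
# The times and absorbances are illustrative, not the reported measurements.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, a_max, k):
    """Absorbance rising to a plateau a_max with rate constant k (1/h)."""
    return a_max * (1 - np.exp(-k * t))

t_hours = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])           # assumed sampling times
absorbance = np.array([0.00, 0.12, 0.21, 0.33, 0.42, 0.47])  # assumed band intensities (a.u.)

(a_max, k), _ = curve_fit(first_order, t_hours, absorbance, p0=(0.5, 0.5))
print(f"fitted plateau = {a_max:.2f} a.u., rate constant = {k:.2f} 1/h")
```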
Abstract:
Background: The accurate evaluation of physical activity levels amongst youth is critical for quantifying physical activity behaviors and evaluating the effect of physical activity interventions. The purpose of this review is to evaluate contemporary approaches to physical activity evaluation amongst youth. Data sources: The literature from a range of sources was reviewed and synthesized to provide an overview of contemporary approaches for measuring youth physical activity. Results: Five broad categories are described: self-report, instrumental movement detection, biological approaches, direct observation, and combined methods. Emerging technologies and priorities for future research are also identified. Conclusions: There will always be a trade-off between accuracy and available resources when choosing the best approach for measuring physical activity amongst youth. Unfortunately, cost and logistical challenges may prohibit the use of "gold standard" physical activity measurement approaches such as doubly labelled water. Other objective methods such as heart rate monitoring, accelerometry, pedometry, indirect calorimetry, or a combination of measures have the potential to better capture the duration and intensity of physical activity, while self-reported measures are useful for capturing the type and context of activity.