987 results for Measurement degree
Abstract:
Drawing on the literature on competency measurement, on behaviours observed empirically through the analysis of successful entrepreneurs, and on the theory of experts in the field of entrepreneurship, a competency-measurement tool was developed for entrepreneurs (company founders) in the city of Bogotá. The tool will be used to assess entrepreneurs who founded companies between 2002 and 2005 and whose annual sales grew fastest during the last half of the past decade. The companies belong to the industrial sector. An initial review of competencies was based on Martha Alles's trilogy of books: Diccionario de Competencias, Diccionario de Comportamientos, and Diccionario de Preguntas. The trilogy provides the definition of the competencies and the methodology for measuring them. This was followed by a biographical review of two leading Bogotá entrepreneurs, either born in Bogotá or resident there long enough to be considered bogotanos, from which specific behaviours were extracted. Drawing on Amar Bhide's authority on the subject of entrepreneurship, the behaviours he considers essential to every entrepreneur were also reviewed. Finally, a general competency model was built from the competencies found to be most developed among the surveyed and interviewed entrepreneurs.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
The primary aims of scoliosis surgery are to halt the progression of the deformity and to reduce its severity (cosmesis). Currently, deformity correction is measured in terms of posterior parameters (Cobb angles and rib hump), even though the cosmetic concern for most patients is anterior chest wall deformity. In this study, we propose a new measure for assessing anterior chest wall deformity and examine the correlation between rib hump and the new measure. Twenty-two sets of CT scans were retrieved from the QUT/Mater Paediatric Spinal Research Database. The ImageJ software (NIH) was used to manipulate formatted CT scans into 3-dimensional anterior chest wall reconstructions. A ‘chest wall angle’ was then measured in relation to the first sacral vertebral body. The chest wall angle was found to be a reliable tool in the analysis of chest wall deformity. No correlation was found between the new measure and rib hump angle. Since rib hump has been shown to correlate with vertebral rotation on CT, this suggests that there may be no correlation between anterior and posterior deformity measures. While most surgical procedures adequately address the coronal imbalance and posterior rib hump elements of scoliosis, they do not reliably alter the anterior chest wall shape. This implies that anterior chest wall deformity is to a large degree an intrinsic deformity, not directly related to vertebral rotation.
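As a rough illustration of a landmark-based angular measure of this kind, the sketch below computes the angle subtended at a reference point (such as the first sacral vertebral body) by two CT landmarks. The function name and coordinates are hypothetical; the study's exact landmark protocol for the 'chest wall angle' is not given in the abstract.

```python
import numpy as np

def angle_at_reference(p1, p2, ref):
    """Angle (degrees) subtended at a reference point by two landmarks.

    p1, p2, ref: 3-element coordinate arrays from CT (e.g. two chest
    wall landmarks and the first sacral vertebral body). Illustrative
    only; the study's landmark definitions are assumed, not reported.
    """
    v1 = np.asarray(p1, dtype=float) - np.asarray(ref, dtype=float)
    v2 = np.asarray(p2, dtype=float) - np.asarray(ref, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example: two hypothetical chest wall landmarks measured relative to S1.
print(angle_at_reference([12.0, 45.0, 3.0], [-14.0, 44.0, 2.0], [0.0, 0.0, 0.0]))
```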
Abstract:
The measurement of Cobb angles from radiographs is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of an iPhone (Apple Inc) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. Between November 2008 and December 2008, 20 patients were selected at random from the Paediatric Spine Research Group's database. A power calculation indicated that with n=240 measurements the study had a 96% chance of detecting a 5-degree difference between groups. All patients had idiopathic scoliosis with a range of curve types and severities. The study found that the iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement. The use of the iPhone offers a more convenient way of measuring Cobb angles in the outpatient setting. The intra-observer repeatability of the iPhone is equivalent to that of the protractor in the measurement of Cobb angles.
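A minimal sketch of the kind of power calculation described, using statsmodels. The measurement standard deviation is an assumed figure (the abstract does not report it), and splitting the 240 measurements into two equal groups is likewise an assumption.

```python
from statsmodels.stats.power import TTestIndPower

sd_assumed = 10.5                # assumed SD of Cobb angle measurements (degrees)
effect_size = 5.0 / sd_assumed   # a 5-degree difference, standardised

# 240 measurements assumed split evenly across two groups of 120.
power = TTestIndPower().power(effect_size=effect_size, nobs1=120, alpha=0.05)
print(f"Power with 120 measurements per group: {power:.2f}")  # ~0.96 under these assumptions
```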
Abstract:
DeLone and McLean (1992, p. 16) argue that the concept of “system use” has suffered from a “too simplistic definition.” Despite decades of substantial research on system use, the concept is yet to receive strong theoretical scrutiny: many measures of system use have been developed idiosyncratically and lack credibility or comparability. This paper reviews various attempts at conceptualising and measuring system use and then proposes a re-conceptualisation of it as “the level of incorporation of an information system within a user’s processes.” The definition is supported by work-systems theory, systems theory, and Key-User-Group considerations. We then develop the concept of a Functional-Interface-Point (FIP) and four dimensions of system usage: extent, the proportion of the FIPs used by the business process; frequency, the rate at which FIPs are used by the participants in the process; thoroughness, the level of use of the information/functionality provided by the system at an FIP; and attitude towards use, a set of measures that assess the level of comfort, degree of respect, and the challenges set forth by the system. The paper argues that the automation level, the proportion of the business process encoded by the information system, has a mediating impact on system use. The article concludes with a discussion of some implications of this re-conceptualisation and areas for follow-on research.
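The extent, frequency, and thoroughness dimensions lend themselves to simple ratio measures. Below is a minimal sketch of one possible operationalisation; the data structure and counting rules are illustrative assumptions, not the paper's instrument (attitude towards use is omitted because it is survey-based).

```python
from dataclasses import dataclass

@dataclass
class FipUsage:
    fip_id: str
    invocations: int          # times this FIP was invoked in the period
    features_used: int        # distinct features of the FIP actually exercised
    features_available: int   # features the FIP exposes

def usage_dimensions(fips, total_fips_in_process, period_days):
    """Illustrative ratios for extent, frequency, and thoroughness.

    Assumes at least one FIP was used during the period.
    """
    used = [f for f in fips if f.invocations > 0]
    extent = len(used) / total_fips_in_process          # share of FIPs touched
    frequency = sum(f.invocations for f in used) / period_days   # uses per day
    thoroughness = (sum(f.features_used for f in used)
                    / sum(f.features_available for f in used))
    return extent, frequency, thoroughness

logs = [FipUsage("order_entry", 140, 6, 8), FipUsage("reporting", 0, 0, 12)]
print(usage_dimensions(logs, total_fips_in_process=4, period_days=30))
```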
Abstract:
Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, great challenges arise in maintaining high data quality under rapid data throughput and heterogeneous access. Yet, despite the importance of data quality, the literature has usually reduced data quality to detecting and correcting poor data such as outliers and incomplete or inaccurate values. As a result, organisations are unable to assess data quality efficiently and effectively. An accurate and proper data quality assessment method would enable users to benchmark their systems and monitor their improvement. This paper introduces a granule-mining approach for measuring the degree of randomness in erroneous data, enabling decision makers to conduct accurate quality assessments and locate the most severely affected data, thereby providing an accurate estimation of the human and financial resources needed for quality improvement tasks.
Abstract:
BACKGROUND Measurement of the global burden of disease with disability-adjusted life-years (DALYs) requires disability weights that quantify health losses for all non-fatal consequences of disease and injury. There has been extensive debate about a range of conceptual and methodological issues concerning the definition and measurement of these weights. Our primary objective was a comprehensive re-estimation of disability weights for the Global Burden of Disease Study 2010 through a large-scale empirical investigation in which judgments about health losses associated with many causes of disease and injury were elicited from the general public in diverse communities through a new, standardised approach. METHODS We surveyed respondents in two ways: household surveys of adults aged 18 years or older (face-to-face interviews in Bangladesh, Indonesia, Peru, and Tanzania; telephone interviews in the USA) between Oct 28, 2009, and June 23, 2010; and an open-access web-based survey between July 26, 2010, and May 16, 2011. The surveys used paired comparison questions, in which respondents considered two hypothetical individuals with different, randomly selected health states and indicated which person they regarded as healthier. The web survey added questions about population health equivalence, which compared the overall health benefits of different life-saving or disease-prevention programmes. We analysed paired comparison responses with probit regression analysis on all 220 unique states in the study. We used results from the population health equivalence responses to anchor the results from the paired comparisons on the disability weight scale from 0 (implying no loss of health) to 1 (implying a health loss equivalent to death). Additionally, we compared new disability weights with those used in WHO's most recent update of the Global Burden of Disease Study for 2004. FINDINGS 13,902 individuals participated in household surveys and 16,328 in the web survey. Analysis of paired comparison responses indicated a high degree of consistency across surveys: correlations between individual survey results and results from analysis of the pooled dataset were 0·9 or higher in all surveys except in Bangladesh (r=0·75). Most of the 220 disability weights were located on the mild end of the severity scale, with 58 (26%) having weights below 0·05. Five states had weights below 0·01, such as mild anaemia, mild hearing or vision loss, and secondary infertility. The health states with the highest disability weights were acute schizophrenia (0·76) and severe multiple sclerosis (0·71). We identified a broad pattern of agreement between the old and new weights (r=0·70), particularly in the moderate-to-severe range. However, in the mild range below 0·2, many states had significantly lower weights in our study than previously. INTERPRETATION This study represents the most extensive empirical effort as yet to measure disability weights. By contrast with the popular hypothesis that disability assessments vary widely across samples with different cultural environments, we have reported strong evidence of highly consistent results.
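The paired-comparison analysis can be reproduced in miniature as a probit regression on a design matrix with +1/-1 entries for the two health states in each comparison. The sketch below uses toy responses and statsmodels to recover un-anchored latent values relative to a reference state; the anchoring step to the 0-1 disability weight scale is omitted.

```python
import numpy as np
import statsmodels.api as sm

# Toy paired-comparison data: (state_a, state_b, a_judged_healthier).
responses = [
    (0, 1, 1), (0, 1, 1), (0, 1, 0),
    (1, 2, 1), (1, 2, 0), (1, 2, 1),
    (0, 2, 1), (0, 2, 0), (0, 2, 1),
]
n_states = 3

X = np.zeros((len(responses), n_states))
y = np.array([r[2] for r in responses], dtype=float)
for i, (a, b, _) in enumerate(responses):
    X[i, a], X[i, b] = 1.0, -1.0   # P(a healthier) = Phi(h_a - h_b)

# Fix the latent scale's origin by dropping state 0 (the reference).
fit = sm.Probit(y, X[:, 1:]).fit(disp=False)
print(fit.params)  # latent healthiness of states 1..2 relative to state 0
```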
Abstract:
The characterisation of facial expression through landmark-based analysis methods such as FACEM (Pilowsky & Katsikitis, 1994) has a variety of uses in psychiatric and psychological research. In these systems, important structural relationships are extracted from images of facial expressions by the analysis of a pre-defined set of feature points. These relationship measures may then be used, for instance, to assess the degree of variability and similarity between different facial expressions of emotion. FaceXpress is a multimedia software suite that provides a generalised workbench for landmark-based facial emotion analysis and stimulus manipulation. It is a flexible tool that is designed to be specialised at runtime by the user. While FaceXpress has been used to implement the FACEM process, it can also be configured to support any other similar, arbitrary system for quantifying human facial emotion. FaceXpress also implements an integrated set of image processing tools and specialised tools for facial expression stimulus production including facial morphing routines and the generation of expression-representative line drawings from photographs.
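A landmark-based measure of this kind ultimately reduces to distances (or ratios) between predefined feature points. A minimal sketch follows, with hypothetical landmark names rather than the actual FACEM point set:

```python
import math

def landmark_distances(points, pairs):
    """Euclidean distances between selected facial feature points.

    points: dict mapping landmark name -> (x, y) image coordinates.
    pairs: list of (name_a, name_b) relationships to measure.
    Landmark names here are illustrative, not the FACEM definitions.
    """
    return {(a, b): math.dist(points[a], points[b]) for a, b in pairs}

points = {"mouth_left": (112, 210), "mouth_right": (168, 208), "brow_mid": (140, 90)}
print(landmark_distances(points, [("mouth_left", "mouth_right")]))
```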
Exploring variation in measurement as a foundation for statistical thinking in the elementary school
Abstract:
This study was based on the premise that variation is the foundation of statistics and statistical investigations. The study followed the development of fourth-grade students' understanding of variation through participation in a sequence of two lessons based on measurement. In the first lesson all students measured the arm span of one student, revealing pathways students follow in developing understanding of variation and linear measurement (related to research question 1). In the second lesson each student's arm span was measured once, introducing a different aspect of variation for students to observe and contrast. From this second lesson, students' development of the ability to compare their representations for the two scenarios and explain differences in terms of variation was explored (research question 2). Students' documentation, in both workbook and software formats, enabled us to monitor their engagement and identify their increasing appreciation of the need to observe, represent, and contrast the variation in the data. Following the lessons, a written student assessment was used for judging retention of understanding of variation developed through the lessons and the degree of transfer of understanding to a different scenario (research question 3).
Abstract:
Purpose: To investigate whether the accuracy of intraocular pressure (IOP) measurements using rebound tonometry over disposable hydrogel (etafilcon A) contact lenses (CLs) is affected by the positive power of the CLs. Methods: The experimental group comprised 26 subjects (8 male, 18 female). IOP measurements were undertaken on the subjects’ right eyes in random order using a rebound tonometer (ICare). The CLs had powers of +2.00D and +6.00D. Measurements were taken over each contact lens and also before and after the CLs had been worn. Results: The IOP measured with both CLs was significantly lower than the value without CLs (t test; p<0.001), but no significant difference was found between the two powers of CLs. Conclusions: Rebound tonometry over positive hydrogel CLs leads to a certain degree of IOP underestimation. This result did not change between the two positive lenses used in the experiment, despite their large difference in power and therefore in lens thickness. Optometrists should bear this in mind when measuring IOP with the rebound tonometer over plus-power contact lenses.
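A minimal sketch of the paired comparison reported (t test; p<0.001), using scipy with illustrative readings rather than the study's data:

```python
from scipy import stats

# Illustrative paired IOP readings (mmHg): the same eyes without and over a CL.
iop_no_cl = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5]
iop_over_cl = [12.9, 13.8, 12.5, 14.6, 13.7, 14.1]

t, p = stats.ttest_rel(iop_no_cl, iop_over_cl)  # paired t test, as in the study
print(f"t = {t:.2f}, p = {p:.4f}")
```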
Abstract:
Aims: The aims of this study were 1) to identify and describe health economic studies that have used quality-adjusted life years (QALYs) based on actual measurements of patients' health-related quality of life (HRQoL); 2) to test the feasibility of routine collection of HRQoL data as an indicator of the effectiveness of secondary health care; and 3) to establish and compare the cost-utility of three large-volume surgical procedures in a real-world setting in the Helsinki University Central Hospital, a large referral hospital providing secondary and tertiary health-care services for a population of approximately 1.4 million. Patients and methods: To identify studies that have used QALYs as an outcome measure, a systematic search of the literature was performed using the Medline, Embase, CINAHL, SCI and Cochrane Library electronic databases. Initial screening of the identified articles involved two reviewers independently reading the abstracts; the full-text articles were also evaluated independently by two reviewers, with a third reviewer used in cases where the two could not reach a consensus on which articles should be included. The feasibility of routinely evaluating the cost-effectiveness of secondary health care was tested by setting up a system for collecting HRQoL data on approximately 4,900 patients before and after operative treatments performed in the hospital. The HRQoL data used as an indicator of treatment effectiveness were combined with diagnostic and financial indicators routinely collected in the hospital. To compare the cost-effectiveness of three surgical interventions, 712 patients admitted for routine operative treatment completed the 15D HRQoL questionnaire before and also 3-12 months after the operation. QALYs were calculated using the obtained utility data and the expected remaining life years of the patients. Direct hospital costs were obtained from the clinical patient administration database of the hospital, and a cost-utility analysis was performed from the perspective of the provider of secondary health care services. Main results: The systematic review (Study I) showed that although QALYs gained are considered an important measure of the effectiveness of health care, the number of studies in which QALYs are based on actual measurements of patients' HRQoL is still fairly limited. Of the reviewed full-text articles, only 70 reported QALYs based on actual before-after measurements using a valid HRQoL instrument. Collection of simple cost-effectiveness data in secondary health care is feasible and could easily be expanded and performed on a routine basis (Study II). It allows meaningful comparisons between various treatments and provides a means for allocating limited health care resources. The cost per QALY gained was €2,770 for cervical operations and €1,740 for lumbar operations; in cases where surgery was delayed, the cost per QALY doubled (Study III). The cost per QALY varied between subgroups in cataract surgery (Study IV): it was €5,130 for patients having both eyes operated on and €8,210 for patients with only one eye operated on during the 6-month follow-up. In patients whose first eye had been operated on prior to the study period, the mean HRQoL deteriorated after surgery, precluding the establishment of the cost per QALY.
In arthroplasty patients (Study V), the mean cost per QALY gained over a one-year period was €6,710 for primary hip replacement, €52,270 for revision hip replacement, and €14,000 for primary knee replacement. Conclusions: Although the importance of cost-utility analyses has been stressed in recent years, there are only a limited number of studies in which the evaluation is based on patients' own assessment of treatment effectiveness. Most cost-effectiveness and cost-utility analyses are based on modelling that employs expert opinion regarding the outcome of treatment, not on patient-derived assessments. Routine collection of effectiveness information from patients entering treatment in secondary health care turned out to be straightforward and did not, for instance, require additional personnel on the wards in which the study was executed. The mean patient response rate was more than 70%, suggesting that patients were happy to participate and appreciated the fact that the hospital showed an interest in their well-being even after the actual treatment episode had ended. Spinal surgery led to a statistically significant and clinically important improvement in HRQoL. The cost per QALY gained was reasonable, at less than half of that observed, for instance, for hip replacement surgery. However, prolonged waiting for an operation approximately doubled the cost per QALY gained from the surgical intervention. The mean utility gain following routine cataract surgery in a real-world setting was relatively small and confined mostly to patients who had had both eyes operated on. The cost of cataract surgery per QALY gained was higher than previously reported and was associated with a considerable degree of uncertainty. Hip and knee replacement both improve HRQoL; the cost per QALY gained from knee replacement is two-fold that of hip replacement. Cost-utility results from the three studied specialties showed that there is great variation in the cost-utility of surgical interventions performed in a real-world setting, even when only common, widely accepted interventions are considered. However, the cost per QALY of all the studied interventions, except revision hip arthroplasty, was well below €50,000, a figure sometimes cited in the literature as a threshold for the cost-effectiveness of an intervention. Based on the present study, it may be concluded that routine evaluation of the cost-utility of secondary health care is feasible and produces information essential for a rational and balanced allocation of scarce health care resources.
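The cost-utility arithmetic the abstract describes is simple: QALYs gained are the utility change multiplied by expected remaining life years, and the cost per QALY is the direct cost divided by that gain. A minimal sketch with illustrative figures (not the study's data), ignoring discounting:

```python
def cost_per_qaly(utility_before, utility_after, remaining_life_years, direct_cost):
    """Cost per QALY gained: utility change (e.g. from the 15D instrument)
    times expected remaining life years, with discounting omitted."""
    qalys_gained = (utility_after - utility_before) * remaining_life_years
    return direct_cost / qalys_gained

# Illustrative figures only: 0.08 utility gain over 10 years, €2,200 direct cost.
print(round(cost_per_qaly(0.80, 0.88, 10.0, 2200.0)))  # -> 2750 per QALY
```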
Abstract:
Controlling the morphological structure of titanium dioxide (TiO2) is crucial for obtaining superior power conversion efficiency in dye-sensitized solar cells. Although sol-gel-based processes have been developed for this purpose, there has been limited success in resisting the aggregation of nanostructured TiO2, which could act as an obstacle to mass production. Herein, we report a simple approach to improving the efficiency of dye-sensitized solar cells (DSSCs) by controlling the degree of aggregation and particle surface charge through zeta potential analysis. We found that different aqueous colloidal conditions, i.e., potential of hydrogen (pH), water/titanium alkoxide (titanium isopropoxide) ratio, and surface charge, led to markedly different particle sizes in the range of 10-500 nm. We have also shown that particles prepared under acidic conditions are more effective for DSSC application with regard to the modification of surface charges to improve dye loading and electron injection rate properties. A power conversion efficiency of 6.54%, open-circuit voltage of 0.73 V, short-circuit current density of 15.32 mA/cm2, and fill factor of 0.73 were obtained using anatase TiO2 optimized to 10-20 nm in size, together with a compact TiO2 blocking layer.
Abstract:
The purpose of this article is to report the experience of designing and testing orifice plate-based flow measuring systems for the evaluation of air leakages in components of air conditioning systems. Two of the flow measuring stations were designed with beta values of 0.405 and 0.418. The third was a dual-path unit with orifice plates of beta values 0.613 and 0.525. The flow rates covered by all four orifice plates ranged from 4 to 94 l/s, with Reynolds numbers from 5,600 to 76,000. The coefficients of discharge were evaluated and compared with the Stolz equation. Measured Cd values are generally higher than those obtained from the equation, the deviations being larger in the low Reynolds number region. Further, it is observed that a second-degree polynomial is inadequate to relate the pressure drop and flow rate. The lower Reynolds number limits set by standards appear to be somewhat conservative.
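For reference, the standard orifice-plate equation (ISO 5167 form) relates volumetric flow to differential pressure through the discharge coefficient and beta value; the deviations reported above are between measured Cd and the Stolz correlation's prediction of it. A minimal sketch of the flow computation, with illustrative values chosen to fall in the reported 4-94 l/s range:

```python
import math

def orifice_flow(cd, d_orifice, d_pipe, dp, rho):
    """Volumetric flow (m^3/s) through an orifice plate, ISO 5167 form.

    cd: discharge coefficient, dp: differential pressure (Pa),
    rho: fluid density (kg/m^3), beta = d_orifice / d_pipe.
    """
    beta = d_orifice / d_pipe
    area = math.pi * d_orifice**2 / 4.0
    return cd / math.sqrt(1.0 - beta**4) * area * math.sqrt(2.0 * dp / rho)

# Example: beta = 0.405 station, air at ~1.2 kg/m^3, 250 Pa differential.
print(orifice_flow(0.60, 0.0405, 0.100, 250.0, 1.2))  # ~0.016 m^3/s, i.e. ~16 l/s
```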
Abstract:
The present work presents the results of an experimental investigation of semi-solid rheocasting of A356 Al alloy using a cooling slope. The experiments were carried out following the Taguchi method of parameter design (an orthogonal array of L9 experiments). Four key process variables (slope angle, pouring temperature, wall temperature, and length of travel of the melt) at three different levels were considered. Regression analysis and analysis of variance (ANOVA) were also performed, respectively, to develop a mathematical model for the degree of sphericity of the primary alpha-Al phase and to find the significance and percentage contribution of each process variable towards the final degree of sphericity. The best processing condition for optimum degree of sphericity (0.83) was identified as A3, B3, C2, D1, i.e., a slope angle of 60 degrees, pouring temperature of 650 degrees C, wall temperature of 60 degrees C, and 500 mm length of travel of the melt, based on mean response and signal-to-noise ratio (SNR). The ANOVA results show that the length of travel has the maximum impact on the evolution of the degree of sphericity. The sphericity predicted by the developed regression model and the experimentally obtained values are in good agreement. The sphericity values obtained from the confirmation experiment, performed at the 95% confidence level, confirm that the optimum result is correct and that the confirmation experiment values are within permissible limits.
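In Taguchi parameter design, a larger-the-better response such as degree of sphericity uses the signal-to-noise ratio SNR = -10 * log10(mean(1/y_i^2)) over the replicates of each trial. A minimal sketch with illustrative replicate values (not the study's data):

```python
import math

def snr_larger_is_better(responses):
    """Taguchi larger-the-better signal-to-noise ratio (dB):
    SNR = -10 * log10( mean(1 / y_i^2) )."""
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / y**2 for y in responses) / n)

# Illustrative sphericity replicates for one L9 trial:
print(round(snr_larger_is_better([0.81, 0.83, 0.79]), 2))  # -> -1.84 dB
```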
Abstract:
Thermally induced recovery of nanoindents in a CuAlNi single crystal shape memory alloy was studied by nanoindentation in conjunction with a heating stage. Nanoindents formed by a Berkovich indenter at room temperature were heated to 40, 70 and 100 degrees C. Partial recovery was observed for the nanoindents, with a recovery ratio that depended on the heating temperature. Indentation of CuAlNi can induce inelastic deformation via dislocation motion and a stress-induced martensitic transformation. The proportion of dislocation-induced plastic strain affects the thermally induced recovery of CuAlNi, because the induced dislocations can stabilize stress-induced martensite plates even when the temperature is above the austenite finish temperature, A_f. When the applied indentation load is low (less than 10,000 μN), the shape recovery strain is predominant compared with the dislocation-induced plastic strain. Therefore, the degree of indent recovery in the depth direction, δ_D, is high (about 0.7-0.8 at 100 degrees C).
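A minimal sketch of the depth recovery ratio δ_D as described here, with illustrative residual depths (not the study's measurements):

```python
def depth_recovery_ratio(depth_before_heating, depth_after_heating):
    """Degree of indent recovery in the depth direction, delta_D:
    the fraction of the residual indent depth recovered on heating."""
    return (depth_before_heating - depth_after_heating) / depth_before_heating

# Illustrative residual depths (nm) before and after heating to 100 degrees C:
print(depth_recovery_ratio(250.0, 60.0))  # -> 0.76, within the reported 0.7-0.8 band
```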