921 results for test development
Abstract:
The purpose of the present study is to test the case linkage principles of behavioural consistency and behavioural distinctiveness using serial vehicle theft data. Data from 386 solved vehicle thefts committed by 193 offenders were analysed using Jaccard's coefficient, regression and Receiver Operating Characteristic (ROC) analyses to determine whether objectively observable aspects of crime scene behaviour could be used to distinguish crimes committed by the same offender from those committed by different offenders. The findings indicate that spatial behaviour, specifically the distance between theft locations and between dump locations, is a highly consistent and distinctive aspect of vehicle theft behaviour; thus, intercrime and interdump distance represent the most useful aspects of vehicle theft for the purpose of case linkage analysis. The findings have theoretical and practical implications for the understanding of criminal behaviour and for the development of decision-support tools to assist police investigation and apprehension of serial vehicle theft offenders.
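The linkage logic described in this abstract, scoring crime pairs for behavioural similarity and then testing how well the scores separate same-offender from different-offender pairs, can be sketched as follows. The behaviour sets and scores below are hypothetical, not the study's data; Jaccard's coefficient and the rank-based area under the ROC curve are the standard definitions.

```python
def jaccard(a, b):
    """Jaccard coefficient: shared behaviours over all behaviours observed."""
    return len(a & b) / len(a | b)

def auc(linked_scores, unlinked_scores):
    """Probability that a randomly chosen linked pair scores higher than a
    randomly chosen unlinked pair; equivalent to the area under the ROC curve."""
    wins = sum(1.0 if l > u else 0.5 if l == u else 0.0
               for l in linked_scores for u in unlinked_scores)
    return wins / (len(linked_scores) * len(unlinked_scores))

# Hypothetical crime-scene behaviour sets for two vehicle thefts
crime_a = {"night", "street_parked", "window_forced"}
crime_b = {"night", "street_parked", "door_lock_drilled"}
similarity = jaccard(crime_a, crime_b)  # 2 shared of 4 total = 0.5
```

A perfectly discriminating similarity measure gives an AUC of 1.0; chance performance gives 0.5.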
Abstract:
CONTEXT: The homeless are a significant and growing group within society. They have demonstrably greater physical and mental health needs than the housed, yet often have difficulty accessing primary health care. Medical 'reluctance' to look after homeless people is increasingly suggested as part of the problem, and medical education may have a role in ameliorating this. OBJECTIVES: This paper reports on the development and validation of a questionnaire specifically designed to measure medical students' attitudes towards the homeless. METHOD AND RESULTS: The Attitudes Towards the Homeless Questionnaire, developed using the views of over 370 medical students, was shown to have a Pearson test-retest reliability correlation coefficient of 0.8 and a Cronbach's alpha coefficient of 0.74. CONCLUSIONS: The Attitudes Towards the Homeless Questionnaire appears to be a valid and reliable instrument which can measure students' attitudes towards the homeless. It could be a useful tool in assessing the effectiveness of educational interventions.
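Cronbach's alpha, the internal-consistency statistic reported above, can be computed directly from an item-score matrix. A minimal sketch; the scores are invented for illustration:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for k items.

    item_scores: one list of scores per item, aligned by respondent.
    Population variances are used throughout; since alpha depends only
    on a variance ratio, sample variances would give the same result.
    """
    k = len(item_scores)
    item_var = sum(pvariance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Two perfectly correlated items yield alpha = 1.0
alpha_perfect = cronbach_alpha([[1, 2, 3], [1, 2, 3]])
```

Values around 0.7 and above, like the 0.74 reported here, are conventionally taken to indicate acceptable internal consistency.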
Abstract:
The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), the latter as part of complexity theory focused on human social organisations. The literature reviewed showed that there is a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights into the focus area of study. It was undertaken from within an interpretivist paradigm, and based on a novel conceptual framework. The organisationally intimate nature of the research topic, and the researcher's circumstances, required a research design that was both in-depth and long-term. The result was a single, exploratory case study, which included use of data from 44 in-depth, semi-structured interviews with 36 people, involving all the top management team members and significant other staff members; observations, rumour and grapevine (ORG) data; and archive data, over a 5½-year period (2005-2010). Findings confirm the validity of the conceptual framework, and that complex adaptive systems theory has the potential to extend strategy development process theory. It has shown how and why the strategy process developed in the case study organisation by providing deeper insights into the behaviour of the people, their backgrounds, and interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, it is possible to extend the utility of the SDP model by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of the application of the conceptual framework and improve its efficacy with more organisations across a variety of sectors.
Abstract:
Purpose: To develop a questionnaire that subjectively assesses near visual function in patients with 'accommodating' intraocular lenses (IOLs). Methods: A literature search of existing vision-related quality-of-life instruments identified all questions relating to near visual tasks. Questions were combined if repeated in multiple instruments. Further relevant questions were added and item interpretation confirmed through multidisciplinary consultation and focus groups. A preliminary 19-item questionnaire was presented to 22 subjects at their 4-week visit post first eye phacoemulsification with 'accommodative' IOL implantation, and again 6 and 12 weeks post-operatively. Rasch Analysis, Frequency of Endorsement, and tests of normality (skew and kurtosis) were used to reduce the instrument. Cronbach's alpha and test-retest reliability (intraclass correlation coefficient, ICC) were determined for the final questionnaire. Construct validity was obtained by Pearson's product moment correlation (PPMC) of questionnaire scores to reading acuity (RA) and to Critical Print Size (CPS) reading speed. Criterion validity was obtained by receiver operating characteristic (ROC) curve analysis and dimensionality of the questionnaire was assessed by factor analysis. Results: Rasch Analysis eliminated nine items due to poor fit statistics. The final items have good separation (2.55), internal consistency (Cronbach's α = 0.97) and test-retest reliability (ICC = 0.66). PPMC of questionnaire scores with RA was 0.33, and with CPS reading speed was 0.08. Area under the ROC curve was 0.88 and Factor Analysis revealed one principal factor. Conclusion: The pilot data indicate that the questionnaire is an internally consistent, reliable and valid instrument that could be useful for assessing near visual function in patients with 'accommodating' IOLs. The questionnaire will now be expanded to include other types of presbyopic correction. © 2007 British Contact Lens Association.
Abstract:
Increased awareness of the crucial role of leadership as a competitive advantage for organisations (McCall, 1998; Petrick, Scherer, Brodzinski, Quinn, & Ainina, 1999) has led to billions spent on leadership development programmes and training (Avolio & Hannah, 2008). However, research reports confusing and contradictory evidence regarding return on investment and developmental outcomes, and considerable variance has been observed across studies (Avolio, Reichard, Hannah, Walumbwa, & Chan, 2009). The purpose of this thesis is to understand the mechanisms underlying this variability in leadership development. Of the many factors at play in the process, such as programme design and delivery, organisational support, and perceptions of relevance (Mabey, 2002; Day, Harrison, & Halpin, 2009), individual differences and characteristics stand out. One way in which individuals differ is in their Developmental Readiness (DR), a concept recently introduced in the literature that may well explain this variance and which has been proposed to accelerate development (Avolio & Hannah, 2008, 2009). Building on previous work, DR is introduced and conceptualised somewhat differently. In this study, DR is construed as comprising self-awareness, self-regulation, and self-motivation, proposed by Day (2000) to be the backbones of leadership development. DR is suggested to moderate the developmental process. Furthermore, personality dispositions and individual values are proposed to be precursors of DR. The empirical research conducted uses a pre-test post-test quasi-experimental design. Before conducting the study, though, both a measure of Developmental Readiness and a competency profiling measure are tested in two pilot studies. Results do not find evidence of a direct effect of leadership development programmes on development, but do support an interactive effect between DR and leadership development programmes.
Personality dispositions Agreeableness, Conscientiousness, and Openness to Experience and value orientations Conservation, Open, and Closed Orientation are found to significantly predict DR. Finally, the theoretical and practical implications of findings are discussed.
The impact of brand owner on consumers' brand perceptions : a development of Heider's Balance Theory
Abstract:
Studies have shown that the brand “owner” is very influential in positioning the brand, and that when the brand “owner” ceases his or her active role the brand will be perceived differently by the consumers. Heider's Balance Theory (HBT), a cognitive psychological theory, studies the triadic relationships between two persons and an entity and predicts that when a person's original perception of the relationship is disturbed, the person restructures to a new balanced perception. Consequently, this research was undertaken to: conceptualize the brand owner's impact on consumers' brand perception; test the applicability of both the static and dynamic predictions of Heider's Balance Theory in the brand owner-consumer-brand relation (OCB); construct and test a model of the brand owner-consumer-brand relation; and examine if personality has an influence on OCB. A discovery-oriented approach was taken to understand the selected market segment, the ready-to-wear and diffusion lines of international designer labels. Chinese Brand Personality Scale, fashion proneness, and hedonic and utilitarian shopping scales were developed and validated. 51 customers were surveyed. Both traditional and extended methods used in the Balance Theory were employed in this study. Responses to a liked brand have been used to test and develop the model, while those for a disliked brand were used for testing and confirmation. A “what if” experimental approach was employed to test the applicability of dynamic HBT theory in the OCB Model. The hypothesized OCB Model has been tested and validated. Consumers have been found to have separate views on the brand and the brand owner; and their responses to contrasting ethical and non-ethical news of the brand owner are different. Personality has been found to have an influence and two personality adapted models have been tested and validated. The actual results go beyond the prediction of the Balance Theory.
Dominant triple positive balance mode, dominant negative balance mode, and mode of extreme antipathy have been found. It has been found that not all balanced modes are good for the brand. Contrary to Heider’s findings, simply liking may not necessarily lead to unit relation in the OCB Model.
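Heider's balance rule referenced above has a compact formal core: a triad is balanced exactly when the product of its three relation signs is positive. A minimal sketch, with the owner-consumer-brand (OCB) triad encoded as +1 (like/positive) or -1 (dislike/negative) relations; the example relations are hypothetical:

```python
def is_balanced(consumer_owner, owner_brand, consumer_brand):
    """Heider's rule: a triad is balanced when the product of its three
    relation signs (+1 positive, -1 negative) is positive."""
    return consumer_owner * owner_brand * consumer_brand > 0

# Consumer likes the owner (+1), owner endorses the brand (+1),
# but consumer dislikes the brand (-1): an imbalanced triad, so the
# theory predicts pressure to restructure one of the relations.
triad_strained = is_balanced(+1, +1, -1)
```

Note that two negative relations with one positive relation also count as balanced under this rule, which is consistent with the abstract's finding that not all balanced modes are good for the brand.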
Abstract:
There has been substantial research into the role of distance learning in education. Despite the rise in the popularity and practice of this form of learning in business, there has not been a parallel increase in the amount of research carried out in this field. An extensive investigation was conducted into the entire distance learning system of a multi-national company, with particular emphasis on the design, implementation and evaluation of the materials. In addition, the performance and attitudes of trainees were examined. The results of a comparative study indicated that trainees using distance learning had significantly higher test scores than trainees using conventional face-to-face training. The influence of previous distance learning experience, educational background and selected study environment of trainees was investigated. Trainees with previous experience of distance learning were more likely to complete the course, and achieved significantly higher test scores than trainees with no previous experience. The more advanced the educational background of trainees, the greater the likelihood of their completing the course, although there was no significant difference in the test scores achieved. Trainees preferred to use the materials at home, and those opting to study in this environment scored significantly higher than those studying in the office, the study room at work or in a combination of environments. The influence of learning styles (Kolb, 1976) was tested. The results indicated that the convergers had the greatest completion rates and scored significantly higher than trainees with the assimilator, accommodator and diverger learning styles. The attitudes of the trainees, supervisors and trainers were examined using questionnaire, interview and discussion techniques. The findings highlighted the potential problems of lack of awareness and low motivation, which could prove to be major obstacles to the success of distance learning in business.
Abstract:
In vitro toxicity tests which detect evidence of the formation of reactive metabolites have previously relied upon cell death as a toxicity end point. Therefore these tests determine cytotoxicity in terms of quantitative changes in specified cell functions. In the studies involving the Caco-2 cell model, there was no significant change in the transport of [3H] L-proline by the cell after co-incubation with either dapsone or cyclophosphamide (50 µM) and a rat liver microsomal metabolite-generating system. The pre-incubation of the cells with N-ethylmaleimide to inhibit Phase II sulphotransferase activity, prior to the microsomal incubations, resulted in cytotoxicity in all incubation groups. Studies involving the L6 cell model showed that there was no significant effect on the cell signalling pathway producing the second messenger cAMP, after incubation with dapsone or cyclophosphamide (50 µM) and the rat microsomal metabolite-generating system. There was also no significant effect on the vasopressin-stimulated production of the second messenger IP3, after incubation with the hydroxylamine metabolite of dapsone, although there were some morphological changes observed with the cells at the highest concentration of dapsone hydroxylamine (100 µM). With the test involving the NG115-401L-C3 cell model, there were no significant changes in DNA synthesis in terms of [3H] thymidine incorporation, after co-incubation with either phenytoin or cyclophosphamide (50 µM) and the rat microsomal metabolite-generating system. In the one-compartment erythrocyte studies, there were significant decreases in glutathione with cyclophosphamide (50 µM) (0.44 ± 0.04 mM), sulphamethoxazole (50 µM) (0.43 ± 0.08 mM) and carbamazepine (50 µM) (0.47 ± 0.034 mM), when co-incubated with the rat microsomal system, compared to the control (0.52 ± 0.07 mM). There was no significant depletion in glutathione when the erythrocytes were co-incubated with phenytoin and the rat microsomal system.
In the two-compartment erythrocyte studies, there was a significant decrease in the erythrocyte glutathione with cyclophosphamide (50 µM) (0.953 ± 0.110 mM) when co-incubated with the rat microsomal system, compared to the control (1.124 ± 0.032 mM). Differences were considered statistically significant for p < 0.05, using Student's two-tailed t-test with Bonferroni's correction. There was no significant depletion of glutathione with phenytoin, carbamazepine and sulphamethoxazole when co-incubated with the rat microsomal system, compared to the control.
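The significance testing above uses Bonferroni's correction to control the family-wise error rate across multiple compound comparisons. A minimal sketch of the adjustment; the raw p-values are hypothetical, not taken from the study:

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction for m comparisons: multiply each raw p-value
    by m (capped at 1.0), which is equivalent to testing each raw p-value
    against alpha/m. Returns the adjusted p-values and significance flags."""
    m = len(p_values)
    adjusted = [min(1.0, p * m) for p in p_values]
    return adjusted, [p_adj < alpha for p_adj in adjusted]

# Hypothetical raw p-values from four compound comparisons
raw = [0.004, 0.03, 0.2, 0.012]
adjusted, significant = bonferroni(raw)
# 0.004 * 4 = 0.016 stays below 0.05, but 0.03 * 4 = 0.12 does not
```

The correction is conservative: a comparison that is significant at p < 0.05 on its own can lose significance once the number of simultaneous tests is accounted for.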
Abstract:
The human NT2.D1 cell line was differentiated to form both a 1:2 co-culture of post-mitotic NT2 neuronal and NT2 astrocytic (NT2.N/A) cells and a pure NT2.N culture. The respective sensitivities to several test chemicals of the NT2.N/A, the NT2.N, and the NT2.D1 cells were evaluated and compared with the CCF-STTG1 astrocytoma cell line, using a combination of basal cytotoxicity and biochemical endpoints. Using the MTT assay, the basal cytotoxicity data estimated the comparative toxicities of the test chemicals (chronic neurotoxin 2,5-hexanedione, cytotoxins 2,3- and 3,4-hexanedione and acute neurotoxins tributyltin- and trimethyltin- chloride) and also provided the non-cytotoxic concentration-range for each compound. Biochemical endpoints examined over the non-cytotoxic range included assays for ATP levels, oxidative status (H2O2 and GSH levels) and caspase-3 levels as an indicator of apoptosis. Although the endpoints did not demonstrate the known neurotoxicants to be consistently more toxic to the cell systems with the greatest number of neuronal properties, the NT2 astrocytes appeared to contribute positively to NT2 neuronal health following exposure to all the test chemicals. The NT2.N/A co-culture generally maintained superior ATP and GSH levels and reduced H2O2 levels in comparison with the NT2.N mono-culture. In addition, the pure NT2.N culture showed a significantly lower level of caspase-3 activation compared with the co-culture, suggesting NT2 astrocytes may be important in modulating the mode of cell death following toxic insult. Overall, these studies provide evidence that an in vitro integrated population of post-mitotic human neurons and astrocytes may offer significant relevance to the human in vivo heterogeneous nervous system, when initially screening compounds for acute neurotoxic potential.
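The step of deriving a non-cytotoxic concentration range from basal cytotoxicity (MTT) data can be sketched as finding the highest tested concentration at which viability stays at or above a cut-off. Both the viability figures and the 80% threshold below are assumptions for illustration, not the study's criteria:

```python
def non_cytotoxic_max(viability_by_conc, threshold=0.8):
    """Highest tested concentration at which viability (fraction of the
    vehicle control, e.g. from an MTT assay) is at or above the threshold.
    Returns None if every tested concentration falls below it."""
    ok = [c for c, v in sorted(viability_by_conc.items()) if v >= threshold]
    return max(ok) if ok else None

# Hypothetical MTT data: concentration (uM) -> fraction of control viability
mtt = {1: 0.99, 10: 0.95, 50: 0.86, 100: 0.61, 250: 0.22}
upper_limit = non_cytotoxic_max(mtt)  # biochemical endpoints run at <= 50 uM
```

Biochemical endpoints (ATP, GSH, H2O2, caspase-3) would then be examined only over concentrations up to this limit, as the abstract describes.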
Abstract:
A description of the background to testing friction materials for automotive brakes explains the need for a rapid, inexpensive means of assessing their behaviour in a way which is both accurate and meaningful. Various methods of controlling inertia dynamometers to simulate road vehicles are rejected in favour of programming by means of a commercially available XY plotter. Investigation of brake service conditions is used to set up test schedules, and a dynamometer programming unit built to enable service conditions on vehicles to be simulated on a full scale dynamometer. A technique is developed by which accelerated testing can be achieved without operating under overload conditions, saving time and cost without sacrificing validity. The development of programming by XY plotter is described, with a method of operating one XY plotter to programme the machine, monitor its own behaviour, and plot its own results in logical sequence. Commissioning trials are described and the generation of reproducible results in frictional behaviour and material durability is discussed. Techniques are developed to cross check the operation of the machine in retrospect, and retrospectively correct results in the event of malfunctions. Sensitivity errors in the measuring circuits are displayed between calibrations, whilst leaving the recorded results almost unaffected by error. Typical results of brake lining tests are used to demonstrate the range of performance parameters which can be studied by use of the machine. Successful test investigations completed on the machine are reported, including comments on behaviour of cast iron drums and discs. The machine shows that materials can repeat their complex friction/temperature/speed/pressure relationships at a reproducibility of the order of ±0.003 µ and ±0.0002 in. thickness loss during wear tests. Discussion of practical and academic implications completes the report with recommendations for further work in both fields.
Abstract:
Distortion or deprivation of vision during an early `critical' period of visual development can result in permanent visual impairment which indicates the need to identify and treat visually at-risk individuals early. A significant difficulty in this respect is that conventional, subjective methods of visual acuity determination are ineffective before approximately three years of age. In laboratory studies, infant visual function has been quantified precisely, using objective methods based on visual evoked potentials (VEP), preferential looking (PL) and optokinetic nystagmus (OKN) but clinical assessment of infant vision has presented a particular difficulty. An initial aim of this study was to evaluate the relative clinical merits of the three techniques. Clinical derivatives were devised, the OKN method proved unsuitable but the PL and VEP methods were evaluated in a pilot study. Most infants participating in the study had known ocular and/or neurological abnormalities but a few normals were included for comparison. The study suggested that the PL method was more clinically appropriate for the objective assessment of infant acuity. A study of normal visual development from birth to one year was subsequently conducted. Observations included cycloplegic refraction, ophthalmoscopy and preferential looking visual acuity assessment using horizontally and vertically oriented square wave gratings. The aims of the work were to investigate the efficiency and sensitivity of the technique and to study possible correlates of visual development. The success rate of the PL method varied with age; 87% of newborns and 98% of infants attending follow-up successfully completed at least one acuity test. Below two months monocular acuities were difficult to secure; infants were most testable around six months. The results produced were similar to published data using the acuity card procedure and slightly lower than, but comparable with acuity data derived using extended PL methods. 
Acuity development was not impaired in infants found to have retinal haemorrhages as newborns. A significant relationship was found between newborn binocular acuity and anisometropia but not with other refractive findings. No strong or consistent correlations between grating acuity and refraction were found for three-, six- or twelve-month-olds. Improvements in acuity and decreases in levels of hyperopia over the first week of life were suggestive of recovery from minor birth trauma. The refractive data were analysed separately to investigate the natural history of refraction in normal infants. Most newborns (80%) were hyperopic; significant astigmatism was found in 86% and significant anisometropia in 22%. No significant alteration in spherical equivalent refraction was noted between birth and three months; a significant reduction in hyperopia was evident by six months and this trend continued until one year. Observations on the astigmatic component of the refractive error revealed a rather erratic series of changes which would be worthy of further investigation, since a repeat refraction study suggested difficulties in obtaining stable measurements in newborns. Astigmatism tended to decrease between birth and three months, increased significantly from three to six months and decreased significantly from six to twelve months. A constant decrease in the degree of anisometropia was evident throughout the first year. These findings have implications for the correction of infantile refractive error.
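The refractive quantities discussed above rest on two standard clinical definitions: the spherical equivalent (sphere plus half the cylinder, in dioptres) and anisometropia (the inter-ocular difference in spherical equivalent). A minimal sketch with an invented newborn refraction:

```python
def spherical_equivalent(sphere, cylinder):
    """Spherical equivalent refraction in dioptres: sphere + cylinder/2."""
    return sphere + cylinder / 2.0

def anisometropia(se_right, se_left):
    """Inter-ocular difference in spherical equivalent refraction (D)."""
    return abs(se_right - se_left)

# A hypothetical hyperopic, astigmatic newborn refraction (right / left eye)
se_od = spherical_equivalent(+3.00, -1.50)  # +2.25 D
se_os = spherical_equivalent(+1.75, -0.50)  # +1.50 D
aniso = anisometropia(se_od, se_os)         # 0.75 D
```

What counts as "significant" astigmatism or anisometropia is a study-specific threshold; the abstract does not state the cut-offs used, so none are assumed here.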
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the `classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful since there is very little other data available from which to produce an estimate. 
A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially-developed LPA Prolog programs. By re-defining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
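The idea of re-defining a "classic" metric for Prolog can be illustrated with McCabe's cyclomatic complexity. The counting rule below (one path per clause of a predicate, plus one per `;` disjunction or `->` if-then in the clause bodies) is an assumed re-definition for illustration only, not the thesis's actual counting rules:

```python
def prolog_cyclomatic(clauses):
    """A McCabe-style count for one Prolog predicate, given its clauses
    as source strings: each clause contributes one path, and each ';'
    (disjunction) or '->' (if-then) adds a decision point."""
    decisions = sum(c.count(";") + c.count("->") for c in clauses)
    return len(clauses) + decisions

# A hypothetical two-clause predicate with one disjunction and one if-then
classify = [
    "classify(X, small) :- X < 10.",
    "classify(X, Label) :- ( X < 100 -> Label = medium ; Label = large ).",
]
complexity = prolog_cyclomatic(classify)  # 2 clauses + 2 decisions = 4
```

For imperative code, McCabe's metric counts branch points in the control-flow graph; mapping clause selection and control constructs onto "decisions" is exactly the kind of re-definition the thesis investigates.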
Abstract:
Analysis of the use of ICT in the aerospace industry has prompted the detailed investigation of an inventory-planning problem. There is a special class of inventory, consisting of expensive repairable spares for use in support of aircraft operations. These items, called rotables, are not well served by conventional theory and systems for inventory management. The context of the problem, the aircraft maintenance industry sector, is described in order to convey some of its special characteristics in the context of operations management. A literature review is carried out to seek existing theory that can be applied to rotable inventory and to identify a potential gap into which newly developed theory could contribute. Current techniques for rotable planning are identified in industry and the literature: these methods are modelled and tested using inventory and operational data obtained in the field. In the expectation that current practice leaves much scope for improvement, several new models are proposed. These are developed and tested on the field data for comparison with current practice. The new models are revised following testing to give improved versions. The best model developed and tested here comprises a linear programming optimisation, which finds an optimal level of inventory for multiple test cases, reflecting changing operating conditions. The new model offers an inventory plan that is up to 40% less expensive than that determined by current practice, while maintaining required performance.
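The optimisation idea behind the best model, finding the least expensive rotable stock level that still meets required performance across multiple test cases, can be sketched with a brute-force search standing in for the thesis's linear programme. The demand figures, the fill-rate target and the search bound are assumptions for illustration:

```python
def min_stock_level(demand_scenarios, target_fill=0.95, max_stock=200):
    """Smallest stock level whose overall fill rate (fraction of demand
    met from stock) reaches the target across all demand scenarios.
    A brute-force stand-in for a linear-programming formulation."""
    total = sum(demand_scenarios)
    for stock in range(max_stock + 1):
        met = sum(min(d, stock) for d in demand_scenarios)
        if total == 0 or met / total >= target_fill:
            return stock
    return None

# Hypothetical weekly demands for one repairable spare under
# changing operating conditions (the "multiple test cases")
demands = [3, 5, 2, 8, 4, 6]
optimal = min_stock_level(demands)  # smallest stock meeting a 95% fill rate
```

An LP solver does the same job over many items and constraints simultaneously, which is what makes the reported 40% cost reduction over current practice plausible at fleet scale.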
Abstract:
A combination of experimental methods was applied at a clogged, horizontal subsurface flow (HSSF) municipal wastewater tertiary treatment wetland (TW) in the UK, to quantify the extent of surface and subsurface clogging which had resulted in undesirable surface flow. The three-dimensional hydraulic conductivity profile was determined using a purpose-made device which recreates the constant head permeameter test in situ. The hydrodynamic pathways were investigated by performing dye tracing tests with Rhodamine WT and a novel multi-channel, data-logging, flow-through fluorimeter which allows synchronous measurements to be taken from a matrix of sampling points. Hydraulic conductivity varied in all planes, with the lowest measurement of 0.1 m d⁻¹ corresponding to the surface layer at the inlet, and the maximum measurement of 1550 m d⁻¹ located at a 0.4 m depth at the outlet. According to dye tracing results, the region where the overland flow ceased received five times the average flow, which then vertically short-circuited below the rhizosphere. The tracer breakthrough curve obtained from the outlet showed that this preferential flow-path accounted for approximately 80% of the flow overall and arrived 8 h before a distinctly separate secondary flow-path. The overall volumetric efficiency of the clogged system was 71% and the hydrology was simulated using a dual-path, dead-zone storage model. It is concluded that uneven inlet distribution, continuous surface loading and high rhizosphere resistance are responsible for the clog formation observed in this system. The average inlet hydraulic conductivity was 2 m d⁻¹, suggesting that current European design guidelines, which predict that the system will reach an equilibrium hydraulic conductivity of 86 m d⁻¹, do not adequately describe the hydrology of mature systems.
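The reported volumetric efficiency follows from the tracer breakthrough curve: the curve's first moment gives the mean residence time, which is compared with the nominal (plug-flow) residence time V/Q. A sketch using a symmetric dummy curve rather than the study's data:

```python
def mean_residence_time(times, conc):
    """First moment of a tracer breakthrough curve via the trapezoidal
    rule: the concentration-weighted average travel time of the dye."""
    num = den = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, conc), zip(times[1:], conc[1:])):
        dt = t1 - t0
        num += 0.5 * (t0 * c0 + t1 * c1) * dt  # moment of each trapezoid
        den += 0.5 * (c0 + c1) * dt            # area of each trapezoid
    return num / den

def volumetric_efficiency(t_mean, t_nominal):
    """Ratio of tracer mean residence time to the nominal residence time;
    values below 1 indicate dead zones or short-circuiting."""
    return t_mean / t_nominal

# Dummy symmetric breakthrough curve: peak at t = 4 h
t_mean = mean_residence_time([0.0, 4.0, 8.0], [0.0, 1.0, 0.0])
```

The nominal residence time used in the denominator is a design quantity (wetland pore volume divided by flow rate); it is not restated in the abstract, so no value is assumed here.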
Abstract:
The development of a system that integrates reverse osmosis (RO) with a horticultural greenhouse has been advanced through laboratory experiments. In this concept, intended for the inland desalination of brackish groundwater in dry areas, the RO concentrate will be reduced in volume by passing it through the evaporative cooling pads of the greenhouse. The system will be powered by solar photovoltaics (PV). Using a solar array simulator, we have verified that the RO can operate with varying power input and recovery rates to meet the water demands for irrigation and cooling of a greenhouse in north-west India. Cooling requires ventilation by a fan which has also been built, tested and optimised with a PV module outdoors. Results from the experiments with these two subsystems (RO and fan) are compared to theoretical predictions to reach conclusions about energy usage, sizing and cost. For example, the optimal sizing for the RO system is 0.12–1.3 m² of PV module per m² of membrane, depending on feed salinity. For the fan, the PV module area equals that of the fan aperture. The fan consumes <30 J of electrical energy per m³ of air moved, which is 3 times less than that of standard fans. The specific energy consumption of the RO, at 1–2.3 kWh m⁻³, is comparable to that reported by others. Now that the subsystems have been verified, the next step will be to integrate and test the whole system in the field.
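The two headline energy figures, the RO's specific energy consumption in kWh m⁻³ and the fan's energy per m³ of air moved, are both simple power-to-flow ratios. A sketch with hypothetical operating points chosen to fall inside the ranges reported above:

```python
def specific_energy_kwh_per_m3(power_w, permeate_l_per_h):
    """RO specific energy consumption: electrical power divided by
    permeate flow, converted to kWh per cubic metre of water."""
    flow_m3_per_h = permeate_l_per_h / 1000.0
    return (power_w / 1000.0) / flow_m3_per_h

def fan_energy_j_per_m3(power_w, airflow_m3_per_s):
    """Electrical energy the fan draws per cubic metre of air moved:
    watts divided by volumetric airflow gives joules per m^3 directly."""
    return power_w / airflow_m3_per_s

# Hypothetical operating points (not the paper's measured values)
sec = specific_energy_kwh_per_m3(power_w=300, permeate_l_per_h=200)  # 1.5 kWh/m3
fan = fan_energy_j_per_m3(power_w=50, airflow_m3_per_s=2.0)          # 25 J/m3
```

Both example figures land inside the reported ranges (1–2.3 kWh m⁻³ for the RO, <30 J m⁻³ for the fan), which is why they were chosen; the actual powers and flows in the experiments may differ.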