11 results for Team Evaluation Models

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Publisher:

Abstract:

In the future, the number of disabled drivers requiring a special evaluation of their driving ability will increase owing to the ageing population as well as advances in adaptive technology. This places pressure on the development of the driving evaluation system. Despite quite intensive research, there is still no consensus on what the actual object of a driver evaluation is (methodology), which measures should be included in an evaluation (methods), and how an evaluation should be carried out (practice). In order to answer these questions we carried out empirical studies and simultaneously elaborated a conceptual model of driving and of driving evaluation. The findings of the empirical studies can be condensed into the following points: 1) Driving ability as defined by the on-road driving test is associated with different laboratory measures depending on the study group: faults in the laboratory tests predicted faults in the on-road driving test in the novice group, whereas slowness in the laboratory predicted driving faults in the experienced-driver group. 2) The Parkinson study clearly showed that even an experienced clinician cannot reliably evaluate a disabled person’s driving ability without collaborating with other specialists. 3) The main finding of the stroke study was that the use of a multidisciplinary team as a source of information harmonises the specialists’ evaluations. 4) The patient studies demonstrated that disabled persons themselves, as well as their spouses, are as a rule not reliable evaluators. 5) From the safety point of view, the perceptible operations performed with the control devices are not crucial; the correct mental actions which the driver carries out with the help of those devices are of the greatest importance.
6) Personality factors, including higher-order needs and motives, attitudes, and a degree of self-awareness, particularly a sense of illness, are decisive when evaluating a disabled person’s driving ability. Personality is also the main source of resources for compensating for lower-order physical deficiencies and restrictions. From work with the conceptual model we drew the following methodological conclusions: First, the driver has to be considered as a holistic subject of the activity: a multilevel, hierarchically organised system of an organism, a temperament, an individuality, and a personality, where the personality is the leading subsystem from the standpoint of safety. Second, driving, as a human form of sociopractical activity, is also a hierarchically organised dynamic system. Third, an evaluation of driving ability is a question of matching these two hierarchically organised structures: a subject of an activity and the activity proper. Fourth, an evaluation has to be person-centred, not disease-, function-, or method-centred. On the basis of our study, a multidisciplinary team (practitioner, driving-school teacher, psychologist, occupational therapist) is recommended for demanding driver evaluations. What is primary in a driver evaluation is a coherent conceptual model; the concrete evaluation methods may vary. However, the on-road test must always be performed if possible.

Relevance:

30.00%

Publisher:

Abstract:

An important safety aspect to be considered when foods are enriched with phytosterols and phytostanols is the oxidative stability of these lipid compounds, i.e., their resistance to oxidation and thus to the formation of oxidation products. This study concentrated on producing scientific data to support this safety evaluation process. In the absence of an official method for analyzing phytosterol/stanol oxidation products, we first developed a new gas chromatographic-mass spectrometric (GC-MS) method. We then investigated factors affecting these compounds' oxidative stability in lipid-based food models in order to identify critical conditions under which significant oxidation reactions may occur. Finally, the oxidative stability of phytosterols and stanols in enriched foods during processing and storage was evaluated. The enriched foods covered a range of commercially available phytosterol/stanol ingredients, different heat treatments during food processing, and different multiphase food structures. The GC-MS method was a powerful tool for measuring oxidative stability. Data obtained in the food model studies revealed that the critical factors for the formation and distribution of the main secondary oxidation products were sterol structure, reaction temperature, reaction time, and lipid matrix composition. Under all conditions studied, phytostanols, as saturated compounds, were more stable than unsaturated phytosterols. In addition, esterification made phytosterols more reactive than free sterols at low temperatures, while at high temperatures the situation was reversed. Generally, oxidation reactions were more significant at temperatures above 100°C. At lower temperatures, the significance of these reactions increased with increasing reaction time.
The effect of lipid matrix composition was dependent on temperature; at temperatures above 140°C, phytosterols were more stable in an unsaturated lipid matrix, whereas below 140°C they were more stable in a saturated lipid matrix. At 140°C, phytosterols oxidized at the same rate in both matrices. Regardless of temperature, phytostanols oxidized more in an unsaturated lipid matrix. Generally, the distribution of oxidation products seemed to be associated with the phase of overall oxidation. 7-ketophytosterols accumulated when oxidation had not yet reached the dynamic state. Once this state was attained, the major products were 5,6-epoxyphytosterols and 7-hydroxyphytosterols. The changes observed in phytostanol oxidation products were not as informative since all stanol oxides quantified represented hydroxyl compounds. The formation of these secondary oxidation products did not account for all of the phytosterol/stanol losses observed during the heating experiments, indicating the presence of dimeric, oligomeric or other oxidation products, especially when free phytosterols and stanols were heated at high temperatures. Commercially available phytosterol/stanol ingredients were stable during such food processes as spray-drying and ultra high temperature (UHT)-type heating and subsequent long-term storage. Pan-frying, however, induced phytosterol oxidation and was classified as a rather deteriorative process. Overall, the findings indicated that although phytosterols and stanols are stable in normal food processing conditions, attention should be paid to their use in frying. Complex interactions between other food constituents also suggested that when new phytosterol-enriched foods are developed their oxidative stability must first be established. The results presented here will assist in choosing safe conditions for phytosterol/stanol enrichment.

Relevance:

30.00%

Publisher:

Abstract:

Positron emission tomography (PET) is an imaging technique in which radioactive positron-emitting tracers are used to study biochemical and physiological functions in humans and in animal experiments. The use of PET imaging has increased rapidly in recent years, as have the demands in the fields of neurology and oncology for the development of syntheses of new, more specific and selective radiotracers. Synthesis development and automation are necessary when high amounts of radioactivity are needed for multiple PET studies. In addition, preclinical studies using experimental animal models are necessary for evaluating the suitability of new PET tracers for humans. For purifying and analysing the labelled end-product, an effective radioanalytical method combined with an optimal radioactivity detection technique is of great importance. In this study, a fluorine-18 labelling synthesis method for two tracers was developed and optimized, and the usefulness of these tracers for possible prospective human studies was evaluated. N-(3-[18F]fluoropropyl)-2β-carbomethoxy-3β-(4-fluorophenyl)nortropane ([18F]β-CFT-FP) is a candidate PET tracer for the dopamine transporter (DAT), and 1H-1-(3-[18F]fluoro-2-hydroxypropyl)-2-nitroimidazole ([18F]FMISO) is a well-known hypoxia marker for identifying hypoxic but viable cells in tumours. The methodological aim of this thesis was to evaluate the status of thin-layer chromatography (TLC), combined with proper radioactivity detection measurement systems, as a radioanalytical method. Three different detection methods of radioactivity were compared: radioactivity scanning, film autoradiography, and digital photostimulated luminescence (PSL) autoradiography. The fluorine-18 labelling synthesis for [18F]β-CFT-FP was developed, and carbon-11 labelled [11C]β-CFT-FP was used to study the specificity of β-CFT-FP for DAT sites in human post-mortem brain slices.
These in vitro studies showed that β-CFT-FP binds to the caudate-putamen, an area rich in DAT. The synthesis of fluorine-18 labelled [18F]FMISO was optimized, and the tracer was prepared using an automated system with good and reproducible yields. In preclinical studies, the effect of the radiation sensitizer estramustine phosphate on radiation treatment and on the uptake of [18F]FMISO was evaluated, with results of great importance for later human studies. The methodological part of this thesis showed that radioTLC is the method of choice when combined with an appropriate radioactivity detection technique. Digital PSL autoradiography proved to be the most appropriate when compared with the radioactivity scanning and film autoradiography methods. The very high sensitivity, good resolution, and wide dynamic range of digital PSL autoradiography are its advantages in the detection of β-emitting radiolabelled substances.

Relevance:

30.00%

Publisher:

Abstract:

During the last 10-15 years, interest in mouse behavioural analysis has evolved considerably. The driving force is the development of molecular biological techniques that allow manipulation of the mouse genome by changing the expression of genes. Therefore, with some limitations, it is possible to study how genes participate in the regulation of physiological functions and to create models explaining the genetic contribution to various pathological conditions. The first aim of our study was to establish a framework for the behavioural phenotyping of genetically modified mice. We established a comprehensive battery of tests for the initial screening of mutant mice, including tests for exploratory and locomotor activity, emotional behaviour, sensory functions, and cognitive performance. Our interest was in the behavioural patterns of the common background strains used for genetic manipulations in mice. Additionally, we studied the behavioural effects of sex, test history, and individual housing. Our findings highlight the importance of careful consideration of the genetic background in the analysis of mutant mice. It was evident that some backgrounds may mask or modify the behavioural phenotype of mutants and thereby lead to false-positive or false-negative findings. Moreover, there is no universal strain that is equally suitable for all tests, and using different backgrounds allows one to address possible phenotype-modifying factors. We discovered that previous experience affected performance in several tasks. The most sensitive traits were exploratory and emotional behaviour, as well as motor and nociceptive functions. Therefore, it may be essential to repeat some of the tests in naïve animals to confirm a phenotype. Long-term social isolation had strong effects on exploratory behaviour, but also on learning and memory.
All experiments revealed significant interactions between strain and environmental factors (test history or housing condition), indicating genotype-dependent effects of environmental manipulations. Analyses of several mutant lines utilised this information. For example, we studied mice overexpressing, as well as mice lacking, the extracellular matrix protein heparin-binding growth-associated molecule (HB-GAM), and mice lacking N-syndecan (a receptor for HB-GAM). All mutant mice appeared to be fertile and healthy, without any apparent neurological or sensory defects. The lack of HB-GAM and N-syndecan, however, significantly reduced the learning capacity of the mice. On the other hand, overexpression of HB-GAM resulted in facilitated learning. Moreover, HB-GAM knockout mice displayed higher anxiety-like behaviour, whereas anxiety was reduced in HB-GAM-overexpressing mice. Changes in hippocampal plasticity accompanied the behavioural phenotypes. We conclude that HB-GAM and N-syndecan are involved in the modulation of synaptic plasticity in the hippocampus and play a role in the regulation of anxiety- and learning-related behaviour.

Relevance:

30.00%

Publisher:

Abstract:

This thesis addresses three subject areas concerning particulate matter in urban air quality: 1) analysis of measured particulate matter mass concentrations in the Helsinki Metropolitan Area (HMA) at different locations in relation to traffic sources, and at different times of year and day; 2) the evolution of the number concentrations and sizes of particulate matter originating from traffic exhaust, studied at the local street scale by combining a dispersion model with an aerosol process model; and 3) analysis of the meteorological origins of selected high particulate matter concentration situations, especially temperature inversions, in the HMA and three other European cities. The prediction of the occurrence of meteorological conditions conducive to elevated particulate matter concentrations in the studied cities is examined, and the performance of current numerical weather forecasting models in air pollution episode situations is considered. The study of the ambient measurements revealed a clear diurnal variation of the PM10 concentrations at the HMA measurement sites, irrespective of the year and the season. The diurnal variation of local vehicular traffic flows showed no substantial correlation with the PM2.5 concentrations, indicating that the PM10 concentrations originated mainly from local vehicular traffic (direct emissions and suspension), while the PM2.5 concentrations were mostly of regional and long-range transported origin. The modelling study of traffic exhaust dispersion and transformation showed that the number concentrations of particles originating from street traffic exhaust undergo a substantial change during the first tens of seconds after emission from the vehicle tailpipe. The dilution process was shown to dominate the evolution of total number concentrations, while condensation and coagulation had only a minimal effect on the Aitken-mode number concentrations.
The air pollution episodes included were chosen on the basis of occurring in either winter or spring and having an at least partly local origin. In the HMA, air pollution episodes were shown to be linked to predominantly stable atmospheric conditions with high atmospheric pressure and low wind speeds, in conjunction with relatively low ambient temperatures. For the other European cities studied, the best meteorological predictors of elevated PM10 concentrations were shown to be the temporal (hourly) evolution of temperature inversions, stable atmospheric stratification, and in some cases wind speed. Concerning weather prediction during particulate-matter-related air pollution episodes, the studied models were found to overpredict pollutant dispersion, leading to underprediction of pollutant concentration levels.

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies quantile residuals and uses different methodologies to develop test statistics applicable to evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account.

In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived from it.
Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite-sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
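The construction of quantile residuals can be sketched numerically. In this illustrative Python snippet (the two-component Gaussian mixture and its weights, means, and standard deviations are invented for the example, not taken from the thesis), each observation is passed through the model's cumulative distribution function and then through the standard normal quantile function; under a correctly specified model the resulting residuals are approximately independent N(0, 1):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical two-component Gaussian mixture (parameters are assumptions)
w = np.array([0.7, 0.3])    # mixture weights
mu = np.array([0.0, 3.0])   # component means
sd = np.array([1.0, 0.5])   # component standard deviations

# Simulate data from the (correctly specified) mixture model
comp = rng.choice(2, size=5000, p=w)
y = rng.normal(mu[comp], sd[comp])

# Mixture CDF at each observation: the probability integral transform
u = (w * norm.cdf((y[:, None] - mu) / sd)).sum(axis=1)

# Quantile residuals: map the PIT values through the normal quantile function
r = norm.ppf(u)

# Under a correct model, r should have mean ~0 and standard deviation ~1
print(round(r.mean(), 2), round(r.std(), 2))
```

For a misspecified model (e.g., fitting a single Gaussian to these data), the same transform would produce residuals that visibly depart from normality, which is what the thesis's tests detect.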

Relevance:

30.00%

Publisher:

Abstract:

The goal of this study was to examine the role of organizational causal attribution in understanding how work stressors (work-role overload, excessive role responsibility, and an unpleasant physical environment) and personal resources (social support and cognitive coping) relate to such organizational-attitudinal outcomes as work engagement, turnover intention, and organizational identification. In some analyses, cognitive coping was also treated as an organizational outcome. Causal attribution was conceptualized in terms of four dimensions: internality-externality (attributing the cause of one’s successes and failures to oneself, as opposed to external factors), stability (thinking that the cause of one’s successes and failures is stable over time), globality (perceiving the cause to be operative in many areas of one’s life), and controllability (believing that one can control the causes of one’s successes and failures). Several hypotheses were derived from Karasek’s (1989) Job Demands–Control (JD-C) model and from the Job Demands–Resources (JD-R) model (Demerouti, Bakker, Nachreiner & Schaufeli, 2001). Based on the JD-C model, a number of moderation effects were predicted, stating that the strength of the association of work stressors with the outcome variables (e.g. turnover intention) varies as a function of the causal attribution; for example, an unpleasant work environment is more strongly associated with turnover intention among those with an external locus of causality than among those with an internal locus of causality. From the JD-R model, a number of hypotheses on the mediation model were derived.
They were based on two processes posited by the model: an energy-draining process in which work stressors, along with a mediating effect of causal attribution for failures, deplete the nurses’ energy, leading to turnover intention; and a motivational process in which personal resources, along with a mediating effect of causal attribution for successes, foster the nurses’ engagement in their work, leading to higher organizational identification and to decreased intention to leave the nursing job. For instance, it was expected that the relationship between work stressors and turnover intention could be explained (mediated) by a tendency to attribute one’s work failures to stable causes. The data were collected from Finnish hospital nurses using electronic questionnaires; overall, 934 nurses responded. Work stressors and personal resources were measured by five scales derived from the Occupational Stress Inventory-Revised (Osipow, 1998). Causal attribution was measured using the Occupational Attributional Style Questionnaire (Furnham, 2004). Work engagement was assessed with the Utrecht Work Engagement Scale (Schaufeli et al., 2002), turnover intention with the Van Veldhoven & Meijman (1994) scale, and organizational identification with the Mael & Ashforth (1992) measure. The results supported the role of causal attribution in the overall work stress process. Findings related to the moderation model can be divided into three main findings. First, external locus of causality, along with job level, moderated the relationship between work overload and cognitive coping; this interaction was evident only among nurses in non-supervisory positions. Second, external locus of causality and job level together moderated the relationship between physical environment and turnover intention.
An opposite pattern was found for this interaction: among nurses, externality exacerbated the effect of perceived unpleasantness of the physical environment on turnover intention, whereas among supervisors internality produced the same effect. Third, job level also revealed a moderation effect of controllability attribution on the relationship between physical environment and cognitive coping. Findings related to the mediation model for the energetic process indicated that the partial model, in which work stressors also have a direct effect on turnover intention, fitted the data better. In the mediation model for the motivational process, an intermediate mediation model, in which the effects of personal resources on turnover intention went through two mediators (the causal dimensions and organizational identification), fitted the data better. All dimensions of causal attribution appeared to follow a somewhat unique pattern of mediation effect for both the energetic and the motivational processes. The overall findings on the mediation models partly supported the two simultaneous underlying processes proposed by the JD-R model: while in the energetic process the dimension of externality only partially mediated the relationship between stressors and turnover, all dimensions of causal attribution showed significant mediator effects in the motivational process. The general findings supported both the moderation effect and the mediation effect of causal attribution in the work stress process. The study contributes to several research traditions, including the interaction approach and the JD-C and JD-R models. However, many potential functions of organizational causal attribution are yet to be evaluated by relevant academic and organizational research.
Keywords: organizational causal attribution, optimistic / pessimistic attributional style, work stressors, organisational stress process, stressors in nursing profession, hospital nursing, JD-R model, personal resources, turnover intention, work engagement, organizational identification.
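A moderation effect of the kind described above is conventionally estimated as a product term in a regression model. The sketch below uses simulated data (all variable names, coefficients, and the sample size are invented for illustration; the thesis's actual analyses used the nurse survey data): a hypothetical externality score amplifies the effect of a stressor on turnover intention, and the coefficient of the interaction term recovers that moderation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Simulated standardised predictor scores (purely illustrative)
stressor = rng.normal(size=n)      # e.g. unpleasantness of the physical environment
externality = rng.normal(size=n)   # e.g. external locus-of-causality score

# Outcome with a true interaction: externality amplifies the stressor effect
turnover = (0.3 * stressor + 0.2 * externality
            + 0.4 * stressor * externality + rng.normal(size=n))

# Moderation is tested via the product term's coefficient in an OLS fit
X = np.column_stack([np.ones(n), stressor, externality, stressor * externality])
beta, *_ = np.linalg.lstsq(X, turnover, rcond=None)
print(np.round(beta, 2))  # the last entry estimates the moderation (interaction) effect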

Relevance:

30.00%

Publisher:

Abstract:

Hybrid innovations, or new products that combine two existing product categories into one, are increasingly popular in today’s marketplace. Despite this proliferation, few studies address them. The purpose of this thesis is to examine consumer evaluation of hybrid innovations by focusing on consumer categorization of such innovations and on the factors contributing positively and negatively to their evaluation. This issue is examined by means of three studies. The first study addresses the proportion of consumers who categorize hybrid products as single- versus dual-purpose, what contributes to such a categorization, what differences can be found between the two groups, and whether categorization can and should be included in models of innovation adoption. The second study expands the scope by including motivation as a predictor of consumer evaluation and examines two cognitive and affective factors and their differential impact on innovation evaluation. Finally, the third study examines the product comparisons that single- versus dual-purpose categorization induces. Together, these three essays build a broader understanding of hybrid innovation evaluation. The thesis uses theories from both psychology and marketing to examine the issues at hand. Conceptual combination and analogical learning theories from psychology are used to understand categorization and knowledge transfer. From marketing, consumer behavior and innovation adoption studies are drawn on to better understand the link between categorization and product evaluation and the factors contributing to product evaluation.
The main results of the current thesis are that (1) most consumers categorize hybrid products as single- and not as dual-purpose products, (2) consumers who categorize them as dual-purpose find them more attractive, (3) motivation has a significant effect on consumer evaluation of innovations: cognitive factors promote an emphasis on product net benefits, whereas affective factors induce consumers to consider product meaning in the form of categorization and perceived product complexity, (4) categorization constrains subsequent product evaluation, and (5) categorization can and should be included in models of innovation adoption. Maria Sääksjärvi is associated with CERS, the Center for Relationship Marketing and Service Management at the Swedish School of Economics and Business Administration.

Relevance:

30.00%

Publisher:

Abstract:

XVIII IUFRO World Congress, Ljubljana 1986.

Relevance:

30.00%

Publisher:

Abstract:

In meteorology, observations and forecasts of a wide range of phenomena (for example snow, clouds, hail, fog, and tornadoes) can be categorical, that is, they can only take discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, and of snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that the true snow extent was not known, so we were forced simply to measure the agreement between the different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between the products. The trustworthiness of the results for the cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)], compared with ceilometers of the Helsinki Testbed, was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, the accuracy and timeliness of the particular data and methods.
In this vein, we tentatively discuss how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. The results show that such data are of reasonable quality, and their use for case studies can be warmly recommended. Finally, the use of cluster analysis on meteorological in situ measurements was explored: the Autoclass algorithm was used to construct compact representations of the synoptic conditions of fog at Finnish airports.
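The bootstrap construction of a confidence interval for the agreement between two categorical products can be illustrated with a percentile bootstrap on simulated paired data (the data and the agreement level are invented for the example; the thesis additionally used block resampling to respect spatial and temporal correlation, which this i.i.d. sketch omits):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired cloud-mask decisions (1 = cloud) from a satellite product
# and a ceilometer reference; simulated so they agree in roughly 85% of cases
satellite = rng.integers(0, 2, size=1000)
reference = np.where(rng.random(1000) < 0.85, satellite, 1 - satellite)

def agreement(a, b):
    """Fraction of cases where the two categorical products agree."""
    return np.mean(a == b)

# Percentile bootstrap: resample paired cases with replacement and
# take empirical quantiles of the resampled agreement statistic
stats = []
for _ in range(2000):
    idx = rng.integers(0, len(satellite), len(satellite))
    stats.append(agreement(satellite[idx], reference[idx]))
lo, hi = np.percentile(stats, [2.5, 97.5])

print(f"agreement = {agreement(satellite, reference):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

With correlated fields, the resampling unit would be a block of adjacent cases (in time or space) rather than a single case, so that the dependence structure is preserved in each resample.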

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present simple methods for the construction and evaluation of finite-state spell-checking tools using an existing finite-state lexical automaton, freely available finite-state tools, and Internet corpora acquired from projects such as Wikipedia. As an example, we use a freely available open-source implementation of Finnish morphology, made with traditional finite-state morphology tools, and demonstrate the rapid building of Northern Sámi and English spell checkers from tools and resources available on the Internet.
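The core idea of such a spell checker (lexicon lookup plus candidate generation) can be sketched without the finite-state machinery. The toy Python snippet below uses a plain set as a stand-in for the lexical automaton and a small invented word list; a real checker built as in the paper would instead compose a finite-state error model with the lexicon transducer:

```python
# Minimal sketch: accept a word if the lexicon (a plain set standing in for a
# finite-state automaton) recognises it; otherwise suggest in-lexicon words
# within edit distance 1. The word list is a toy assumption, not a real lexicon.
LEXICON = {"snow", "cloud", "fog", "hail"}

def edits1(word, alphabet="abcdefghijklmnopqrstuvwxyz"):
    """All strings at edit distance 1 from `word` over the given alphabet."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = {a + b[1:] for a, b in splits if b}
    replaces = {a + c + b[1:] for a, b in splits if b for c in alphabet}
    inserts = {a + c + b for a, b in splits for c in alphabet}
    transposes = {a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1}
    return deletes | replaces | inserts | transposes

def check(word):
    if word in LEXICON:
        return []                          # accepted: no corrections needed
    return sorted(edits1(word) & LEXICON)  # correction candidates

print(check("snow"))   # → []
print(check("sno"))    # → ['snow']
```

In the finite-state setting, the equivalent of `edits1` is an error-model transducer; composing it with the lexicon automaton yields all in-lexicon corrections in a single operation, which is what makes the approach fast for morphologically rich languages such as Finnish and Northern Sámi.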