38 results for Evaluation Studies as Topic


Relevance: 30.00%

Abstract:

The purpose of the METKU Project (Development of Maritime Safety Culture) is to study how the ISM Code has influenced the safety culture in the maritime industry. This literature review was written as part of Work Package 2, which is conducted by the University of Turku, Centre for Maritime Studies. Maritime traffic is growing rapidly in the Baltic Sea, which leads to a growing risk of maritime accidents; particularly in the Gulf of Finland, the high volume of traffic entails a high accident risk. These growing risks justify a research project on maritime safety and the effectiveness of safety measures such as safety management systems. In order to reduce maritime safety risks, safety management systems should be developed further, and the METKU Project was launched to examine what improvements can be made to them. Human error is considered the most important cause of maritime accidents. The International Safety Management Code (the ISM Code) was established to reduce the occurrence of human error by creating a safety-oriented organizational culture in the maritime industry. The ISM Code requires that a company should <i>provide safe practices in ship operation and a safe working environment and establish safeguards against all identified risk.</i> The fundamental idea of the ISM Code is that companies should continuously improve safety, and the commitment of top management is essential for implementing a safety-oriented culture in a company. The ISM Code has contributed significantly to the progress of maritime safety in recent years: shipping companies and ships' crews are more environmentally conscious and more safety-oriented than they were 12 years ago, as shown by several studies analysed for this literature review. Nevertheless, the direct effect of the ISM Code on maritime safety could not be well isolated, and no quantitative measurements (statistics/hard data) could be found to demonstrate its impact. This study found that a safety culture has emerged and is developing in the maritime industry. Even though the roots of this safety culture are established, there are still serious barriers to a breakthrough in safety management; these barriers can be seen as cultural factors impeding the safety process. Although the ISM Code has been in force for over a decade, long-established behaviour rooted in the old maritime culture still occurs. In the next phase of the research project, these cultural factors will be analysed in relation to the present safety culture of the maritime industry in Finland.

Relevance: 30.00%

Abstract:

Neste Oil has introduced plant oils and animal fats as raw materials for the production of NExBTL renewable diesel, and these raw materials differ from conventional mineral-based oils. One subject of study for the new raw materials is the thermal degradation, also known as pyrolysis, of these organic oils and fats. The aim of this master's thesis is to increase knowledge of the thermal degradation of these new raw materials and to identify possible harmful gaseous degradation compounds. Another aim is to determine the health and environmental hazards of the identified compounds, and a further objective is to examine whether hazardous compounds can form in the production of NExBTL diesel. Plant oils and animal fats consist mostly of triglycerides. Pyrolysis of triglycerides is a complex phenomenon, and many degradation products can be formed. Based on the literature, 13 hazardous degradation products were identified, one of which was acrolein, a compound that is very toxic and dangerous to the environment. Pyrolysis experiments were carried out with rapeseed and palm oils, and with a mixture of palm oil and animal fat. At least 12 hazardous compounds, including acrolein, were identified in the gas phase. According to the experiments, the factors influencing acrolein formation are the duration of the experiment, the atmosphere (air/hydrogen) under which the experiment is carried out, and the characteristics of the oil used. The production of NExBTL diesel is not based on pyrolysis; thermal degradation is therefore possible only under abnormal process conditions.

Relevance: 30.00%

Abstract:

Validation and verification operations face various challenges in the product development process, and requirements for a faster development cycle place new demands on component development. Verification and validation usually represent the largest activities, consuming up to 40–50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to evaluate and improve validation and verification capability in display module development projects. The definition and background of validation and verification are reviewed, and theories of project management, systems, organisational learning and causality are also studied. The framework and the key findings of the research are presented, and a feedback system based on the framework is defined and implemented in the case company. The research is divided into a theory part and an empirical part: the theory part is conducted as a literature review, and the empirical part as a case study using the constructive and design research methods. A framework for capability evaluation and development was defined and developed as the result of this research. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution. As an additional result, some minor changes to the validation and verification process were proposed. A few concerns are expressed regarding the validity and reliability of the study, the most important one being the selected research method and the selected model itself: the final state can be normative, as the researcher may set the study results before the actual study and, in the initial state, describe expectations for it. Finally, the reliability and validity of this work are examined.

Relevance: 30.00%

Abstract:

The role of dopamine and serotonin in spinal pain regulation is well established. However, little is known about the role of brain dopamine and serotonin in the perception of pain in humans. The aim of this study was to assess the potential role of brain dopamine and serotonin in determining experimental pain sensitivity in humans using positron emission tomography (PET) and psychophysical methods. A total of 39 healthy subjects participated in the study, and PET imaging was performed to assess brain dopamine D<sub>2</sub>/D<sub>3</sub> and serotonin 5-HT<sub>1A</sub> receptor availability. In a separate session, sensitivity to pain and touch was assessed with traditional psychophysical methods, allowing the evaluation of potential associations between D<sub>2</sub>/D<sub>3</sub> and 5-HT<sub>1A</sub> binding and the psychophysical responses. The subjects' responses were also analyzed according to Signal Detection Theory, which enables separate assessment of a subject's discriminative capacity (sensory factor) and response criterion (non-sensory factor). The study found that D<sub>2</sub>/D<sub>3</sub> receptor binding in the right putamen was inversely correlated with the pain threshold and the response criterion. 5-HT<sub>1A</sub> binding in the cingulate cortex, inferior temporal gyrus and medial prefrontal cortex was inversely correlated with the discriminative capacity for touch. Additionally, the response criterion for pain and the intensity rating of suprathreshold pain were inversely correlated with 5-HT<sub>1A</sub> binding in multiple brain areas. The results suggest that brain D<sub>2</sub>/D<sub>3</sub> and 5-HT<sub>1A</sub> receptors modulate sensitivity to pain and that these pain-modulatory effects may, at least partly, be attributed to influences on the response criterion. 5-HT<sub>1A</sub> receptors are also involved in the regulation of touch through an effect on discriminative capacity.
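For orientation, the two Signal Detection Theory quantities referred to above are conventionally estimated from hit and false-alarm rates as follows (a standard textbook formulation under the equal-variance Gaussian model, not taken from the thesis itself):

```latex
% H = hit rate, F = false-alarm rate, z = inverse of the standard normal CDF.
\begin{align*}
  d' &= z(H) - z(F) && \text{discriminative capacity (sensory factor)} \\
  c  &= -\tfrac{1}{2}\bigl(z(H) + z(F)\bigr) && \text{response criterion (non-sensory factor)}
\end{align*}
```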

Relevance: 30.00%

Abstract:

Atherosclerosis is a vascular inflammatory disease causing coronary artery disease, myocardial infarction and stroke, the leading causes of death in Finland and in many other countries. The development of atherosclerotic plaques begins in childhood and is an ongoing process throughout life. Rupture of a plaque and the subsequent occlusion of the vessel is the main cause of myocardial infarction and stroke, but despite extensive research, predicting rupture remains a major clinical problem. Inflammation is considered a key factor in the vulnerability of plaques to rupture, so measuring the inflammation in plaques non-invasively is one potential approach to identifying vulnerable plaques. The aim of this study was to evaluate tracers for positron emission tomography (PET) imaging of vascular inflammation. The studies were performed in a mouse model of atherosclerosis using <i>ex vivo</i> biodistribution, autoradiography, and <i>in vivo</i> PET and computed tomography (CT). Several tracers of inflammation activity were tested and compared with the morphology of the plaques; inflammation in the atherosclerotic plaques was evaluated as the presence of active macrophages. Systematic analysis revealed that the uptake of 18F-FDG and 11C-choline, tracers of metabolic activity in inflammatory cells, was more prominent in the atherosclerotic plaques than in the surrounding healthy vessel wall. The tracer for αvβ3 integrin, 18F-galacto-RGD, was also found to have high potential for imaging inflammation in the plaques. While 11C-PK11195, a tracer targeted to receptors in active macrophages, was shown to accumulate in active plaques, its target-to-background ratio was not ideal for in vivo imaging purposes. In conclusion, tracers for the imaging of inflammation in atherosclerotic plaques can be tested in experimental pre-clinical settings to select potential imaging agents for further clinical testing. 18F-FDG, 18F-galacto-RGD and 11C-choline have good properties, and further studies to clarify their applicability to atherosclerosis imaging in humans are warranted.

Relevance: 30.00%

Abstract:

Print quality and the printability of paper are very important attributes for modern printing applications. In prints containing images, high print quality is a basic requirement. Tone unevenness and non-uniform glossiness of printed products are the most disturbing factors influencing overall print quality. These defects are caused by non-ideal interactions of paper, ink and printing devices in high-speed printing processes. Since print quality is a perceptual characteristic, measuring unevenness in accordance with human vision is a significant problem. In this thesis, the mottling phenomenon is studied. Mottling is a printing defect characterized by a spotty, non-uniform appearance in solid printed areas. Print mottle is usually the result of uneven ink laydown or non-uniform ink absorption across the paper surface, and is especially visible in midtone imagery or areas of uniform color, such as solids and continuous-tone screen builds. Using existing knowledge of visual perception and known methods of quantifying print tone variation, a new method for evaluating print unevenness is introduced. The method is compared with previous results in the field and is supported by psychometric experiments. Pilot studies were made to estimate the effect of the optical characteristics of the paper before printing on the unevenness of the printed area after printing. Instrumental methods for print unevenness evaluation were compared, and the results indicate that the proposed method corresponds better to visual evaluation. The method has been successfully implemented as an industrial application and has proved to be a reliable substitute for visual expertise.
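As a rough illustration of how print tone variation can be quantified instrumentally, the sketch below computes a simple mottle index as the coefficient of variation of tile-mean reflectance over a nominally uniform printed patch (in the spirit of ISO 13660-style mottle measures; the tile size and the index itself are illustrative assumptions, not the method proposed in the thesis):

```python
import numpy as np

def mottle_index(gray_image: np.ndarray, tile: int = 32) -> float:
    """Coefficient of variation of tile-mean reflectance over a solid print area.

    gray_image: 2-D array of gray/reflectance values for a nominally uniform patch.
    tile: tile edge length in pixels (illustrative choice).
    """
    h, w = gray_image.shape
    # Crop so the patch divides evenly into tiles.
    h, w = h - h % tile, w - w % tile
    patch = gray_image[:h, :w]
    # Mean reflectance of each tile: large-scale tone variation reads as mottle.
    tiles = patch.reshape(h // tile, tile, w // tile, tile).mean(axis=(1, 3))
    return float(tiles.std() / tiles.mean())

# Example: a synthetic near-uniform patch with small random disturbance.
rng = np.random.default_rng(0)
flat = np.full((256, 256), 0.8) + rng.normal(0, 0.01, (256, 256))
print(f"mottle index: {mottle_index(flat):.4f}")
```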

Relevance: 30.00%

Abstract:

This e-publication contains papers presented at the conference Assessing Language and (Inter)cultural Competences in Higher Education, which took place at the University of Turku (Finland) on 30.8.–1.9.2007. The online proceedings may be downloaded and used provided the source is acknowledged.

Relevance: 30.00%

Abstract:

<b>Studies on <sup>68</sup>Ga-Based Agents for PET Imaging of Cancer and Inflammation</b> Positron emission tomography (PET) is based on the use of radiolabeled agents and facilitates in vivo imaging of biological processes such as cancer. Because the detection of cancer is demanding and is often obscured by inflammation, there is a need for better PET imaging agents. The aim was a preliminary evaluation of new PET agents for imaging cancer and inflammation using experimental models. <sup>68</sup>Ga-chloride and peptides targeting matrix metalloproteinase-9 (MMP-9), labeled with <sup>68</sup>Ga through 1,4,7,10-tetraazacyclododecane-1,4,7,10-tetraacetic acid (DOTA), were tested for tumor imaging. In addition, a <sup>68</sup>Ga-DOTA-conjugated peptide targeting vascular adhesion protein-1 (VAP-1) was tested for inflammation imaging. The <sup>68</sup>Ga-based imaging agents described here showed potential: they passed the essential <i>in vitro</i> tests, proceeded to preclinical <i>in vivo</i> evaluation and were able to visualize the target. Their target uptake and target-to-background ratios were, however, not optimal. <sup>68</sup>Ga-chloride showed slow clearance caused by its binding to blood transferrin, and in the case of the <sup>68</sup>Ga-DOTA-peptides, low <i>in vivo</i> stability and/or low lipophilicity led to overly rapid blood clearance and urinary excretion. The properties of <sup>68</sup>Ga-labeled peptides are nevertheless modifiable, as shown with the MMP-9-targeting ligands. In conclusion, <sup>68</sup>Ga-based agents for PET imaging of cancer and inflammation could be applied in drug development, earlier diagnostics and the follow-up of therapeutic efficacy.

Relevance: 30.00%

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then extends the program incrementally, in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs; it is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for the total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
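To give a flavor of the invariant-first workflow described above, the sketch below expresses a loop built around an explicitly stated invariant, rendered as plain Python assertions. This is only an illustrative analogue, not the Socos/PVS diagram notation or an example from the thesis:

```python
def sum_of(xs: list[int]) -> int:
    """Sum a list, developed invariant-first: s == sum(xs[:i]) holds at the loop head."""
    s, i = 0, 0
    # Invariant established initially: s == sum(xs[:0]) == 0.
    while i < len(xs):
        assert s == sum(xs[:i]), "invariant broken before the step"
        s += xs[i]
        i += 1
        # Each addition must re-establish the invariant for the new i;
        # the variant len(xs) - i decreases, giving termination (total correctness).
        assert s == sum(xs[:i]), "invariant broken after the step"
    # Invariant plus the exit condition (i == len(xs)) yield the postcondition.
    assert s == sum(xs)
    return s

print(sum_of([3, 1, 4, 1, 5]))  # prints 14
```

In the tool-supported setting, each such invariant check becomes a verification condition discharged by the theorem prover rather than a runtime assertion.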

Relevance: 30.00%

Abstract:

Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject to their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming in CS departments can also be traced back to gaps in students' prior education. In Finland the high school curriculum does not include CS as a subject; instead, the focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize the application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues. Structured derivations is a logic-based approach to teaching mathematics in which formalisms and justifications are made explicit (illustrated below); the aim is to help students become better at communicating their reasoning using mathematical language and logical notation while becoming more confident with formalisms. The Python programming language was originally designed with education in mind and has a simple syntax compared to many other popular languages; the aim of using it in instruction is to address algorithms and their implementation in a way that allows the focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction; the approach is based on elementary propositional and predicate logic, makes explicit the underlying mathematical foundations of programming, and aims to show how mathematics in general, and logic in particular, can be used to create better programs.
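As an illustration of the structured-derivations style, in which every step carries an explicit written justification, a small simplification might be laid out roughly as follows (an illustrative sketch of the format, not an example taken from the thesis):

```latex
% Each step states the rule that justifies it inside curly braces.
\begin{align*}
   & (x+1)^2 - (x-1)^2 \\
 = \;& \{\ \text{expand both squares}\ \} \\
   & (x^2 + 2x + 1) - (x^2 - 2x + 1) \\
 = \;& \{\ \text{cancel } x^2 \text{ and } 1,\ \text{collect } x\ \} \\
   & 4x
\end{align*}
```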

Relevance: 30.00%

Abstract:

Previous studies on pencil grip have typically dealt with the developmental aspects in young children, while handwriting research is mainly concerned with speed and legibility. Studies linking these areas are few. Evaluation of the existing pencil grip studies is hampered by methodological inconsistencies: the operational definitions of pencil grip are rational but tend to be oversimplified, while detailed descriptors tend to be impractical due to their multiplicity. The present study introduces a descriptive two-dimensional model for the categorisation of pencil grip suitable for research applications in a classroom setting. The model is used in four empirical studies of children during the first six years of writing instruction. Study 1 describes the pencil grips observed in a large group of pupils in Finland (n = 504). The results indicate that in Finland the majority of grips resemble the traditional dynamic tripod grip. Significant gender-related differences in pencil grip were observed. Study 2 is a longitudinal exploration of grip stability vs. change (n = 117). Both expected and unexpected changes were observed in about 25 per cent of the children's grips over four years. A new finding emerged using the present model for categorisation: whereas pencil grips would change, either in terms of ease of grip manipulation or grip configuration, no instances were found where a grip had changed concurrently on both dimensions. Study 3 is a cross-cultural comparison of grips observed in Finland and the USA (n = 793). The distribution of the pencil grips observed in the American pupils was significantly different from that found in Finland; the cross-cultural disparity is most likely related to the differences in the onset of writing instruction. The differences between the boys' and girls' grips in the American group were non-significant. An implication of Studies 2 and 3 is that the initial pencil grip is of foremost importance, since pencil grips are largely stable over time. Study 4 connects the pencil grips to an assessment of the mechanics of writing (n = 61). It appears that certain previously unrecommended pencil grips might nevertheless be included among those accepted, since they did not hamper either fluency or legibility.

Relevance: 30.00%

Abstract:

The potential for enhancing the energy efficiency of industrial pumping processes is estimated to be, in some cases, up to 50%. One way to define this potential further is to implement techniques in accordance with the definition of best available techniques in pumping applications. These techniques fall into three main categories: design, control method and maintenance, and distribution system. The theory part of this thesis first addresses the definition of best available techniques (BAT) and its applicability to pumping processes. Next, the theory of pumping with different pump types is covered, with the main emphasis on centrifugal pumps. The other components needed in a pumping process are then dealt with by presenting the different control methods and the use of an electric motor, a variable speed drive and the distribution system. The last part of the theory concerns industrial pumping processes in water distribution, sewage water and power plant applications, some of which are used in the empirical part as example cases. For the empirical part of this study, four case studies of typical pumping processes were selected from older Master's theses. The original results were analyzed by studying the distribution of energy consumption between the different system components, and possible ways to improve energy efficiency were evaluated using the definition of BAT in pumping. The goal of this study was that the results would make it possible to identify the characteristic energy consumption of these and similar pumping processes. With this data it would then be easier to focus energy efficiency actions where they are likely to be most applicable, both technically and economically.
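Much of the savings potential of variable speed control mentioned above follows from the centrifugal pump affinity laws: along the pump's own curve, flow scales with speed, head with speed squared and shaft power with speed cubed (strictly valid in friction-dominated systems without large static head). A minimal sketch with illustrative numbers, not taken from the thesis:

```python
def affinity_scaled(flow_m3h: float, head_m: float, power_kw: float,
                    speed_ratio: float) -> tuple[float, float, float]:
    """Scale a pump duty point by the affinity laws: Q ~ n, H ~ n^2, P ~ n^3."""
    return (flow_m3h * speed_ratio,
            head_m * speed_ratio ** 2,
            power_kw * speed_ratio ** 3)

# Illustrative duty point (assumed): 100 m3/h, 40 m head, 15 kW shaft power.
q, h, p = affinity_scaled(100.0, 40.0, 15.0, speed_ratio=0.8)
print(f"flow {q:.0f} m3/h, head {h:.1f} m, shaft power {p:.2f} kW")
# At 80 % speed the shaft power drops to 0.8**3 = 51.2 % of the full-speed value,
# which shows where savings of the order of tens of per cent can come from.
```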

Relevance: 30.00%

Abstract:

The purpose of this two-phase study was to define the concept of vaccination competence and to assess the vaccination competence of graduating public health nurse students (PHN students) and public health nurses (PHNs) in Finland, with the goal of promoting and maintaining vaccination competence and developing vaccination education. The first phase of the study consisted of semi-structured interviews with vaccination professionals, graduating PHN students and clients (a total of n=40), who were asked to describe vaccination competence and the factors strengthening and weakening it. The data were analyzed through content analysis. In the second phase, structured instruments were developed, and the vaccination competence of PHN students (n=129) and PHNs (n=405) in Finland was assessed using a self-assessment scale (VAS) and a knowledge test. PHNs were used as a reference group, enabling us to determine whether a satisfactory level of vaccination competence was achieved by the end of studies or whether it was gained through work experience in vaccinating clients. The data were collected from five polytechnic institutions and seven health centers located in various parts of the country, using instruments developed for this study, and were analyzed statistically. In the first phase, based on the results of the interviews, vaccination competence was defined as a large, multi-faceted entity comprising the concepts of the competent vaccinator, competent implementation of vaccination, and the outcome of the implementation. The interviews revealed that the factors strengthening and weakening vaccination competence were connected to the vaccinator, the client being vaccinated, the vaccination environment and vaccinator education; on the whole, the strengthening and weakening factors were the opposites of each other. In the second phase, students rated themselves significantly lower than working professionals on the self-assessment of vaccination competence, and the percentage of correct answers on the knowledge test was lower for students than for PHNs. When all background variables were taken into account in a multivariate analysis, the difference between students and PHNs on the self-assessment was no longer significant; on the knowledge test, however, the PHNs still performed better than the students. For this study, a satisfactory level of vaccination competence was defined as a mean of 8.0 on the self-assessment and 80% correct answers on the knowledge test. By these criteria, students almost reached a satisfactory level in their overall self-assessment, whereas PHNs did reach it; both groups rated themselves as satisfactory in some sum variables. On the knowledge test, the students did not achieve a satisfactory total score, though the PHNs did; again, both groups reached a satisfactory level in several sum variables. Further research and development should focus on vaccination education, the testing of vaccination competence and vaccination practices in clinical practice, as well as on developing the measurement tools.

Relevance: 30.00%

Abstract:

Dysfunction of the dopaminergic system in the brain is involved in several pathological conditions, such as Parkinson's disease and depression. 2β-Carbomethoxy-3β-(4-[18F]fluorophenyl)tropane ([18F]CFT) and 6-[18F]fluoro-L-dopa ([18F]FDOPA) are tracers for imaging dopaminergic function with positron emission tomography (PET). Peripheral uptake of [18F]FDOPA is also used in the localization and diagnosis of neuroendocrine tumors. [18F]FDOPA and [18F]CFT can be synthesized by electrophilic fluorodestannylation. However, the specific radioactivity (SA) achieved in electrophilic fluorination is low with traditional synthetic methods. In this study, [18F]FDOPA and [18F]CFT were synthesized using post-target-produced [18F]F2 as the electrophilic fluorination agent; with this method, the tracers are produced with an SA sufficient for neuroreceptor studies. The specific aims of this study were to replace Freon-11 in the production of [18F]FDOPA, owing to the ozone-depleting properties of this solvent; to determine the pharmacological specificity and selectivity of [18F]CFT with respect to the monoamine transporters; and to compare the ability of these tracers to reflect the degree of nigral neuronal loss in rats in which the brain dopaminergic system had been unilaterally destroyed by 6-OHDA. Post-target-produced [18F]F2 was successfully used in the production of [18F]FDOPA and [18F]CFT, and the SA achieved was substantially higher than with previous synthetic methods. The deuterated solvents CD2Cl2, CDCl3 and C3D6O were found suitable for replacing Freon-11. Both [18F]FDOPA and [18F]CFT demonstrated nigrostriatal dopaminergic hypofunction and correlated with the number of nigral dopaminergic neurons in 6-OHDA-lesioned rats. However, the dopamine transporter (DAT) tracer [18F]CFT was more sensitive than the dopamine synthesis tracer [18F]FDOPA in detecting these defects, because of the higher non-specific uptake of [18F]FDOPA. [18F]CFT can also be used for imaging the norepinephrine transporter (NET) because of its specific uptake in the locus coeruleus. The observation that [18F]CFT exhibits specific uptake in the pancreas warrants further studies in humans with respect to its potential utility in pancreatic imaging.

Relevance: 30.00%

Abstract:

Mitochondria are present in all eukaryotic cells. They enable these cells to utilize oxygen in the production of adenosine triphosphate in the oxidative phosphorylation system, the mitochondrial respiratory chain. The term mitochondrial disease conventionally refers to disorders of the respiratory chain that lead to a defect in oxidative phosphorylation. Mitochondrial disease in humans can present at any age and in practically any organ system, and can be inherited in a maternal, autosomal dominant, autosomal recessive or X-chromosomal fashion. One of the most common molecular etiologies of mitochondrial disease in the population is the m.3243A>G mutation in the MT-TL1 gene, encoding mitochondrial tRNA<sup>Leu(UUR)</sup>. Clinical evaluation of patients with m.3243A>G has revealed various typical clinical features, such as stroke-like episodes, diabetes mellitus and sensorineural hearing loss. The prevalence and clinical characteristics of mitochondrial disease in the population are not well known. This thesis consists of a series of studies in which the prevalence and characteristics of mitochondrial disease in the adult population of Southwestern Finland were assessed. Mitochondrial haplogroup Uk was associated with an increased risk of occipital ischemic stroke among young women. Large-scale mitochondrial DNA deletions and mutations of the POLG1 gene were the most common molecular etiologies of progressive external ophthalmoplegia. Around 1% of diabetes mellitus emerging between the ages of 18 and 45 years was associated with the m.3243A>G mutation. Moreover, among these young diabetic patients, mitochondrial haplogroup U was associated with a maternal family history of diabetes. These studies demonstrate the usefulness of carefully planned molecular epidemiological investigations in the study of mitochondrial disorders.