949 results for High Reliability


Relevance:

60.00%

Publisher:

Abstract:

With this work, we aim to describe the construction of the Bateria de Avaliação da Dislexia de Desenvolvimento (BADD, Developmental Dyslexia Assessment Battery), characterize it metrically, and present and discuss the results. This dyslexia assessment instrument was administered to 555 Portuguese children aged 7 to 12 years. We analyzed the cognitive processes involved in learning to read and write and those impaired in children with developmental dyslexia, namely phonological awareness, phonological working memory, reading and reading speed, writing under dictation, mathematical calculation, sentence comprehension, short- and long-term memory, and sequencing. Total correct scores per test were compared between typically reading children and dyslexic children, in order to determine on which tests the two groups differ and, on that basis, to assemble a set of tests that permits an assessment of developmental dyslexia. Analysis of the instrument's internal consistency showed that this battery has high consistency, which increases after exclusion of the Reading Speed Test (time) item; that item will therefore be treated as an isolated measure and used apart from the battery. Another objective of this study was to reinforce the original hypothesis that the performance of dyslexic children on these tests would be clearly inferior to that of the control group, thereby differentiating the two groups. We can thus conclude that validating a battery of this kind reinforces the importance of psychometric tests as one element of a psychological assessment, making it fundamental for a timely evaluation consistent with the theoretical framework of developmental dyslexia.
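
The internal-consistency result above (alpha rising after the reading-speed time item is excluded) can be illustrated with a short sketch. The code below is a hypothetical illustration of Cronbach's alpha and the alpha-if-item-deleted check, using made-up scores, not the authors' analysis or the BADD data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def alpha_if_deleted(scores):
    """Alpha recomputed with each item left out, as used to spot items
    (like a timing score) whose removal raises consistency."""
    k = scores.shape[1]
    return [cronbach_alpha(np.delete(scores, j, axis=1)) for j in range(k)]

# Hypothetical data: 10 children x 4 subtests (not the BADD data).
rng = np.random.default_rng(0)
base = rng.normal(size=(10, 1))
scores = np.hstack([base + rng.normal(scale=0.5, size=(10, 1)) for _ in range(3)]
                   + [rng.normal(size=(10, 1))])  # last column: weakly related item
print(cronbach_alpha(scores), alpha_if_deleted(scores))
```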

Relevance:

60.00%

Publisher:

Abstract:

This work concerns the combination of functional ferroelectric oxides with multiwall carbon nanotubes for microelectronic applications, for example potential three-dimensional (3D) Non-Volatile Ferroelectric Random Access Memories (NVFeRAM). Miniaturized electronics are now ubiquitous. The drive to downsize electronics has been spurred by the need for more performance in smaller packages at lower cost. But the trend of electronics miniaturization challenges board-assembly materials, processes, and reliability. Semiconductor device and integrated-circuit technology, coupled with its associated electronic packaging, forms the backbone of high-performance miniaturized electronic systems. However, as size decreases and functionality increases in modern electronics, further size reduction is becoming difficult; below a certain size limit, signal reliability and device performance deteriorate. Hence the miniaturization of silicon-based electronics has limitations. Against this background, the International Technology Roadmap for Semiconductors (ITRS) has since 2011 suggested alternative technologies, designated "More than Moore", one of them based on carbon (carbon nanotubes (CNTs) and graphene) [1]. CNTs, with their unique performance and three-dimensionality at the nanoscale, have been regarded as promising elements for miniaturized electronics [2]. CNTs are tubular in geometry and possess a unique set of properties, including ballistic electron transport and a huge current-carrying capacity, which make them of great interest for future microelectronics [2]. Indeed, CNTs might have a key role in the miniaturization of NVFeRAM. Moving from a traditional two-dimensional (2D) design (as is the case of thin films) to a 3D structure (based on a three-dimensional arrangement of one-dimensional structures) will result in high reliability and good signal sensing, owing to the large contribution from the bottom electrode. One way to achieve this 3D design is by using CNTs. Ferroelectrics (FE) are spontaneously polarized and can have high dielectric constants and interesting pyroelectric, piezoelectric, and electro-optic properties, electronic memories being a key application of FE. However, combining CNTs with FE functional oxides is challenging. It starts with materials compatibility, since the crystallization temperature of the FE and the oxidation temperature of the CNTs may overlap. In this case, low-temperature processing of the FE is fundamental. Within this context, a systematic study on the fabrication of CNT-FE structures using low-cost, low-temperature methods was carried out in this work. The FE under study comprise lead zirconate titanate (Pb(ZrxTi1-x)O3, PZT), barium titanate (BaTiO3, BT), and bismuth ferrite (BiFeO3, BFO). The various aspects related to fabrication, such as the effect on the thermal stability of the MWCNTs, FE phase formation in the presence of MWCNTs, and the CNT/FE interfaces, are addressed in this work. The ferroelectric response, measured locally by Piezoresponse Force Microscopy (PFM), clearly evidenced that even at low processing temperatures the FE on CNTs retains its ferroelectric nature. The work started by verifying the thermal decomposition behavior, under different conditions, of the multiwall CNTs (MWCNTs) used in this work.
It was verified that purified MWCNTs are stable up to 420 ºC in air, as no weight loss occurs under non-isothermal conditions, although morphology changes were observed under isothermal conditions at 400 ºC by Raman spectroscopy and Transmission Electron Microscopy (TEM). In an oxygen-rich atmosphere, MWCNTs started to oxidize at 200 ºC; in an argon-rich one, and under a high heating rate, MWCNTs remained stable up to 1300 ºC with minimal sublimation. The activation energy for the decomposition of MWCNTs in air was calculated to lie between 80 and 108 kJ/mol. These results are relevant for the fabrication of MWCNT-FE structures. Indeed, we demonstrate that PZT can be deposited by sol-gel at low temperatures on MWCNTs. Particularly interestingly, we show that MWCNTs decrease the temperature and time for PZT formation by ~100 ºC, commensurate with a decrease in activation energy from 68 ± 15 kJ/mol to 27 ± 2 kJ/mol. As a consequence, monophasic PZT was obtained at 575 ºC for MWCNT-PZT, whereas for pure PZT, where the PZT phase forms by homogeneous nucleation, traces of pyrochlore were still present at 650 ºC. The piezoelectric nature of MWCNT-PZT synthesized at 500 ºC for 1 h was proved by PFM. In the continuation of this work we developed a low-cost methodology for coating MWCNTs using a hybrid sol-gel/hydrothermal method. In this case the FE used as a proof of concept was BT, a well-known lead-free perovskite used in many microelectronic applications. However, its synthesis by solid-state reaction is typically performed at around 1100 to 1300 ºC, which jeopardizes the combination with MWCNTs. We also illustrate the ineffectiveness of conventional hydrothermal synthesis in this process, due to the formation of carbonates, namely BaCO3. The grown MWCNT-BT structures are ferroelectric and exhibit an electromechanical response (15 pm/V). These results have broad implications, since this strategy can also be extended to other materials with high crystallization temperatures. In addition, the coverage of the MWCNTs with FE can be optimized, in this case by non-covalent functionalization of the tubes, namely with sodium dodecyl sulfate (SDS). MWCNTs were also used as templates to grow single-phase multiferroic BFO nanorods. This work shows that the use of a nitric solvent severely damages the MWCNT layers, resulting in early oxidation of the tubes during the annealing treatment. It was also observed that the nitric solvent leads to partial filling of the MWCNTs with BFO, due to the low surface tension (<119 mN/m) of the nitric solution; the opening of the caps and the filling of the tubes occur simultaneously during the refluxing step. Furthermore, we verified that MWCNTs play a critical role in the fabrication of monophasic BFO: the oxidation of the CNTs during the annealing process creates an oxygen-deficient atmosphere that restrains the formation of Bi2O3, so monophasic BFO can be obtained. The morphology of the obtained BFO nanostructures indicates that the MWCNTs act as templates for growing 1D BFO structures. Magnetic measurements on these BFO nanostructures revealed a weak ferromagnetic hysteresis loop with a coercive field of 956 Oe at 5 K. We also exploited the possible use of vertically aligned multiwall carbon nanotubes (VA-MWCNTs) as bottom electrodes for microelectronics, for example for memory applications. As a proof of concept, BiFeO3 (BFO) films were deposited in situ on the surface of VA-MWCNTs by RF (radio frequency) magnetron sputtering.
For an in situ deposition temperature of 400 ºC and deposition times up to 2 h, the BFO films cover the VA-MWCNTs and no damage occurs to either the film or the MWCNTs. In spite of the macroscopically lossy polarization behaviour, the ferroelectric nature, domain structure, and switching of these conformal BFO films were verified by PFM. A weak ferromagnetic ordering was also demonstrated for the BFO films on VA-MWCNTs, with a coercive field of 700 Oe. Our systematic work is a significant step forward in the development of 3D memory cells; it clearly demonstrates that CNTs can be combined with FE oxides and could serve, for example, in the next 3D generation of FeRAMs, without excluding other applications in microelectronics.
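
The abstract quotes decomposition and phase-formation activation energies without naming the kinetic method. Assuming a Kissinger-type analysis, a common choice for thermogravimetric data in which ln(β/Tp²) = const − Ea/(R·Tp), here is a minimal sketch with made-up peak temperatures, not the thesis data:

```python
import numpy as np

R = 8.314  # J/(mol*K), gas constant

def kissinger_activation_energy(beta, Tp):
    """Fit ln(beta/Tp^2) against 1/Tp; the slope of the line is -Ea/R."""
    x = 1.0 / np.asarray(Tp, dtype=float)
    y = np.log(np.asarray(beta, dtype=float) / np.asarray(Tp, dtype=float) ** 2)
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R  # activation energy in J/mol

# Hypothetical TGA peak temperatures at several heating rates.
beta = [5, 10, 20, 40]     # heating rates, K/min
Tp = [820, 835, 851, 868]  # peak decomposition temperatures, K
print(kissinger_activation_energy(beta, Tp) / 1000, "kJ/mol")
```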

Relevance:

60.00%

Publisher:

Abstract:

Doctoral thesis in Dental Medicine (Conservative Dentistry), Universidade de Lisboa, Faculdade de Medicina Dentária, 2016

Relevance:

60.00%

Publisher:

Abstract:

Dissertation for the degree of Master in Electrotechnical Engineering, Energy/Automation and Industrial Electronics branch

Relevance:

60.00%

Publisher:

Abstract:

The quantitative component of this study examined the effect of computer-assisted instruction (CAI) on science problem-solving performance, as well as the significance of logical reasoning ability to this relationship. I had the dual role of researcher and teacher, as I conducted the study with 84 grade-seven students to whom I simultaneously taught science on a rotary basis. A two-treatment research design using this sample of convenience allowed a comparison between the problem-solving performance of a CAI treatment group (n = 46) and a laboratory-based control group (n = 38). Science problem-solving performance was measured by a pretest and posttest that I developed for this study. The validity of these tests was addressed through critical discussions with faculty members and colleagues, as well as through feedback gained in a pilot study. High reliability was found between the pretest and the posttest: students who tended to score high on the pretest also tended to score high on the posttest. Interrater reliability was found to be high for 30 randomly selected test responses scored independently by two raters (myself and my faculty advisor). Results indicated that the form of CAI used in this study did not significantly improve students' problem-solving performance. Logical reasoning ability was measured by an abbreviated version of the Group Assessment of Logical Thinking (GALT) and was found to be correlated with problem-solving performance: students with high logical reasoning ability tended to do better on the problem-solving tests, and vice versa. However, no significant difference in problem-solving improvement was observed between the laboratory-based instruction group and the CAI group for students varying in level of logical reasoning ability. Non-significant trends were noted in the results obtained from students of high logical reasoning ability, but these require further study. It was acknowledged that conclusions drawn from the quantitative component of this study were limited; further modifications of the tests were recommended, as well as the use of a larger sample size. The purpose of the qualitative component of the study was to provide a detailed description of my thesis research process as a Brock University Master of Education student. My research journal notes served as the database for open-coding analysis. This analysis revealed six main themes which best described my research experience: research interests, practical considerations, research design, research analysis, development of the problem-solving tests, and scoring-scheme development. These important areas of my thesis research experience were recounted in the form of a personal narrative. It was noted that the research process was a form of problem solving in itself, as I made use of several problem-solving strategies to achieve the desired thesis outcomes.
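
The pretest-posttest and interrater reliability checks above amount to correlating two score vectors. The statistic is not named in the abstract; assuming Pearson's r, a minimal sketch on synthetic scores (not the study's data):

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation: covariance normalized by both standard deviations."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.corrcoef(a, b)[0, 1]

# Hypothetical scores: 30 responses scored independently by two raters.
rng = np.random.default_rng(1)
true_quality = rng.uniform(0, 10, 30)
rater1 = true_quality + rng.normal(scale=0.5, size=30)
rater2 = true_quality + rng.normal(scale=0.5, size=30)
print("interrater r =", round(pearson_r(rater1, rater2), 2))
```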

Relevance:

60.00%

Publisher:

Abstract:

Two important themes in health technology, evidence-based medical practice and the evaluation of medical interventions, are grounded in a positivist approach and a mechanistic conception of healthcare organizations. In this thesis, we raise the hypothesis that complexity theories and systems thinking allow a different conceptualization of these two aspects of the clinical governance of a Surgical Intensive Care Unit (SICU), which is viewed as a nonlinear dynamic adaptive system requiring a systemic approach to cognition. A case study of an SICU demonstrates, through numerous examples and analyses of micro-situations, all the characteristics of the complexity of critical, unstable patients and of the organizational structure of the SICU. After an epistemological critique of Evidence-Based Medicine, we propose a practice grounded in clinical reasoning that combines abduction, hermeneutics, and systems thinking in the SICU. Drawing on the work of Karl Weick, we also suggest rethinking the evaluation of clinical intervention practices in light of the notion of the high-reliability organization, in order to establish the conditions necessary for improving practice in the SICU.

Relevance:

60.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This motivates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code; such tools can help meet all of these goals and can significantly improve software quality, yet building them remains a challenging field. This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated-peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs.
Incorrect sequences of machine-code patterns are identified using slicing techniques on the control-flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy are identified based on the stipulated rules for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly from machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool should be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
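
As a rough illustration of the rule checking and redundant bank-switch detection described above, consider the toy sketch below; it uses an invented instruction set ('BSEL' is a stand-in mnemonic, not actual PIC16F87X syntax) and scans one linear listing, whereas the real tool would walk every path of the control-flow graph:

```python
# Toy model: each instruction is a (mnemonic, operand) pair.

def redundant_bank_switches(program):
    """Report indices of bank-select instructions that re-select the
    already-active bank (candidates for elimination)."""
    active, redundant = None, []
    for idx, (op, arg) in enumerate(program):
        if op == "BSEL":
            if arg == active:
                redundant.append(idx)
            active = arg
    return redundant

def violates_write_after_select(program):
    """Toy rule: a register write must be preceded by some bank selection.
    Returns the index of the first out-of-place write, or None."""
    selected = False
    for idx, (op, _) in enumerate(program):
        if op == "BSEL":
            selected = True
        elif op == "MOVWF" and not selected:
            return idx
    return None

prog = [("MOVLW", 5), ("BSEL", 0), ("MOVWF", "PORTA"),
        ("BSEL", 0),  # redundant: bank 0 is already active
        ("MOVWF", "TRISA")]
print(redundant_bank_switches(prog), violates_write_after_select(prog))
```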

Relevance:

60.00%

Publisher:

Abstract:

So far, in the bivariate setup, the analysis of lifetime (failure time) data with multiple causes of failure has been done by treating each cause of failure separately, with failures from other causes considered as independent censoring. This approach is unrealistic in many situations. For example, in the analysis of mortality data on married couples, one would be interested in comparing the hazards for the same cause of death, as well as in checking whether death due to one cause affects the partner's risk of death from other causes. In reliability analysis, one often has systems with more than one component, and many systems, subsystems, and components have more than one cause of failure. Design of high-reliability systems generally requires that the individual system components have extremely high reliability even after long periods of time. Knowledge of the failure behaviour of a component can lead to savings in its cost of production and maintenance and, in some cases, to the preservation of human life. For the purpose of improving reliability, it is necessary to identify the cause of failure down to the component level. By treating each cause of failure separately, with failures from other causes considered as independent censoring, the analysis of lifetime data would be incomplete. Motivated by this, we introduce a new approach for the analysis of bivariate competing-risk data using the bivariate vector hazard rate of Johnson and Kotz (1975).
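
For reference, the bivariate vector hazard rate of Johnson and Kotz (1975) mentioned above is the componentwise gradient of the negative logarithm of the joint survival function:

```latex
% Bivariate vector hazard rate (Johnson & Kotz, 1975) for lifetimes (T_1, T_2)
% with joint survival function S(t_1, t_2) = P(T_1 > t_1, T_2 > t_2):
\[
  h(t_1, t_2) = \bigl(h_1(t_1, t_2),\, h_2(t_1, t_2)\bigr),
  \qquad
  h_i(t_1, t_2) = -\frac{\partial}{\partial t_i} \log S(t_1, t_2), \quad i = 1, 2.
\]
```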

Relevance:

60.00%

Publisher:

Abstract:

Cancer treatment is most effective when the cancer is detected early, and progress in treatment is closely related to the ability to reduce the proportion of misses in the cancer detection task. The effectiveness of algorithms for detecting cancers can be greatly increased if these algorithms work synergistically with algorithms for characterizing normal mammograms. This research work combines computerized image-analysis techniques and neural networks to separate out some fraction of the normal mammograms with extremely high reliability, based on normal-tissue identification and removal. The presence of clustered microcalcifications is one of the most important, and sometimes the only, sign of cancer on a mammogram: 60% to 70% of non-palpable breast carcinomas demonstrate microcalcifications on mammograms [44], [45], [46]. Wavelet transform (WT) based techniques are then applied to the remaining, possibly abnormal mammograms to detect candidate microcalcifications. The goal of this work is to improve the detection performance and throughput of screening mammography, thus providing a 'second opinion' to the radiologists. State-of-the-art discrete wavelet transform (DWT) computation algorithms are not suitable for practical applications with memory and delay constraints, as the DWT is not a block transform. Hence, this work also takes up the development of a Block DWT (BDWT) computational structure with a low processing-memory requirement.
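
The block-wise idea behind the BDWT, transforming fixed-size tiles independently so that memory use stays bounded, can be sketched as follows. This is a hedged illustration using PyWavelets on a random stand-in image, not the thesis's BDWT structure:

```python
import numpy as np
import pywt  # PyWavelets

def blockwise_dwt2(image, block=64, wavelet="haar"):
    """Apply a single-level 2-D DWT to each (block x block) tile
    independently, so only one tile must be resident at a time.
    A simple tiling illustration, not the BDWT of the thesis."""
    h, w = image.shape
    coeffs = {}
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = image[r:r + block, c:c + block]
            coeffs[(r, c)] = pywt.dwt2(tile, wavelet)  # (LL, (LH, HL, HH))
    return coeffs

img = np.random.rand(256, 256)         # stand-in for a mammogram region
out = blockwise_dwt2(img)
print(len(out), out[(0, 0)][0].shape)  # 16 tiles; LL of first tile is 32x32
```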

Relevance:

60.00%

Publisher:

Abstract:

The challenge of reducing carbon emissions and achieving the emission targets set for 2050 has become a key element of every country's energy strategy. The automotive industry, as an important part of implementing these energy requirements, is conducting research to meet both energy requirements and customer requirements. Modern energy requirements demand energy that is clean, green, and renewable; customers require vehicles that are economic, reliable, and long-lived. Given the increasing requirements of the market and a growing customer base, EVs and PHEVs are more and more important for automotive manufacturers. EVs and PHEVs normally have two key parts: the battery package and the power electronics, composed of critical components. A rechargeable battery is an important element for achieving cost competitiveness; it is mainly used to store energy and provide continuous energy to drive an electric motor. In order to recharge the battery and drive the electric motor, the power electronics group is the essential bridge that converts between the different energy forms. Modern power electronics offers many different topologies, such as non-isolated and isolated power converters, which can be used for charging the battery. One of the most used converter topologies is the multiphase interleaved power converter, primarily because of its prominent advantages: it is frequently employed to obtain optimal dynamic response, high efficiency, and a compact converter size. Many detailed investigations of its topology, control strategy, and devices have been carried out. In this thesis, the core research is to investigate several branches of this subject in terms of issue analysis and optimization approaches for building the magnetic component. The work starts with an introduction to the reasons for developing EVs and PHEVs and an overview of the possible topologies with respect to specific application requirements. Because of its low component count, high reliability, high efficiency, and lack of special safety requirements, the non-isolated multiphase interleaved converter was selected as the basic research topology of the founded W-charge project, for investigating its advantages and potential improvements through optimized magnetic components. All the proposed aspects and approaches are then investigated and analyzed in detail, in order to verify the constraints and advantages of using integrated coupled inductors. Furthermore, a digital controller concept and a novel tapped-inductor topology are proposed for multiphase power converters and electric vehicle applications.
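
One advantage of interleaving mentioned above is ripple reduction: with N phases shifted by 360°/N, the summed ripple repeats N times per switching period. Below is a back-of-envelope sketch with generic buck-type equations and made-up operating values, not the W-charge design:

```python
# Generic buck-type, continuous-conduction estimates; illustrative values only.
V_in, V_out = 400.0, 350.0  # volts (hypothetical charger operating point)
L = 100e-6                  # per-phase inductance, henries
f_sw = 50e3                 # per-phase switching frequency, hertz
N = 4                       # number of interleaved phases

D = V_out / V_in                                   # buck duty cycle
ripple_per_phase = V_out * (1 - D) / (L * f_sw)    # peak-to-peak inductor ripple, A
f_ripple_out = N * f_sw                            # summed ripple repeats N times faster

print(f"D = {D:.2f}, per-phase ripple = {ripple_per_phase:.2f} A, "
      f"output ripple frequency = {f_ripple_out/1e3:.0f} kHz")
```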

Relevance:

60.00%

Publisher:

Abstract:

During the global financial crisis of 2008, many organizations and financial markets had to end or rethink their operations because of the shocks that struck the well-being of their companies. Despite this grave situation, today one can find companies that recovered and emerged from the terrible scenario the crisis presented to them, even finding new business opportunities and strengthening their future. This capacity, which some organizations had and which allowed them to come out of the crisis victorious, is called resilience: the capacity to overcome the negative effects of internal or external shocks (Briguglio, Cordina, Farrugia & Vella 2009). This work therefore studies that capacity in both the organization and its leaders, in order to find factors that improve the performance of companies in crises such as that of 2008-2009. First, a study of the events and development of the 2008 subprime crisis is carried out, to gain a clear understanding of its antecedents, development, magnitude, and consequences. Next, an in-depth study of the theory of organizational resilience, the resilience of the leader as an individual, and leadership styles is presented. Finally, with this theoretical grounding in both the crisis and the concept of resilience, case studies are taken of companies that managed to endure the 2008 financial crisis and of companies that did not survive, in order to identify characteristics of the leader and of leadership that can increase or impair the resilience of organizations, with the aim of providing today's leaders with tools to manage companies efficiently and effectively in a world as complex and changeable as the present one.

Relevance:

60.00%

Publisher:

Abstract:

Within the formative research scheme, the author built a psychological scale with the purpose of analyzing psychological characteristics of the School of Psychology's students. The instrument showed a high reliability coefficient (α = 0.86); the exploratory factor analysis found five factors, which allowed some aspects of sex and career level to be predicted through logistic regression, and the ALSCAL analysis showed two emotional dimensions. The author recommends improving the instrument and continuing its application in the School. It is also necessary to carry out qualitative research to acquire new data on the topic.
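
The prediction step reported above (factor scores feeding a logistic regression) can be sketched as follows, with synthetic factor scores and an invented binary outcome standing in for the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))  # stand-in scores on the five factors
# y: hypothetical binary outcome, e.g. sex coded 0/1
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=1.0, size=200) > 0).astype(int)

model = LogisticRegression()
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(2))
```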

Relevance:

60.00%

Publisher:

Abstract:

This study explored the psychometric properties of the Prueba de Procesamiento Fonológico (Phonological Processing Test) of Lara, Serra, Aguilar and Flórez (2005) in a sample of 478 children aged 4 to 7 years in Bogotá and Chía, across several socioeconomic levels. The data were analyzed with item difficulty and discrimination indices, a tetrachoric correlation matrix, Cronbach's alpha coefficient, and principal components analysis with varimax rotation for construct validity. The results showed good performance in phonological awareness except on four items, correspondence of the factor analysis with the division into subscales and levels, and high reliability with a low level of discrimination in the phonological memory subscale. The results are discussed in terms of the components of phonological processing ability.
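
As a hedged illustration of the classical item-analysis indices used above, difficulty as the proportion correct and discrimination as the corrected item-total correlation, on hypothetical 0/1 responses (not the test's data):

```python
import numpy as np

def item_difficulty(responses):
    """Proportion of correct answers per item (0/1 response matrix)."""
    return np.asarray(responses, dtype=float).mean(axis=0)

def item_discrimination(responses):
    """Corrected item-total correlation: each item against the total
    score excluding that item."""
    R = np.asarray(responses, dtype=float)
    total = R.sum(axis=1)
    return np.array([np.corrcoef(R[:, j], total - R[:, j])[0, 1]
                     for j in range(R.shape[1])])

rng = np.random.default_rng(3)
ability = rng.normal(size=50)
R = (ability[:, None] + rng.normal(size=(50, 8)) > 0).astype(int)  # 50 children x 8 items
print(item_difficulty(R).round(2), item_discrimination(R).round(2))
```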

Relevance:

60.00%

Publisher:

Abstract:

Important assessment instruments exist that measure motor skills or competences in children; nevertheless, Colombia lacks studies demonstrating the validity and reliability of a measurement test that would allow an evaluative judgment of children's motor competences, bearing in mind that intervention must be based on the rigour demanded by processes of assessment and evaluation of body movement. Objective: this study focused on determining the psychometric properties of the Bruininks-Oseretsky test of motor competences, second edition (BOT-2). Materials and methods: an evaluation of diagnostic tests was carried out with 24 apparently healthy children of both sexes, aged between 4 and 7 years, living in the cities of Chía and Bogotá. The assessment was performed by three expert evaluators; internal consistency was analyzed using Cronbach's alpha coefficient, reproducibility was established through the intraclass correlation coefficient (ICC), and concurrent validity was analyzed using Pearson's correlation coefficient, with alpha = 0.05. Results: high reliability and validity indices were found for all of the tests. Conclusions: the BOT-2 is a valid and reliable instrument that can be used for assessing and identifying children's level of motor-competence development.
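
The abstract does not state which ICC form was used for reproducibility; assuming ICC(2,1) (two-way random effects, absolute agreement, single rater), here is a minimal sketch from the classical mean squares, on hypothetical ratings:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_subjects, k_raters) matrix."""
    X = np.asarray(ratings, dtype=float)
    n, k = X.shape
    grand = X.mean()
    row = X.mean(axis=1, keepdims=True)
    col = X.mean(axis=0, keepdims=True)
    MSR = k * ((row - grand) ** 2).sum() / (n - 1)  # between subjects
    MSC = n * ((col - grand) ** 2).sum() / (k - 1)  # between raters
    MSE = ((X - row - col + grand) ** 2).sum() / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

rng = np.random.default_rng(4)
truth = rng.uniform(20, 60, size=(24, 1))              # 24 children, hypothetical scores
ratings = truth + rng.normal(scale=1.5, size=(24, 3))  # 3 raters
print(round(icc_2_1(ratings), 3))
```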

Relevance:

60.00%

Publisher:

Abstract:

The desire to maintain high quality in the code written when developing systems and applications is nothing new in the development world. Several large companies use various metrics to measure the quality of the code in their systems, with the goal of maintaining high operational reliability. Trafikverket (the Swedish Transport Administration) is a government agency responsible for operating, among other things, the systems that keep Sweden's railway network running. Because these systems play a vital part in securing operations and ensuring that train positions, departure planning, and the handling of disruptions work around the clock for the whole country, the agency considers it important to strive for high system quality. The purpose of this thesis was to find out which metrics can be used during the system development process to measure code quality, and how those metrics can be used to increase the quality of IT solutions, so that the quality of the code written in both existing and newly developed systems can be measured at an early stage. The study is a case study conducted at Trafikverket; the metrics examined were code coverage, the maintainability index level, and the number of reported incidents for each system. Measurements were performed on seven of Trafikverket's systems and, in the analysis, compared against the number of reported incidents. Interviews were conducted to give a picture of how the way development is carried out can affect quality. The literature review surfaced one metric that could not be used in practice in this case but is highly interesting: cyclomatic complexity, which is part of the maintainability index but also separately affects the ability to write unit tests. The results of the study show that the metrics are useful for this purpose but should not be used as individual measures of quality, since they serve different functions. It is important that development follows a clear structure and that developers both know how to work with unit tests and follow coding principles for structure. Clear connections between the level of code coverage and the inflow of incidents could be seen in the examined systems, where high code coverage yields a lower inflow of incidents. No correlation between maintainability index and incidents could be found.
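
For context on one metric in the study, here is a sketch of the classic maintainability-index formula (the Oman-Hagemeister form; many tools, possibly including those used at Trafikverket, rescale it to 0-100), combining Halstead volume, cyclomatic complexity, and lines of code:

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
    """Classic three-factor MI; higher values indicate more maintainable code."""
    return (171
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(loc))

# Illustrative module metrics (hypothetical, not Trafikverket data).
print(round(maintainability_index(halstead_volume=1500,
                                  cyclomatic_complexity=12,
                                  loc=400), 1))
```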