956 results for Validation model


Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis was to develop a model that can predict the heat transfer, heat release distribution and vertical gas-phase temperature profile in the furnace of a bubbling fluidized bed (BFB) boiler. The model is based on three separate components that handle heat transfer, heat release distribution, and mass and energy balance calculations, taking into account the boiler design and operating conditions. The model was successfully validated by solving the model parameters from commercial-size BFB boiler test-run data and by performing parametric studies with the model. Implementing the developed model in the Foster Wheeler BFB design procedures will require model validation against the existing BFB database and possibly more detailed measurements at commercial-size BFB boilers.
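The abstract does not reproduce the model equations. As a minimal sketch of the kind of zonal mass and energy balance such a furnace model solves, the following marches a gas stream up a furnace divided into vertical zones; every parameter value, and the assumed heat release distribution, is an illustrative assumption rather than the thesis's actual formulation.

```python
# Illustrative 1-D zonal energy balance for a furnace gas temperature profile.
# All parameter values are hypothetical, not the thesis's actual model.

N_ZONES = 10          # vertical furnace zones
M_DOT = 50.0          # flue gas mass flow [kg/s] (assumed)
CP = 1.2e3            # gas specific heat [J/(kg K)] (assumed constant)
T_WALL = 600.0        # membrane wall temperature [K] (assumed)
H_COEFF = 120.0       # effective heat transfer coefficient [W/(m2 K)] (assumed)
A_ZONE = 40.0         # wall heat transfer area per zone [m2] (assumed)
Q_TOTAL = 80e6        # total heat release [W] (assumed)

# Assumed heat release distribution: most combustion near the bed, decaying upward.
weights = [2 ** -i for i in range(N_ZONES)]
q_release = [Q_TOTAL * w / sum(weights) for w in weights]

def vertical_profile(t_in: float) -> list[float]:
    """March up the furnace, closing an energy balance in each zone."""
    profile = []
    t = t_in
    for q in q_release:
        q_wall = H_COEFF * A_ZONE * (t - T_WALL)   # heat lost to the walls
        t = t + (q - q_wall) / (M_DOT * CP)        # zone energy balance
        profile.append(t)
    return profile

print(vertical_profile(t_in=1100.0))
```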

Relevance:

30.00%

Publisher:

Abstract:

Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. Following content validity, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
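The abstract states that the IEEC combines the intensity and frequency of each of the 19 scenarios but does not give the exact formula. A plausible minimal sketch, assuming a simple sum of intensity × frequency products (the published combination rule and rating scales may differ):

```python
# Hypothetical sketch of an exposure index combining per-scenario
# intensity and frequency ratings; the published IEEC formula may differ.

def ieec(intensity: list[float], frequency: list[float]) -> float:
    """Sum of intensity x frequency over the 19 ECNQ-CCV scenarios."""
    assert len(intensity) == len(frequency) == 19
    return sum(i * f for i, f in zip(intensity, frequency))

# Example respondent: moderate ratings on an assumed 0-4 scale for every scenario.
print(ieec([2.0] * 19, [1.5] * 19))
```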

Relevance:

30.00%

Publisher:

Abstract:

A syndrome of chronic school stress, pupil burnout (school burnout) is attracting growing interest, but its determinants remain poorly understood. Moreover, the phenomenon has rarely been studied in French-speaking adolescents, and no research had yet been conducted in Switzerland. This thesis therefore extends research on school burnout to adolescents in French-speaking Switzerland and clarifies its risk and protective factors. To this end, we conducted two empirical studies involving 861 adolescents aged 14 to 18 and enrolled in schools in French-speaking Switzerland. The adolescents completed a series of scales assessing, among other things, school burnout, school stress, social support, substance use and schooling history. The results first show that the French version of the School Burnout Inventory is a reliable and valid instrument. School burnout affects up to 24% of adolescents in French-speaking Switzerland and is characterized by a loss of interest in school, strong questioning of the meaning of schoolwork, and a high sense of inadequacy at school. School stress related to success and to one's academic future increases the risk of burnout, whereas support from parents and teachers decreases it. Furthermore, we show that the effect of social support on school burnout is mediated by school stress, which further underscores the protective role of social support. Our results also show that school burnout levels vary with certain characteristics of the school context and with the severity of adolescents' substance use. Finally, the knowledge gathered in this work, set in the perspective of an early intervention model, highlights the role of schools and school professionals in preventing school burnout.
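The reported mediation (social support → school stress → burnout) is commonly tested with a product-of-coefficients approach. A minimal sketch under that assumption, using simulated data rather than the thesis's sample, with ordinary least squares fitted via numpy:

```python
# Simple product-of-coefficients mediation check: support -> stress -> burnout.
# Simulated data; the thesis's actual estimation method may differ.
import numpy as np

rng = np.random.default_rng(0)
n = 861
support = rng.normal(size=n)
stress = -0.5 * support + rng.normal(size=n)                   # path a
burnout = 0.6 * stress - 0.1 * support + rng.normal(size=n)    # paths b and c'

def ols(y, *xs):
    """Least-squares slopes (with intercept); returns coefficients for xs."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

a = ols(stress, support)[0]                  # support -> stress
b, c_prime = ols(burnout, stress, support)   # stress -> burnout, direct effect
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```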

Relevance:

30.00%

Publisher:

Abstract:

As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints on size, power consumption and price as embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and relatively low time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc. (Tech) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software codesign and simulation and an extendable library of automatically configured reusable hardware blocks. Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. A simulation model of a processor for TCP/IP packet validation was designed and tested as a test case for the environment.
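The environment's test case was a processor for TCP/IP packet validation. As an illustration of one such validation step, and not a reproduction of the TACO hardware blocks, here is the standard IPv4 header checksum check (RFC 791):

```python
# Standard IPv4 header checksum validation (RFC 791 one's-complement sum).
# Shown only to illustrate the kind of operation a packet-validation
# processor performs; it does not reproduce the thesis's hardware blocks.

def ipv4_checksum_ok(header: bytes) -> bool:
    """A valid header sums (with carry folding) to 0xFFFF over 16-bit words."""
    assert len(header) % 2 == 0
    total = 0
    for i in range(0, len(header), 2):
        total += (header[i] << 8) | header[i + 1]
    while total >> 16:                       # fold carries back into 16 bits
        total = (total & 0xFFFF) + (total >> 16)
    return total == 0xFFFF

# Well-known example header with a correct checksum field (0xB861).
hdr = bytes.fromhex("45000073000040004011b861c0a80001c0a800c7")
print(ipv4_checksum_ok(hdr))  # True
```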

Relevance:

30.00%

Publisher:

Abstract:

The main purpose of this study was the validation of both the French and German versions of a Perceived Neighborhood Social Cohesion Questionnaire. The sample comprised 5065 Swiss men from the "Cohort Study on Substance Use Risk Factors". Multigroup confirmatory factor analysis showed that a three-factor model fits the data well, which substantiates the generalizability of the questionnaire's factor structure regardless of the language. The questionnaire demonstrated excellent homogeneity (α = .95) and split-half reliability (r = .96). It was sensitive to community size and to participants' financial situation, confirming that it also measures real social conditions. Finally, weak but frequent correlations between the questionnaire and alcohol, cigarette, and cannabis dependence were measured.
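Cronbach's alpha and split-half reliability are standard computations. A minimal sketch on simulated item data (not the study's data), using a Spearman-Brown-corrected odd/even split, which is one common split-half convention:

```python
# Cronbach's alpha and odd/even split-half reliability for an items matrix
# (rows = respondents, columns = items). Simulated data for illustration.
import numpy as np

rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))
items = latent + 0.4 * rng.normal(size=(500, 12))   # 12 correlated items

def cronbach_alpha(x: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def split_half(x: np.ndarray) -> float:
    """Odd/even split correlation with Spearman-Brown correction."""
    r = np.corrcoef(x[:, 0::2].sum(axis=1), x[:, 1::2].sum(axis=1))[0, 1]
    return 2 * r / (1 + r)

print(f"alpha = {cronbach_alpha(items):.2f}, split-half = {split_half(items):.2f}")
```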

Relevance:

30.00%

Publisher:

Abstract:

We have designed and validated a novel generic platform for production of tetravalent IgG1-like chimeric bispecific Abs. The VH-CH1-hinge domains of mAb2 are fused through a peptidic linker to the N terminus of the mAb1 H chain, and paired mutations at the CH1-CL interface of mAb1 are introduced that force the correct pairing of the two different free L chains. Two different sets of these CH1-CL interface mutations, called CR3 and MUT4, were designed and tested, and prototypic bispecific Abs directed against CD5 and HLA-DR were produced (CD5xDR). Two different hinge sequences between mAb1 and mAb2 were also tested in the CD5xDR-CR3 or -MUT4 background, leading to bispecific Abs (BsAbs) with a more rigid or flexible structure. All four Abs produced bound with good specificity and affinity to CD5 and HLA-DR present either on the same target or on different cells. Indeed, the BsAbs were able to efficiently redirect killing of HLA-DR(+) leukemic cells by human CD5(+) cytokine-induced killer T cells. Finally, all BsAbs had a functional Fc, as shown by their capacity to activate human complement and NK cells and to mediate phagocytosis. CD5xDR-CR3 was chosen as the best format because it had overall the highest functional activity and was very stable in vitro in both neutral buffer and serum. In vivo, CD5xDR-CR3 was shown to have significant therapeutic activity in a xenograft model of human leukemia.

Relevance:

30.00%

Publisher:

Abstract:

Validation and verification operations encounter various challenges in the product development process. Demands for a faster development cycle place new requirements on the component development process, and verification and validation are usually the largest activities, consuming up to 40-50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The goal is to define a framework for evaluating and developing validation and verification capability in display module development projects. The definition and background of validation and verification are reviewed, together with theories of project management, systems, organisational learning and causality. The framework and the key findings of the research are presented, and a feedback system based on the framework is defined and implemented at the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, conducted as a case study, using the constructive and design research methods. As a result, a framework for capability evaluation and development was defined and developed. The key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution; in addition, some minor changes to the validation and verification process were proposed. A few concerns about the validity and reliability of the results are discussed, the most important being the selected research method and the selected model itself: the end state can be normative, as the researcher may fix expected results before the actual study and may describe expectations for the study in the initial state. Finally, the reliability and validity of this work are assessed.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: There is a need for short, specific instruments that adequately assess quality of life (QOL) in the older adult population. The aims of the present study were to obtain evidence on the validity of the inferences that can be drawn from an instrument to measure QOL in the aging population (people 50+ years old), and to test its psychometric properties. METHODS: The instrument, WHOQOL-AGE, comprised 13 positive items, assessed on a five-point rating scale, and was administered to nationally representative samples (n = 9987) from Finland, Poland, and Spain. Cronbach's alpha was employed to assess internal consistency reliability, whereas the validity of the questionnaire was assessed by means of factor analysis, a graded response model, Pearson's correlation coefficient and an unpaired t-test. Normative values were calculated across countries and for different age groups. RESULTS: The satisfactory goodness-of-fit indices confirmed that the factorial structure of WHOQOL-AGE comprises two first-order factors. Cronbach's alpha was 0.88 for factor 1 and 0.84 for factor 2. Evidence supporting a global score was found with a second-order factor model, according to the goodness-of-fit indices: CFI = 0.93, TLI = 0.91, RMSEA = 0.073. Convergent validity was estimated at r = 0.75, and adequate discriminant validity was also found. Significant differences were found between healthy individuals (74.19 ± 13.21) and individuals with at least one chronic condition (64.29 ± 16.29), supporting adequate known-groups validity. CONCLUSIONS: WHOQOL-AGE has shown good psychometric properties in Finland, Poland, and Spain. Therefore, considerable support is provided for using the WHOQOL-AGE to measure QOL in older adults in these countries, and to compare the QOL of older and younger adults.
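The known-groups comparison can be reproduced in form from the reported means and standard deviations using Welch's t statistic; the group sizes below are hypothetical placeholders, since the abstract does not report them, so the resulting value is illustrative only:

```python
# Welch's t statistic from summary statistics for the known-groups check.
# Means/SDs are those reported in the abstract; the group sizes are
# hypothetical assumptions, so the resulting t value is illustrative only.
from math import sqrt

def welch_t(m1, s1, n1, m2, s2, n2):
    return (m1 - m2) / sqrt(s1**2 / n1 + s2**2 / n2)

t = welch_t(74.19, 13.21, 4000, 64.29, 16.29, 4000)  # assumed n's
print(f"t = {t:.1f}")
```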

Relevance:

30.00%

Publisher:

Abstract:

Scarcity of long-term series of sediment-related variables has led watershed managers to apply mathematical models to simulate sediment fluxes. Owing to the high effort required to install and maintain sedimentological gauges, tracers have been pointed out as an alternative for validating soil redistribution modelling. In this study, the 137Cs technique was used to assess the performance of the WASA-SED model in the Benguê watershed (933 km²) in the Brazilian semiarid region. Qualitatively, good agreement was found between the 137Cs technique and the WASA-SED model results. Nonetheless, quantitatively, large differences of up to two orders of magnitude were found between the two methods. Among the uncertainties inherent to the 137Cs technique, the definition of the reference inventory seems to be a major source of imprecision. In addition, estimates of water and sediment fluxes with mathematical models usually also carry high uncertainty, contributing to the quantitative differences between the soil redistribution estimates of the two methods.

Relevance:

30.00%

Publisher:

Abstract:

Hydrological models are important tools in water resource planning and management. The aim of this work was to calibrate and validate, on a daily time scale, the SWAT model (Soil and Water Assessment Tool) for the watershed of the Galo creek, located in Espírito Santo State. The study used georeferenced maps of relief, soil type and land use, in addition to historical daily series of basin climate and flow. Time series from Jan 1, 1995 to Dec 31, 2000 and from Jan 1, 2001 to Dec 20, 2003 were used for calibration and validation, respectively. Model performance was evaluated using the Nash-Sutcliffe efficiency coefficient (ENS) and the percentage of bias (PBIAS). SWAT was also evaluated in the simulation of the following hydrological variables: maximum and minimum annual daily flows and the minimum reference flows Q90 and Q95, based on the mean absolute error. ENS and PBIAS were 0.65 and 7.2% for calibration and 0.70 and 14.1% for validation, respectively, indicating satisfactory model performance. SWAT adequately simulated the minimum annual daily flows and the reference flows Q90 and Q95; it was not suitable for simulating maximum annual daily flows.
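ENS and PBIAS have standard definitions; a minimal sketch of both metrics follows. Note that the sign convention for PBIAS varies between authors, so the (obs − sim) form used here is an assumption:

```python
# Nash-Sutcliffe efficiency and percent bias for observed vs. simulated flows.
# PBIAS sign conventions differ between authors; this sketch uses (obs - sim).
import numpy as np

def nse(obs: np.ndarray, sim: np.ndarray) -> float:
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs: np.ndarray, sim: np.ndarray) -> float:
    return 100 * np.sum(obs - sim) / np.sum(obs)

obs = np.array([1.2, 0.9, 2.4, 3.1, 1.7])   # illustrative daily flows [m3/s]
sim = np.array([1.0, 1.1, 2.0, 3.4, 1.5])
print(f"NSE = {nse(obs, sim):.2f}, PBIAS = {pbias(obs, sim):.1f}%")
```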

Relevance:

30.00%

Publisher:

Abstract:

Cells of epithelial origin, e.g. from breast and prostate cancers, effectively differentiate into complex multicellular structures when cultured in three dimensions (3D) instead of on conventional two-dimensional (2D) adherent surfaces. The spectrum of different organotypic morphologies is highly dependent on the culture environment, which can be either non-adherent or scaffold-based. When embedded in physiological extracellular matrices (ECMs), such as laminin-rich basement membrane extracts, normal epithelial cells differentiate into acinar spheroids reminiscent of glandular ductal structures. Transformed cancer cells, in contrast, typically fail to undergo acinar morphogenic patterns, forming poorly differentiated or invasive multicellular structures. The 3D cancer spheroids are widely accepted to better recapitulate various tumorigenic processes and drug responses. So far, however, 3D models have been employed predominantly in academia, whereas the pharmaceutical industry has yet to adopt them for wider, routine use. This is mainly due to poor characterisation of cell models and the lack of standardised workflows, high-throughput cell culture platforms, and proper readout and quantification tools. In this thesis, a complete workflow has been established, entailing well-characterised 3D cell culture models for prostate cancer, a standardised 3D cell culture routine based on a high-throughput-ready platform, automated image acquisition with concomitant morphometric image analysis, and data visualisation, in order to enable large-scale high-content screens. Our integrated suite of software and statistical analysis tools was optimised and validated using a comprehensive panel of prostate cancer cell lines and 3D models. The tools quantify multiple key cancer-relevant morphological features, ranging from cancer cell invasion through multicellular differentiation to growth, and detect dynamic changes both in morphology and in function, such as cell death and apoptosis, in response to experimental perturbations including RNA interference and small-molecule inhibitors. Our panel of cell lines included many non-transformed and most currently available classic prostate cancer cell lines, which were characterised for their morphogenetic properties in 3D laminin-rich ECM. The phenotypes and gene expression profiles were evaluated for their relevance to pre-clinical drug discovery, disease modelling and basic research. In addition, a spontaneous model for invasive transformation was discovered, displaying a high degree of epithelial plasticity. This plasticity is mediated by an abundant bioactive serum lipid, lysophosphatidic acid (LPA), and its receptor LPAR1. The invasive transformation was caused by abrupt cytoskeletal rearrangement through impaired G protein alpha 12/13 and RhoA/ROCK signalling, and mediated by upregulated adenylyl cyclase/cyclic AMP (cAMP)/protein kinase A and Rac/PAK pathways. The spontaneous invasion model tangibly exemplifies the biological relevance of organotypic cell culture models. Overall, this thesis work underlines the power of novel morphometric screening tools in drug discovery.

Relevance:

30.00%

Publisher:

Abstract:

The three alpha2-adrenoceptor (alpha2-AR) subtypes belong to the G protein-coupled receptor superfamily and represent potential drug targets. These receptors have many vital physiological functions, but their actions are complex and often oppose each other. Current research is therefore driven towards discovering drugs that selectively interact with a specific subtype. Cell model systems can be used to evaluate a chemical compound's activity in complex biological systems. The aim of this thesis was to optimize and validate cell-based model systems and assays to investigate alpha2-ARs as drug targets. The use of immortalized cell lines as model systems is firmly established but poses several problems, since the protein of interest is expressed in a foreign environment, and thus essential components of receptor regulation or signaling cascades might be missing. Careful cell model validation is thus required; this was exemplified by three different approaches. In cells heterologously expressing alpha2A-ARs, it was noted that the transfection technique affected the test outcome; false negative adenylyl cyclase test results were produced unless a cell population expressing the receptor homogeneously was used. Recombinant alpha2C-ARs in non-neuronal cells were retained inside the cells rather than expressed at the cell membrane, complicating investigation of this receptor subtype. Receptor expression-enhancing proteins (REEPs) were found to be neuronal-specific adapter proteins that regulate the processing of the alpha2C-AR, resulting in an increased level of total receptor expression. Current trends call for the use of primary cells endogenously expressing the receptor of interest; therefore, primary human vascular smooth muscle cells (SMCs) expressing alpha2-ARs were tested in a functional assay monitoring contractility via a myosin light chain phosphorylation assay. However, these cells were not compatible with this assay due to the loss of differentiation. A rat aortic SMC line transfected to express the human alpha2B-AR was adapted for the assay, and it was found that the alpha2-AR agonist dexmedetomidine evoked myosin light chain phosphorylation in this model.

Relevance:

30.00%

Publisher:

Abstract:

A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g. hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that enables the service developer to create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing unfulfilled service requirements back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome inconsistency problems and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.

Relevance:

30.00%

Publisher:

Abstract:

This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator, where the major difficulty lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which are usually conducted using only statistical techniques.