869 results for "Validation and certification process"


Relevance: 100.00%

Abstract:

The biological reactions during the settling and decant periods of Sequencing Batch Reactors (SBRs) are generally ignored, as they are not easily measured or described by modelling approaches. However, important processes take place during these periods; in particular, when the influent is fed into the bottom of the reactor at the same time (one of the main features of the UniFed process), the inclusion of these stages is crucial for accurate process predictions. Due to the vertical stratification of both liquid and solid components, a one-dimensional hydraulic model is combined with a modified ASM2d biological model to allow the prediction of settling velocity, sludge concentration, soluble components and biological processes during the non-mixed periods of the SBR. The model is calibrated on a full-scale UniFed SBR system using tracer breakthrough tests, depth profiles of particulate and soluble compounds, and measurements of the key components during the mixed aerobic period. The model is then validated against results from an independent experimental period with considerably different operating parameters. In both cases, the model accurately predicts the stratification and most of the biological reactions occurring in the sludge blanket and the supernatant during the non-mixed periods. Together with a correct description of the mixed aerobic period, this allows a good prediction of the overall SBR performance.
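The abstract does not specify the settling function used. As a rough illustration of how a one-dimensional layered settler of this kind can be expressed, the sketch below uses a Vesilind-type exponential settling velocity with purely hypothetical parameters (`v0`, `k`) and a simple upwind flux scheme; it is a minimal sketch of the idea, not the authors' model.

```python
import numpy as np

def vesilind_velocity(X, v0=6.0, k=0.4):
    """Hindered settling velocity (m/h) for sludge concentration X (g/L),
    using the Vesilind form v = v0 * exp(-k * X). Parameter values are
    illustrative only, not taken from the study."""
    return v0 * np.exp(-k * np.asarray(X, dtype=float))

def settle_step(X, dz=0.1, dt=0.001):
    """One explicit time step of a 1-D layered settler: solids move
    downward between layers with flux J = v(X) * X, using a simple
    upwind scheme (layer 0 is the top of the water column)."""
    v = vesilind_velocity(X)
    flux = v * X
    flux_in = np.concatenate(([0.0], flux[:-1]))  # flux arriving from the layer above
    return X + (flux_in - flux) * dt / dz

# Ten layers with higher sludge concentration towards the reactor bottom
profile = np.linspace(1.0, 8.0, 10)
for _ in range(100):
    profile = settle_step(profile)
print(profile.round(2))
```

In a full model each layer would additionally carry the soluble components and the ASM2d reaction terms evaluated under the local conditions of the sludge blanket or supernatant.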

Relevance: 100.00%

Abstract:

The theoretical part of this thesis examined process redesign, process modelling and the construction of process metrics. The objective of the work was to redesign the organization's certification process. To achieve this goal, both the current and the new process had to be modelled, and a set of metrics had to be built that would give the organization valuable information on how efficiently the new process performs. The work was carried out as participatory action research. The author of the thesis had worked in the target organization for several years and could therefore draw on this knowledge both in modelling the current process and in designing the new one. The result of the work is a new certification process that is leaner and more efficient than its predecessor. A new metrics system was built with which the organization's management can monitor the efficiency of the process stakeholders and the development of product quality. As a by-product, the organization received detailed process descriptions that can be used as training material when recruiting new personnel and as an informative tool when presenting the process to official certification bodies.

Relevance: 100.00%

Abstract:

Requirements Engineering has been acknowledged as an essential discipline for software quality. Poorly defined processes for eliciting, analyzing, specifying and validating requirements can lead to unclear issues or misunderstandings about business needs and project scope. These typically result in customer dissatisfaction with product quality, or in increases in the project's budget and duration. Maturity models allow an organization to measure the quality of its processes and improve them according to an evolutionary path based on levels. The Capability Maturity Model Integration (CMMI) addresses the aforementioned Requirements Engineering issues. CMMI defines a set of best practices for process improvement that are divided into several process areas; Requirements Management and Requirements Development are the process areas concerned with Requirements Engineering maturity. Altran Portugal is a consulting company concerned with the quality of its software. In 2012, its Solution Center department successfully developed and applied a set of processes aligned with CMMI-DEV v1.3, which granted it a maturity level 2 certification. For 2015, it defined an organizational goal of reaching CMMI-DEV maturity level 3, and this MSc dissertation is part of that effort. In particular, it is concerned with the process areas that address the activities of Requirements Engineering. Our main goal is to contribute to the development of Altran's internal engineering processes so that they conform to the guidelines of the Requirements Development process area. Throughout this dissertation, we started with an evaluation method based on CMMI and conducted a compliance assessment of Altran's current processes. This demonstrated their alignment with the CMMI Requirements Management process area and highlighted the improvements needed to conform to the Requirements Development process area. Based on a study of alternative solutions for the gaps found, we proposed a new Requirements Management and Development process that was later validated using three different approaches. The main contribution of this dissertation is the new process developed for Altran Portugal. However, given that studies on these topics are not abundant in the literature, we also expect to contribute useful evidence to the existing body of knowledge with a survey on CMMI and requirements engineering trends. Most importantly, we hope that the implementation of the proposed process improvements will minimize the risks of mishandled requirements, increasing Altran's performance and taking the organization one step closer to the desired maturity level.

Relevance: 100.00%

Abstract:

Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. For quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. This paper reports an example of choosing a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in the chiral quantification of (R,S)-methadone in plasma. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). Based on the results obtained during validation, a control chart was established over 52 series of routine analyses using both the intermediate precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within the ±15% acceptance interval around the target value, and mostly within the two standard deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated during method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
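As a small illustration of the two acceptance criteria mentioned above (the FDA ±15% window and control limits derived from the intermediate precision), here is a minimal Python sketch; the function name and numeric values are hypothetical, not taken from the paper.

```python
def qc_flags(measured, target, sd_intermediate):
    """Check a routine QC result against the two criteria described above:
    the +/-15% acceptance window around the nominal value (FDA criterion)
    and a +/-2 standard deviation control limit based on the intermediate
    precision estimated during method validation."""
    within_fda = abs(measured - target) <= 0.15 * target
    within_2sd = abs(measured - target) <= 2 * sd_intermediate
    return within_fda, within_2sd

# Hypothetical QC sample: nominal 400 ng/mL, intermediate precision SD of 20 ng/mL
print(qc_flags(measured=445.0, target=400.0, sd_intermediate=20.0))
# -> (True, False): inside the 15% window but outside the 2 SD control limits
```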

Relevance: 100.00%

Abstract:

The carbon isotope ratio of androgens in urine specimens is routinely determined to exclude an abuse of testosterone or testosterone prohormones by athletes. The increasing application of gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) in recent years for targeted and systematic investigations of samples has created a demand for rapid sample throughput as well as high selectivity in the extraction process, particularly in the case of conspicuous samples. For that purpose, we present herein the complementary use of an SPE-based assay and an HPLC fractionation method as a two-stage strategy for isolating testosterone metabolites and endogenous reference compounds prior to GC/C/IRMS analysis. Assay validation demonstrated acceptable performance in terms of intermediate precision (range: 0.1-0.4 per thousand), and Bland-Altman analysis revealed no significant bias (0.2 per thousand). For further validation of this two-stage analysis strategy, all specimens (n=124) collected during a major sporting event were processed.
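The Bland-Altman comparison mentioned above can be summarized in a few lines of code. The sketch below uses invented paired delta-13C values (per mil) for the SPE-based and HPLC-based preparations and computes the mean bias and the 95% limits of agreement; it is only a generic illustration of the statistic, not the study's data.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 * SD of the
    paired differences) between two measurement methods."""
    diff = np.asarray(method_a, dtype=float) - np.asarray(method_b, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired delta-13C values (per mil) for the same urine extracts
spe  = [-23.1, -24.0, -22.8, -23.5, -24.2]
hplc = [-23.3, -23.9, -22.9, -23.6, -24.0]
bias, limits = bland_altman(spe, hplc)
print(round(bias, 2), tuple(round(x, 2) for x in limits))
```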

Relevance: 100.00%

Abstract:

BACKGROUND: The reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data necessitate accurate normalization with validated control genes (reference genes) whose expression is constant in all studied conditions, and this stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 reference genes commonly used in the nervous system - specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG). These were: Actin beta (Actb), Glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on studies using the SNI model. Only HPRT1 and 18S had previously been demonstrated to be stable in RT-qPCR arrays. All the genes tested in this study using the geNorm algorithm presented gene stability values (M-values) acceptable enough to qualify them as potential reference genes in both the DRG and the spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill the stability criteria. The combination of any two stable reference genes was sufficient to provide accurate normalization.
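The geNorm M-value used for the ranking above is essentially the average standard deviation of a gene's log-transformed expression ratios against all other candidates. The sketch below shows that calculation on toy data (not the study's measurements); the gene layout and numbers are invented.

```python
import numpy as np

def genorm_m_values(expr):
    """geNorm-style stability measure: for each candidate reference gene,
    M is the mean standard deviation of its log2 expression ratios with
    every other candidate across all samples (lower M = more stable).
    `expr` is a genes x samples array of relative expression levels."""
    log_expr = np.log2(np.asarray(expr, dtype=float))
    n_genes = log_expr.shape[0]
    m_values = []
    for j in range(n_genes):
        pair_sd = [np.std(log_expr[j] - log_expr[k], ddof=1)
                   for k in range(n_genes) if k != j]
        m_values.append(np.mean(pair_sd))
    return np.array(m_values)

# Toy data: 3 candidate genes measured in 4 samples (arbitrary units)
expr = [[1.0, 1.1, 0.9, 1.0],   # stable
        [2.0, 2.1, 1.9, 2.0],   # stable
        [1.0, 3.0, 0.5, 2.0]]   # unstable
print(genorm_m_values(expr).round(2))  # the unstable gene gets the highest M
```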

Relevance: 100.00%

Abstract:

False identity documents constitute a potential powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features has been built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries whose sources were known to be common or different (following police investigations and dismantling of counterfeit factories). Intra-source and inter-sources variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method to link documents to a common source or to differentiate them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model.
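As an illustration of the two evaluation approaches named above, the sketch below computes binary type I/II error rates for a similarity-score threshold and a simple score-based likelihood ratio using Gaussian score densities. The scores, threshold and distributional choice are invented for the example; the study's actual similarity metrics are not reproduced here.

```python
import numpy as np

def gauss_pdf(x, mu, sd):
    """Gaussian probability density, used here as a simple score model."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def error_rates(same_scores, diff_scores, threshold):
    """Binary view: scores >= threshold are declared 'same source'.
    Type I error = different-source pairs wrongly linked,
    type II error = same-source pairs wrongly separated."""
    type_i = np.mean(np.asarray(diff_scores) >= threshold)
    type_ii = np.mean(np.asarray(same_scores) < threshold)
    return type_i, type_ii

def likelihood_ratio(score, same_scores, diff_scores):
    """Score-based LR: density of the observed score under the same-source
    population divided by its density under the different-source population."""
    f_same = gauss_pdf(score, np.mean(same_scores), np.std(same_scores, ddof=1))
    f_diff = gauss_pdf(score, np.mean(diff_scores), np.std(diff_scores, ddof=1))
    return f_same / f_diff

# Invented similarity scores in [0, 1]
same = [0.92, 0.88, 0.95, 0.90, 0.85]
diff = [0.30, 0.42, 0.25, 0.38, 0.33]
print(error_rates(same, diff, threshold=0.6))  # -> (0.0, 0.0) on this toy data
print(likelihood_ratio(0.90, same, diff))      # large LR supports 'same source'
```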

Relevance: 100.00%

Abstract:

Validation and verification operations encounter various challenges in the product development process, and requirements for a faster development cycle place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to improve the evaluation and development of validation and verification capability in display module development projects. The definition and background of validation and verification are studied, together with theories of project management, systems, organisational learning and causality. The framework and key findings of the research are presented, and a feedback system based on the framework is defined and implemented in the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, conducted as a case study using constructive and design research methods. As a result, a framework for capability evaluation and development was defined and developed. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution; as an additional result, some minor changes to the validation and verification process were proposed. A few concerns are expressed regarding the validity and reliability of the study, the most important being the selected research method and the selected model itself: the final state can be normative, as the researcher may set study results before the actual study and, in the initial state, may describe expectations for it. Finally, the reliability and validity of the work are discussed.

Relevance: 100.00%

Abstract:

The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process: the handling of invoices. Efficient invoice processing can have an impact on an organization's working capital management and thus provide companies with better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling process of four different organizations. The invoice data were collected from each organization's invoice processing system, and the sample included all the invoices the organizations had processed during 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included examining the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices. It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions for improving the process included increasing invoice matching, reducing manual steps, and leveraging value-added services such as invoice validation, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
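As a minimal sketch of the kind of cycle-time and lead-time analysis described above, the pandas snippet below works on an invented event log; the column names, steps and timestamps are hypothetical and do not come from the systems studied.

```python
import pandas as pd

# Hypothetical event log: one row per processing step of an invoice,
# mimicking data exported from an invoice automation system.
events = pd.DataFrame({
    "invoice_id": [1, 1, 1, 2, 2, 2],
    "step": ["received", "matched", "approved",
             "received", "manual_review", "approved"],
    "timestamp": pd.to_datetime([
        "2012-03-01 09:00", "2012-03-01 09:05", "2012-03-02 14:00",
        "2012-03-05 10:00", "2012-03-08 11:00", "2012-03-09 16:30"]),
})

# Total cycle time per invoice (first to last event)
cycle = events.groupby("invoice_id")["timestamp"].agg(["min", "max"])
cycle["cycle_time"] = cycle["max"] - cycle["min"]

# Lead time of each transition between consecutive steps, to spot the
# longest waits (e.g. around manual steps)
events = events.sort_values(["invoice_id", "timestamp"])
events["lead_time"] = events.groupby("invoice_id")["timestamp"].diff()

print(cycle["cycle_time"])
print(events[["invoice_id", "step", "lead_time"]])
```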

Relevance: 100.00%

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at the early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually on all possible execution paths of the application programs. An incorrect sequence of machine code is identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank switching code and in deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code; instances of code redundancy based on the stipulated rules for the target processor are identified.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler or assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine code patterns, which drastically reduces state space creation and contributes to improved model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
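As a toy illustration of the redundancy-detection idea (tracking the active bank state across bank-selection instructions), the Python sketch below walks a straight-line instruction list and flags writes to the STATUS register's RP0/RP1 bank-select bits that do not change the current bank. The real tool operates on the control flow graph with propositional rules, which are not reproduced here; the instruction encoding is a simplified stand-in.

```python
# Bits of the PIC16F87X STATUS register that select the active data memory bank
BANK_BITS = {"RP0": 0, "RP1": 1}

def redundant_bank_switches(instructions):
    """Return indices of bank-select instructions that leave the bank
    state unchanged in a straight-line instruction sequence."""
    state = {"RP0": 0, "RP1": 0}            # assume bank 0 at entry
    redundant = []
    for idx, (op, operand) in enumerate(instructions):
        if op in ("bsf", "bcf") and operand.startswith("STATUS,"):
            bit = operand.split(",")[1]
            if bit in BANK_BITS:
                new_value = 1 if op == "bsf" else 0
                if state[bit] == new_value:  # bank state unchanged -> redundant
                    redundant.append(idx)
                state[bit] = new_value
    return redundant

program = [
    ("bsf", "STATUS,RP0"),   # switch to bank 1
    ("movwf", "TRISB"),
    ("bsf", "STATUS,RP0"),   # redundant: bank 1 already active
    ("bcf", "STATUS,RP0"),   # back to bank 0
    ("movwf", "PORTB"),
]
print(redundant_bank_switches(program))   # -> [2]
```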

Relevance: 100.00%

Abstract:

BACKGROUND: Intracoronary application of BM-derived cells for the treatment of acute myocardial infarction (AMI) is currently being studied intensively. At the same time, strict legal requirements surround the production of cells for clinical studies. Good manufacturing practice (GMP)-compliant collection and preparation of BM for patients with AMI was therefore established by the Cytonet group. METHODS: In addition to the fulfillment of standard GMP requirements, including a manufacturing license, validation of the preparation process and the final product was performed. Whole blood (n=6) and BM (n=3) validation samples were processed under GMP conditions by gelafundin or hydroxyethylstarch sedimentation in order to reduce erythrocytes/platelets and volume and to achieve specifications defined in advance. Special attention was paid to free potassium (<6 mmol/L), rheologically relevant cellular characteristics (hematocrit <0.45, platelets <450 x 10(6)/mL) and the sterility of the final product. RESULTS: The data were reviewed and GMP compliance was confirmed by the German authorities (Paul-Ehrlich Institute). Forty-five BM cell preparations for clinical use were carried out following the validated methodology and standards; additionally, three selections of CD34+ BM cells for infusion were performed. All specification limits were met. DISCUSSION: In conclusion, the preparation of BM cells for intracoronary application is feasible under GMP conditions. As the results of sterility testing may not be available at the time of intracoronary application, the highest possible standards to avoid bacterial and other contamination have to be applied. The increased expense of the GMP-compliant process can be justified by higher safety for patients and better control of the final product.
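A minimal sketch of checking a preparation against the release specifications quoted above (free potassium <6 mmol/L, hematocrit <0.45, platelets <450 x 10^6/mL, sterility). The record layout, field names and example values are hypothetical; only the limits come from the text.

```python
# Release specifications named in the abstract (limits only; keys are invented)
SPECS = {
    "potassium_mmol_per_L": lambda v: v < 6,
    "hematocrit":           lambda v: v < 0.45,
    "platelets_per_mL":     lambda v: v < 450e6,
    "sterile":              lambda v: v is True,
}

def check_release(product):
    """Return the list of failed specifications (empty list = release OK)."""
    return [name for name, ok in SPECS.items() if not ok(product[name])]

preparation = {
    "potassium_mmol_per_L": 4.8,
    "hematocrit": 0.38,
    "platelets_per_mL": 320e6,
    "sterile": True,
}
print(check_release(preparation))   # -> [] : all limits met
```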

Relevance: 100.00%

Abstract:

We discuss a framework for the application of abstract interpretation as an aid during program development, rather than in the more traditional application of program optimization. Program validation and detection of errors is first performed statically by comparing (partial) specifications written in terms of assertions against information obtained from (global) static analysis of the program. The results of this process are expressed in the user assertion language. Assertions (or parts of assertions) which cannot be checked statically are translated into run-time tests. The framework allows the use of assertions to be optional. It also allows using very general properties in assertions, beyond the predefined set understandable by the static analyzer and including properties defined by user programs. We also report briefly on an implementation of the framework. The resulting tool generates and checks assertions for Prolog, CLP(R), and CHIP/CLP(fd) programs, and integrates compile-time and run-time checking in a uniform way. The tool allows using properties such as types, modes, non-failure, determinacy, and computational cost, and can treat modules separately, performing incremental analysis.
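The framework itself targets (C)LP programs, but the split between compile-time and run-time checking can be illustrated in a few lines of Python: assertions that the (stand-in) static analysis has already proven are discharged, while the remaining ones are compiled into run-time tests. All names below are hypothetical, and the "static analysis" is simply given as a set of proven properties rather than computed.

```python
def compile_checks(func, assertions, proven_statically):
    """Keep only the assertions not implied by the statically known facts
    and wrap them around `func` as run-time tests."""
    residual = [(name, pred) for name, pred in assertions
                if name not in proven_statically]

    def checked(arg):
        for name, pred in residual:
            assert pred(arg), f"run-time check failed: {name}"
        return func(arg)
    return checked

# Hypothetical example: a successor function with two assertions on its input
assertions = [("is_int", lambda x: isinstance(x, int)),
              ("non_negative", lambda x: x >= 0)]

# Pretend the static analyzer could only prove the type property
succ = compile_checks(lambda x: x + 1, assertions, proven_statically={"is_int"})

print(succ(3))          # passes the residual non_negative run-time test
try:
    succ(-1)
except AssertionError as err:
    print(err)          # run-time check failed: non_negative
```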

Relevance: 100.00%

Abstract:

We present a framework for the application of abstract interpretation as an aid during program development, rather than in the more traditional application of program optimization. Program validation and detection of errors is first performed statically by comparing (partial) specifications written in terms of assertions against information obtained from static analysis of the program. The results of this process are expressed in the user assertion language. Assertions (or parts of assertions) which cannot be verified statically are translated into run-time tests. The framework allows the use of assertions to be optional. It also allows using very general properties in assertions, beyond the predefined set understandable by the static analyzer and including properties defined by means of user programs. We also report briefly on an implementation of the framework. The resulting tool generates and checks assertions for Prolog, CLP(R), and CHIP/CLP(fd) programs, and integrates compile-time and run-time checking in a uniform way. The tool allows using properties such as types, modes, non-failure, determinacy, and computational cost, and can treat modules separately, performing incremental analysis. In practice, this modularity allows detecting statically bugs in user programs even if they do not contain any assertions.

Relevance: 100.00%

Abstract:

This study describes and explains the experiences and perceptions of six public school teachers who had undergone the National Board for Professional Teaching Standards national certification process as a vehicle for promoting a teacher's sense of professionalism. Of these six participants, two achieved National Board certification, two did not, and two were awaiting the results of their certification status. The study took place over a period of eleven months and focused on the participants' perceptions of the National Board certification process as it affected their sense of (a) efficacy and (b) professionalism. Data for this collective case study were gathered from interviews, portfolios, videotapes and artifacts. Using case analysis, the participants' responses gathered through the interview process were examined. The findings indicated that participants had concerns about the National Board certification process in the following areas: the process itself, their sense of efficacy, and their sense of professionalism. All participants reported the process to be overwhelmingly demanding. Analysis of the data also reveals that those who were successful in achieving National Board certification had a greater sense of efficacy than those who did not. A disappointing finding was that the impact of the National Board process on participants' sense of professionalism could not be sustained; however, the participants in this study suggested the process was a step towards providing opportunities for collaboration, collegiality, and reflective practice. This study raises the question of whether the espoused purposes of National Board certification are achieved via the certification process.