902 results for: Validation and certification competences process


Relevance: 100.00%

Publisher:

Abstract:

Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Educational Sciences, specializing in Special Education

Relevance: 100.00%

Publisher:

Abstract:

The research analyzed critical aspects of the knowledge management process, based on an analysis of the knowledge, abilities and attitudes required of individual knowledge workers and of the organizations responsible for the management process. The present work characterizes the knowledge management process and defines information and knowledge workers. The concept of competence was discussed, and specialists gave opinions about the competences critical to the knowledge management process; these opinions were organized and analyzed using the Delphi method. The results contribute to the management context by discussing a resource of extreme importance to organizations - knowledge - and by supporting its management process. The research identified broad critical aspects compatible with current organizational challenges, directing process management toward important themes such as: workers able to create, organizations able to convert individual knowledge into organizational knowledge, the sharing of knowledge while it is still tacit, the maximization of organizational knowledge use, and the generation and preservation of information and knowledge, among other topics to be observed by knowledge workers and by the administrators responsible for the knowledge management process.

Relevance: 100.00%

Publisher:

Abstract:

Master's degree in Education and Social Intervention - Community Development and Adult Education

Relevance: 100.00%

Publisher:

Abstract:

The process of Recognition, Validation and Certification of Competences, also known as Accreditation of Prior Learning (APL), is an innovative means for individuals without an academic background to attain school certificates. The main objective of this process is to validate what people have learned in informal contexts in order to award academic certificates. With growing interest in the qualification of workers, and with governmental support, more and more Portuguese organizations promote this process within their facilities and during working hours. This study explores the relationship between the promotion of this Human Resource Development programme and employees' attitudes (Job Satisfaction and Organizational Commitment) and behaviours (Extra-role Organizational Citizenship Behaviours) towards the organization they work for. Results of a cross-sectional survey of Portuguese industrial workers (N=135) showed statistically significant differences, with higher levels of Voice Behaviours (a dimension of Extra-role Organizational Citizenship Behaviour) in the groups of workers who were involved in, or had graduated from, the firm-promoted APL process.

Relevance: 100.00%

Publisher:

Abstract:

Matrix effects, an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. For quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to the analyte's is recommended. In this paper, we report an example of choosing a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in chiral (R,S)-methadone plasma quantification. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer, with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process was established over 52 series of routine analyses using both the intermediate-precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within +/-15% of the target value, and mainly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate-precision variability estimated during method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
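The dual acceptance rule described above (the FDA +/-15% criterion around the target, plus a two-standard-deviation control limit derived from the intermediate precision) lends itself to a worked illustration. Below is a minimal Python sketch, assuming a hypothetical QC level and an invented intermediate-precision standard deviation; it is not the paper's actual control-chart implementation.

```python
# Minimal sketch of the dual QC acceptance check described above.
# Target and SD values are illustrative, not the paper's data.

def qc_flags(measured, target, ip_sd):
    """Return (within_fda, within_2sd) for one QC result."""
    rel_error = (measured - target) / target * 100.0    # relative bias, %
    within_fda = abs(rel_error) <= 15.0                 # FDA +/-15% criterion
    within_2sd = abs(measured - target) <= 2.0 * ip_sd  # 2 x SD control limit
    return within_fda, within_2sd

# Example: a 100 ng/mL QC level with an assumed intermediate-precision SD
print(qc_flags(measured=113.0, target=100.0, ip_sd=6.0))  # -> (True, False)
```

A result can thus pass the +/-15% criterion while still falling outside the tighter two-standard-deviation control limit, which is why the abstract reports the two checks separately.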

Relevance: 100.00%

Publisher:

Abstract:

The carbon isotope ratio of androgens in urine specimens is routinely determined to exclude the abuse of testosterone or testosterone prohormones by athletes. The increasing application of gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) in recent years for targeted and systematic investigations of samples has created demand for rapid sample throughput as well as high selectivity in the extraction process, particularly for conspicuous samples. For that purpose, we present here the complementary use of an SPE-based assay and an HPLC fractionation method as a two-stage strategy for isolating testosterone metabolites and endogenous reference compounds prior to GC/C/IRMS analysis. Assay validation demonstrated acceptable performance in terms of intermediate precision (range: 0.1-0.4 per mille), and Bland-Altman analysis revealed no significant bias (0.2 per mille). For further validation of this two-stage analysis strategy, all specimens (n=124) collected during a major sport event were processed.
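As a worked illustration of the Bland-Altman bias check mentioned above, the sketch below computes the mean difference (bias) and 95% limits of agreement between two workflows measuring the same samples; the delta-13C values are invented for illustration, not data from the study.

```python
# Bland-Altman comparison sketch: bias and limits of agreement between
# an SPE-based and an HPLC-fractionation result on the same samples.
import statistics

spe  = [-23.1, -24.0, -22.8, -23.5, -24.2]  # method A, delta13C per mille
hplc = [-23.3, -23.9, -23.0, -23.6, -24.1]  # method B, same samples

diffs = [a - b for a, b in zip(spe, hplc)]
bias  = statistics.mean(diffs)                # mean difference = bias
sd    = statistics.stdev(diffs)
loa   = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.2f} per mille, limits of agreement = {loa}")
```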

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: The reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data require accurate normalization with validated control genes (reference genes) whose expression is constant under all studied conditions, and this stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 reference genes commonly used in the nervous system - specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG): Actin beta (Actb), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), the ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on the SNI model; only HPRT1 and 18S had previously been demonstrated as stable in RT-qPCR arrays. All the genes tested in this study with the geNorm algorithm presented gene stability values (M-values) acceptable enough to qualify as potential reference genes in both the DRG and the spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off, with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill the stability criteria. The combination of any two stable reference genes was sufficient to provide accurate normalization.
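For readers unfamiliar with geNorm, the sketch below illustrates its core stability measure M: for each candidate gene, M is the mean standard deviation of the log2 expression ratios against every other candidate, and lower M means more stable. The expression values are hypothetical, and the sketch omits geNorm's stepwise exclusion of the least stable gene.

```python
# Compact sketch of the geNorm stability measure M used for the ranking.
import math, statistics

expr = {  # gene -> relative expression across samples (hypothetical)
    "Actb":  [1.00, 1.10, 0.95, 1.05],
    "GAPDH": [1.00, 1.40, 0.80, 1.20],
    "HPRT1": [1.00, 1.08, 0.97, 1.02],
}

def m_value(gene):
    """Mean SD of log2 ratios of this gene against all other candidates."""
    vs = []
    for other in (g for g in expr if g != gene):
        ratios = [math.log2(a / b) for a, b in zip(expr[gene], expr[other])]
        vs.append(statistics.stdev(ratios))  # pairwise variation V
    return sum(vs) / len(vs)                 # M = mean of V over others

for g in sorted(expr, key=m_value):
    print(g, round(m_value(g), 3))           # lower M = more stable
```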

Relevance: 100.00%

Publisher:

Abstract:

False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features was built within a forensic intelligence model. In the current study, the comparison process and metrics at the heart of this profiling method are described and evaluated. The evaluation takes advantage of 347 false identity documents of four different types, seized in two countries, whose sources were known to be common or different (following police investigations and the dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method for linking documents to a common source or differentiating them. These results pave the way for an operational implementation of a systematic profiling process integrated into a developed forensic intelligence model.
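The binary-classification side of that evaluation reduces to counting errors at a decision threshold on the similarity scores. The sketch below illustrates this with invented scores for known same-source and different-source pairs; it is not the study's actual metric.

```python
# Type I / type II error rates at a threshold on similarity scores.
# Scores are invented for illustration.

same_source = [0.91, 0.87, 0.95, 0.52, 0.90]   # linked pairs: should score high
diff_source = [0.12, 0.30, 0.65, 0.55, 0.18]   # unlinked pairs: should score low
threshold = 0.6                                 # link if score >= threshold

type_i  = sum(s >= threshold for s in diff_source) / len(diff_source)  # false link
type_ii = sum(s <  threshold for s in same_source) / len(same_source)  # missed link
print(f"type I error = {type_i:.0%}, type II error = {type_ii:.0%}")
```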

Relevance: 100.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals; this can significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of software bugs that are otherwise hard to detect more effective, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thus improving both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. Incorrect sequences of machine-code patterns are identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank-selection instruction, are used to detect redundant code; instances of code redundancy are identified based on the rules stipulated for the target processor.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler or assembler, applicable to a wide range of processors once the appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
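An illustrative sketch (not the dissertation's tool) of rule-based machine-code checking in this spirit is shown below: it scans a linear instruction sequence and flags a bank-select instruction as redundant when the bank it selects is already active. The mnemonics are simplified PIC-style placeholders.

```python
# Track the active memory bank along a straight-line instruction sequence
# and flag bank selections that do not change state (redundant code).

def find_redundant_bank_switches(instructions):
    """Yield indices of bank-select instructions that leave the state unchanged."""
    bank = None                      # active bank unknown at entry
    for i, (op, arg) in enumerate(instructions):
        if op == "BANKSEL":
            if arg == bank:          # rule: reselecting the active bank is redundant
                yield i
            bank = arg

program = [
    ("BANKSEL", 1), ("MOVWF", "PORTA"),
    ("BANKSEL", 1),                  # redundant: bank 1 already active
    ("MOVWF", "TRISA"), ("BANKSEL", 0),
]
print(list(find_redundant_bank_switches(program)))  # -> [2]
```

The real technique works over all execution paths of the control flow graph rather than a single linear sequence, but the state-tracking idea is the same.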

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Intracoronary application of BM-derived cells for the treatment of acute myocardial infarction (AMI) is currently being studied intensively. At the same time, strict legal requirements surround the production of cells for clinical studies. Good manufacturing practice (GMP)-compliant collection and preparation of BM for patients with AMI was therefore established by the Cytonet group. METHODS: In addition to fulfilling standard GMP requirements, including a manufacturing license, validation of the preparation process and of the final product was performed. Whole blood (n=6) and BM (n=3) validation samples were processed under GMP conditions by gelafundin or hydroxyethyl starch sedimentation in order to reduce erythrocytes/platelets and volume and to achieve specifications defined in advance. Special attention was paid to free potassium (<6 mmol/L), rheologically relevant cellular characteristics (hematocrit <0.45, platelets <450 x 10(6)/mL) and the sterility of the final product. RESULTS: The data were reviewed and GMP compliance was confirmed by the German authorities (Paul-Ehrlich Institute). Forty-five BM cell preparations for clinical use were carried out following the validated methodology and standards; additionally, three selections of CD34+ BM cells for infusion were performed. All specification limits were met. DISCUSSION: In conclusion, preparation of BM cells for intracoronary application is feasible under GMP conditions. As the results of sterility testing may not be available at the time of intracoronary application, the highest possible standards to avoid bacterial and other contamination have to be applied. The increased expense of the GMP-compliant process can be justified by greater safety for patients and better control of the final product.
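A release-specification check of the kind implied above can be pictured with a small sketch. The limits are those quoted in the abstract (potassium <6 mmol/L, hematocrit <0.45, platelets <450 x 10(6)/mL); the field names and sample values are hypothetical.

```python
# Check a final product against predefined upper specification limits.
SPECS = {                      # attribute -> upper limit (exclusive)
    "potassium_mmol_per_L": 6.0,
    "hematocrit": 0.45,
    "platelets_1e6_per_mL": 450.0,
}

def check_release(product):
    """Return the list of failed specifications (empty = release OK)."""
    return [name for name, limit in SPECS.items() if product[name] >= limit]

batch = {"potassium_mmol_per_L": 4.2, "hematocrit": 0.38,
         "platelets_1e6_per_mL": 410.0}
print(check_release(batch) or "all specifications met")
```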

Relevance: 100.00%

Publisher:

Abstract:

We discuss a framework for the application of abstract interpretation as an aid during program development, rather than in the more traditional application of program optimization. Program validation and detection of errors is first performed statically by comparing (partial) specifications written in terms of assertions against information obtained from (global) static analysis of the program. The results of this process are expressed in the user assertion language. Assertions (or parts of assertions) which cannot be checked statically are translated into run-time tests. The framework allows the use of assertions to be optional. It also allows using very general properties in assertions, beyond the predefined set understandable by the static analyzer and including properties defined by user programs. We also report briefly on an implementation of the framework. The resulting tool generates and checks assertions for Prolog, CLP(R), and CHIP/CLP(fd) programs, and integrates compile-time and run-time checking in a uniform way. The tool allows using properties such as types, modes, non-failure, determinacy, and computational cost, and can treat modules separately, performing incremental analysis.
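The core split the framework performs - assertions discharged by comparing them against static-analysis results versus assertions compiled into run-time tests - can be pictured with a small sketch. The sketch below is in Python rather than the framework's actual Prolog/CLP setting, and all names in it are illustrative.

```python
# Partition assertions into statically verified ones and run-time tests,
# mirroring the compile-time/run-time split described above.

def partition_assertions(assertions, proved_statically):
    """Split assertions by whether static analysis already proved them."""
    static_ok, runtime_tests = [], []
    for a in assertions:
        (static_ok if a in proved_statically else runtime_tests).append(a)
    return static_ok, runtime_tests

assertions = ["arg1_is_int", "result_nonneg", "call_does_not_fail"]
proved     = {"arg1_is_int"}            # facts derived by global analysis
static_ok, runtime = partition_assertions(assertions, proved)
print("verified statically:", static_ok)
print("emitted as run-time tests:", runtime)
```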

Relevance: 100.00%

Publisher:

Abstract:

We present a framework for the application of abstract interpretation as an aid during program development, rather than in the more traditional application of program optimization. Program validation and detection of errors is first performed statically by comparing (partial) specifications written in terms of assertions against information obtained from static analysis of the program. The results of this process are expressed in the user assertion language. Assertions (or parts of assertions) which cannot be verified statically are translated into run-time tests. The framework allows the use of assertions to be optional. It also allows using very general properties in assertions, beyond the predefined set understandable by the static analyzer and including properties defined by means of user programs. We also report briefly on an implementation of the framework. The resulting tool generates and checks assertions for Prolog, CLP(R), and CHIP/CLP(fd) programs, and integrates compile-time and run-time checking in a uniform way. The tool allows using properties such as types, modes, non-failure, determinacy, and computational cost, and can treat modules separately, performing incremental analysis. In practice, this modularity allows detecting statically bugs in user programs even if they do not contain any assertions.
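Complementing the sketch given for the previous record, the following illustrates the other half of the pipeline: turning one property that could not be verified statically into a run-time test by wrapping the checked procedure. The property, wrapper and function are hypothetical, not part of the tool.

```python
# Translate an unproven assertion into a run-time check on every call.

def with_runtime_check(fn, check, message):
    """Wrap fn so the unverified property is tested at run time."""
    def wrapped(*args):
        result = fn(*args)
        assert check(result), message   # the translated run-time test
        return result
    return wrapped

safe_div = with_runtime_check(lambda a, b: a // b,
                              check=lambda r: r >= 0,
                              message="result_nonneg violated")
print(safe_div(7, 2))   # passes the run-time test, prints 3
```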

Relevance: 100.00%

Publisher:

Abstract:

This investigation addressed the theme of Adult Education and Training, namely the process of Recognition, Validation and Certification of Competences (RVCC). Methodologically configured as a case study, it took as its object the Centro Novas Oportunidades of the Escola Secundária Gabriel Pereira. This research sought not only to analyze the outcomes in terms of personal and professional satisfaction resulting from completion of an RVCC process, as expressed by the different participants (certified adults, their families, employers of certified adults, and the technical-pedagogical team), but also to understand to what extent the evaluation of the process can substantiate those outcomes. Throughout the research, a methodological strategy was defined that combined a variety of methods, including quantitative and qualitative techniques. This investigation made it possible to conclude that the outcomes achieved by the certified adults are, from the point of view of the different participants, more positive in the domain of personal satisfaction than in the professional dimension, corroborating other studies addressing the same issue.

Relevance: 100.00%

Publisher:

Abstract:

Injection stretch blow moulding is a well-established method of forming thin-walled containers and has been researched extensively for many years. This paper is concerned with validating finite element analysis of the free-stretch-blow process in an effort to advance the development of injection stretch blow moulding of poly(ethylene terephthalate). Extensive data were obtained experimentally over a wide process window, accounting for material temperature and air flow rate, while capturing cavity pressure, stretch-rod reaction force and preform surface strain. These data were then used to assess the accuracy of the corresponding FE simulation, constructed using the ABAQUS/Explicit solver and an appropriate viscoelastic material subroutine. The results reveal that the simulation gives good quantitative correlation for conditions where the deformation was predominantly equal biaxial, while qualitative correlation was achievable when the mode of deformation was predominantly sequential biaxial. Overall, the simulation was able to pick up the general trends in how pressure, reaction force, strain rate and strain vary with preform temperature and air flow rate. The knowledge gained from these analyses provides insight into the mechanisms of bottle formation, improving the blow moulding simulation and allowing a reduction in future development costs.
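One simple way to express the kind of quantitative simulation-versus-experiment correlation assessed above is a root-mean-square error between measured and simulated traces sampled at the same instants. The sketch below uses invented cavity-pressure values purely for illustration; it is not the paper's validation metric.

```python
# RMSE between an experimental and a simulated cavity-pressure trace.
import math

measured  = [0.0, 2.1, 5.4, 7.9, 8.3, 8.1]   # bar, experimental (invented)
simulated = [0.0, 1.9, 5.7, 8.2, 8.5, 7.9]   # bar, FE prediction (invented)

rmse = math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated))
                 / len(measured))
print(f"RMSE = {rmse:.2f} bar")              # lower = closer correlation
```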

Relevance: 100.00%

Publisher:

Abstract:

The aim of this study was to translate, validate and verify the reliability of the Body Area Scale (BAS). Participants were 386 adolescents enrolled in a private school. Translation into Portuguese was conducted, and the instrument was evaluated through internal consistency and construct validity analyses. Reproducibility was evaluated using the Wilcoxon test and the intraclass correlation coefficient. The BAS demonstrated good internal consistency (0.90 and 0.88) and was able to discriminate between boys and girls according to nutritional state (p = 0.020 and p = 0.026, respectively). BAS scores correlated with the adolescents' BMI (r = 0.14, p = 0.055; r = 0.23, p = 0.001) and waist circumference (r = 0.13, p = 0.083; r = 0.22, p = 0.002). Reliability was confirmed by the intraclass correlation coefficient (0.35, p < 0.001; 0.60, p < 0.001) for boys and girls, respectively. The instrument performed well in terms of comprehension and time of completion. The BAS was successfully translated into Portuguese and presented good validity when applied to adolescents.
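The internal-consistency values reported above are Cronbach's alpha, which can be computed directly from item-level scores: alpha = k/(k-1) * (1 - sum of item variances / variance of totals). The sketch below uses fabricated item scores purely to illustrate the calculation; the scale's real items and data are in the study.

```python
# Cronbach's alpha from per-item respondent scores (fabricated data).
import statistics

items = [  # one list of respondent scores per scale item (hypothetical)
    [3, 4, 2, 5, 4, 3],
    [2, 4, 3, 5, 4, 2],
    [3, 5, 2, 4, 5, 3],
]
k = len(items)
totals = [sum(scores) for scores in zip(*items)]   # per-respondent total score
item_var = sum(statistics.pvariance(i) for i in items)
alpha = k / (k - 1) * (1 - item_var / statistics.pvariance(totals))
print(f"Cronbach's alpha = {alpha:.2f}")           # ~0.89 for this toy data
```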