995 results for "width-strip application"
Abstract:
Report on the review of selected general and application controls over the State University of Iowa PeopleSoft Human Resources Information System (HRIS) for the period June 3, 2008 through July 25, 2008
Abstract:
Objective: Reconstruction of the alar structures of the nose remains difficult. The result has to be not only functional but also aesthetic. Different solutions for reconstructing alar defects are feasible, but a good result that meets the specific demands of stability, aesthetics, and stable architecture without shrinkage of the area is not easily achieved. Method: A perichondrial cutaneous graft (PCCG), a graft consisting of a perichondrial layer, fatty tissue, and skin that is harvested retroauricularly, is combined with an attached cartilage strip. Case Result: A 72-year-old patient suffering from basal cell carcinoma of the ala of the nose underwent the reconstructive procedure with a good result at 1 year in terms of stability, color match, and graft take. Conclusion: This is the first time a strip of cartilage has been included in a PCCG where tumor resection required sacrifice of more than 50% of the alar rim. The case shows that a cartilage strip-enhanced PCCG can be considered for the reconstruction of alar defects.
Abstract:
Detecting local differences between groups of connectomes is a great challenge in neuroimaging, because of the large number of tests that must be performed and their impact on multiplicity correction. Any available information should be exploited to increase the power of detecting true between-group effects. We present an adaptive strategy that exploits the data structure and the prior information concerning positive dependence between nodes and connections, without relying on strong assumptions. As a first step, we decompose the brain network, i.e., the connectome, into subnetworks and apply a screening at the subnetwork level. The subnetworks are defined either according to prior knowledge or by applying a data-driven algorithm. Given the results of the screening step, a filtering is performed to seek real differences at the node/connection level. The proposed strategy can be used to strongly control either the family-wise error rate or the false discovery rate. We show by means of different simulations the benefit of the proposed strategy, and we present a real application comparing the connectomes of preschool children and adolescents.
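The screen-then-filter idea above can be illustrated with a deliberately naive sketch: first test each subnetwork as a whole (Bonferroni over subnetworks), then test only the connections inside the subnetworks that survive (Bonferroni over the screened connections). The function names, the use of t-tests, and the plain Bonferroni corrections are all illustrative assumptions; the paper's actual procedure and its exact error-rate guarantees are not reproduced here.

```python
import numpy as np
from scipy import stats

def screen_and_filter(group_a, group_b, subnetworks, alpha=0.05):
    """Two-step screening/filtering over connectome edges (illustrative only).

    group_a, group_b: arrays of shape (subjects, n_edges) of edge weights.
    subnetworks: list of index arrays partitioning the edges.
    """
    # Step 1: screening. Summarize each subnetwork by its mean edge weight
    # per subject, and test subnetworks with Bonferroni over the k subnetworks.
    k = len(subnetworks)
    selected = []
    for idx in subnetworks:
        _, p = stats.ttest_ind(group_a[:, idx].mean(axis=1),
                               group_b[:, idx].mean(axis=1))
        if p < alpha / k:
            selected.append(idx)
    # Step 2: filtering. Test individual edges only inside the subnetworks
    # that passed the screen, correcting over the screened edges alone.
    m = sum(len(idx) for idx in selected) or 1
    hits = []
    for idx in selected:
        for e in idx:
            _, p = stats.ttest_ind(group_a[:, e], group_b[:, e])
            if p < alpha / m:
                hits.append(e)
    return hits
```

Because step 2 corrects only over the edges that survive the screen, the per-edge threshold is far less severe than a correction over all edges at once, which is the power gain the abstract describes.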
Abstract:
Lean is not just a practice. It is a revolution in Information Technology (IT), making greater and better use of resources and pursuing costs lower than those that exist today. It is much more than a list of tools and methodologies, and establishing it requires changing cultural behaviors and encouraging all organizations to think differently about the power of information versus the value of the business. Lean is usually associated with creating value for the organization. But value becomes significant when it is delivered efficiently, eliminating processes that consume unnecessary time, resources, and space. Lean principles can help organizations improve quality, reduce costs, and achieve efficiency through better productivity. Several Lean concepts can be associated with different problem-solving goals. In particular, this work is a dissertation designed to analyze a new Lean paradigm that has emerged recently: Lean for Information Technology (Lean IT). The dissertation presents an approach to Lean IT (framework, objectives, and methodology) and uses a single case study, applying the 5S/6S technique (up to the third assessment level) in a small or medium-sized enterprise (SME), to demonstrate the value added and the advantages of eliminating waste in its processes. The technique also shows how the assessment evolved before and after its application. This single case study evaluates an IT department (a team of five employees and a department head) through direct observation, documentation, and record files; the equipment analyzed comprises computers, workstations, and projects (developed code, portals, and other IT services).
As a guide, the methodology includes preparing the assessment together with the head of the IT department, carrying out the operations, identifying the value stream for each activity, developing a communication plan, and analyzing each step of the process-flow assessment. The main results are reflected in new work tools (Microsoft SharePoint and Microsoft Project instead of Microsoft Excel) that provide remote communication and project control to all stakeholders, such as top management, partners, and customers (some organizations outsource the development of specific features). The results are also reflected in the quality of the work, in meeting deadlines, in physical and logical security, in employee motivation, and in customer satisfaction. The 5S/6S technique helps clarify Lean concepts, principles, and feasibility, and it raises interest in implementing the technique in other departments such as Finance or Human Resources. To consolidate the work, it became possible to organize the assessment so that the organization can apply for certification under ISO/IEC 25010:2011, the software quality model (software is this SME's core business). But this will only be possible if the whole organization standardizes its processes. This case study shows that Lean concepts, and the application of one or more Lean techniques (in this particular case, 5S/6S), help achieve better results through the management and improvement of the organization's main services.
Abstract:
Report on a review of selected general and application controls over the Iowa Public Employees’ Retirement System (IPERS) I-Que Pension Administration System for the period March 24, 2009 through April 22, 2009
Abstract:
RESUME Raman spectroscopy is a chemical analysis technique based on the phenomenon of light scattering. The phenomenon was first observed in 1928 by Raman and Krishnan, observations for which Raman was awarded the Nobel Prize in Physics in 1930. Raman spectroscopy was applied to the dye analysis of acrylic, cotton, and wool textile fibers in blue, red, and black. We were able to confirm that the technique is suited to the in situ analysis of microscopic traces. It is also fast and non-destructive, and requires no particular sample preparation. Fluorescence, however, proved to be its most significant drawback. During the fiber analyses, different analytical conditions were tested, and they were found to depend above all on the laser chosen. The technique's potential for detecting and identifying the dyes impregnated in fibers was confirmed in this study. A spectral database of sixty reference dyes was built in order to identify the main dye impregnated in the collected fibers. In addition, the analysis of different color blocks, consisting of samples of unknown origin requested from various people, made it possible to divide them into several groups and to assess the rarity of the resulting Raman spectral configurations. The ability of the Raman technique to differentiate these samples was evaluated and compared with that of the conventional methods for textile fiber analysis, namely UV-Vis microspectrophotometry (MSP) and thin-layer chromatography (TLC). The Raman technique proved less discriminating than MSP for all the color blocks considered.
For this reason, within an analytical sequence we recommend using Raman after the color-analysis method, with as large a number of laser sources as possible. Finally, instruments equipped with several excitation wavelengths not only reduce fluorescence but also allow a greater number of samples to be exploited. ABSTRACT Raman spectroscopy allows for the measurement of the inelastic scattering of light due to the vibrational modes of a molecule when irradiated by an intense monochromatic source such as a laser. This phenomenon was observed for the first time by Raman and Krishnan in 1928. For this observation, Raman was awarded the Nobel Prize in Physics in 1930. Raman spectroscopy was applied to the dye analysis of textile fibers. Blue, black, and red acrylics, cottons, and wools were examined. The Raman technique offers advantages such as its non-destructive nature, fast analysis time, and the possibility of performing microscopic in situ analyses. However, the problem of fluorescence was often encountered. Several aspects were investigated to determine the best analytical conditions for every type/color fiber combination. The potential of the technique for the detection and identification of dyes was confirmed. A spectral database of 60 reference dyes was built to detect the main dyes used for the coloration of fiber samples. Particular attention was paid to the discriminating power of the technique. Based on the results of the Raman analysis of the different blocks of color submitted for analysis, it was possible to obtain different classes of fibers according to the general shape of the spectra.
The ability of Raman spectroscopy to differentiate samples was compared with that of the conventional techniques used for the analysis of textile fibers, namely UV-Vis microspectrophotometry (UV-Vis MSP) and thin-layer chromatography (TLC). The Raman technique proved to be less discriminating than MSP for every block of color considered in this study. Thus, in an analytical sequence, it is recommended to use Raman spectroscopy after light microscopy and MSP. It was shown that using several laser wavelengths allowed for the reduction of fluorescence and for the exploitation of a higher number of samples.
Abstract:
A study centered on the role of non-verbal communication as a teaching tool for classroom management, taking as its reference Michael Grinder's communication model (Pentimento), based on Neuro-Linguistic Programming (NLP). This model is analyzed and compared with other models and studies of non-verbal communication in order to establish similarities and differences. To evaluate the effectiveness of the non-verbal classroom-management techniques proposed by Grinder in a real educational context, recordings of the implementation of different techniques in a secondary school in Catalonia are included and analyzed. All the information collected and analyzed makes it possible to appreciate and highlight how meaningful everything expressed beyond language is, and therefore how important and useful a teacher's communication skills are in the task of teaching.
Abstract:
This work describes the characteristics of a representative set of seven different virtual laboratories (VLs) aimed at science teaching in secondary school. For this purpose, a 27-item evaluation model that facilitates the characterization of the VLs was prepared. The model takes into account the gaming features, the overall usability, and the potential to induce scientific literacy. Five of the seven VLs were then tested with two large and highly heterogeneous groups of students, in two different contexts: biotechnology and physics, respectively. We describe how the VLs were received by the students, taking into account both their motivation and their self-reported learning outcomes. In some cases, students' approach to working with the VLs was recorded digitally and analyzed qualitatively. In general, the students enjoyed the VL activities and claimed that they learned from them. Yet more investigation is required to assess the effectiveness of these tools for meaningful learning.
Abstract:
Several ink dating methods based on solvent analysis using gas chromatography/mass spectrometry (GC/MS) have been proposed in recent decades. These methods follow the drying of solvents from ballpoint pen inks on paper and seem very promising. However, several questions have arisen over the last few years among questioned document examiners regarding the transparency and reproducibility of the proposed techniques. These questions should be carefully studied to ensure accurate and ethical application of this methodology in casework. Inspired by a real investigation involving ink dating, the present paper discusses this issue through four main topics: aging processes, dating methods, validation procedures, and data interpretation. This work presents a wide picture of the ink dating field, warns about potential shortcomings, and proposes some solutions to avoid reporting errors in court.
Abstract:
BACKGROUND: Knowledge of normal heart weight ranges is important information for pathologists. Comparing the measured heart weight to reference values is one of the key elements used to determine whether the heart is pathological, as heart weight increases in many cardiac pathologies. The current reference tables are old and in need of an update. AIMS: The purposes of this study are to establish new reference tables for normal heart weights in the local population and to determine the best predictive factor for normal heart weight. We also aim to provide technical support to calculate the predicted normal heart weight. METHODS: The reference values are based on a retrospective analysis of adult Caucasian autopsy cases without any obvious pathology collected at the University Centre of Legal Medicine in Lausanne from 2007 to 2011. We selected 288 cases. The mean age was 39.2 years. There were 118 men and 170 women. Regression analyses were performed to assess the relationship of heart weight to body weight, body height, body mass index (BMI), and body surface area (BSA). RESULTS: The heart weight increased along with an increase in all the parameters studied. The mean heart weight was greater in men than in women at a similar body weight. BSA was determined to be the best predictor of normal heart weight. New reference tables for predicted heart weights are presented as a web application that enables the comparison of heart weights observed at autopsy with the reference values. CONCLUSIONS: The reference tables for heart weight and other organs should be systematically updated and adapted to the local population. Web access and smartphone applications for the predicted heart weight represent important investigational tools.
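The abstract does not report the study's regression coefficients or which BSA formula it used. As a minimal sketch of the BSA-based prediction step, the widely used Du Bois formula is shown below; the `slope` and `intercept` parameters of the prediction function are deliberately left as placeholders, since the study's sex-specific values are not given here.

```python
def du_bois_bsa(weight_kg: float, height_cm: float) -> float:
    """Body surface area (m^2) via the Du Bois & Du Bois (1916) formula:
    BSA = 0.007184 * weight^0.425 * height^0.725 (weight in kg, height in cm).
    """
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def predicted_heart_weight(bsa_m2: float, slope: float, intercept: float) -> float:
    """Linear prediction HW = slope * BSA + intercept (grams).

    slope and intercept are placeholders: the study's fitted, sex-specific
    coefficients are not reported in the abstract.
    """
    return slope * bsa_m2 + intercept
```

For a 70 kg, 170 cm subject this gives a BSA of roughly 1.8 m^2; the web application described in the abstract would then map such a BSA to the study's predicted heart weight and its reference range.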
Abstract:
Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and the inability to count nonculturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method, which should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load obtained by real-time PCR and epifluorescence microscopy are comparable; however, our analysis of sewage treatment plants indicates that these methods give values 270-290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate the airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.
Abstract:
The Rational-Experiential Inventory (REI; Pacini and Epstein, 1999) is a self-administered test comprising two scales measuring respondents' attitudes toward two thinking styles, respectively referred to as the rational and the experiential thinking styles. Two validation studies were conducted using a new French-language version of the REI. The first study confirms the validity of the French translation. The second study, which concerns the REI's construct validity, assesses the questionnaire's capacity to discriminate between a group of smokers and a group of non-smokers. Both studies give generally satisfactory results. In particular, the advantages of using the two-dimensional REI rather than the better-known Need for Cognition scale (Cacioppo & Petty, 1982) are made quite clear.