978 results for Processing technique
Abstract:
The differentiation between benign and malignant focal liver lesions plays an important role in the diagnosis of liver disease and in therapeutic planning for local or systemic disease. This differentiation, based on lesion characterization, relies on observing the dynamic vascular patterns (DVP) of lesions with respect to adjacent parenchyma, and may be assessed during contrast-enhanced ultrasound imaging after a bolus injection. For instance, hemangiomas (i.e., benign lesions) exhibit hyper-enhanced signatures over time, whereas metastases (i.e., malignant lesions) frequently present hyper-enhanced foci during the arterial phase and always become hypo-enhanced afterwards. The objective of this work was to develop a new parametric imaging technique, aimed at mapping the DVP signatures into a single image called a DVP parametric image, conceived as a diagnostic aid for characterizing lesion types. The methodology consisted of processing a time sequence of images (DICOM video data) in four consecutive steps: (1) pre-processing, combining image motion correction and linearization to derive, in each pixel, an echo-power signal proportional to the local contrast-agent concentration over time; (2) signal modeling, by means of curve-fitting optimization, to compute a difference signal in each pixel as the subtraction of the adjacent-parenchyma kinetics from the echo-power signal; (3) classification of the difference signals; and (4) parametric image rendering to represent the classified pixels as a support for diagnosis. DVP parametric imaging was clinically assessed on a total of 146 lesions, imaged using different medical ultrasound systems. The resulting sensitivity and specificity were 97% and 91%, respectively, which compare favorably with the sensitivities of 81 to 95% and specificities of 80 to 95% reported in the medical literature.
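The pixel-wise curve fitting and classification in steps (2) and (3) can be illustrated with a short sketch. The lognormal bolus model, the curve_fit initialization, and the sign-of-area classification rule below are illustrative assumptions rather than the authors' exact implementation, which the abstract does not specify:

```python
# Minimal sketch of pixel-wise DVP difference-signal estimation (step 2 above).
# The lognormal bolus model, the reference-kinetics handling and the
# sign-based classification are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def lognormal_bolus(t, A, mu, sigma, t0):
    """Lognormal wash-in/wash-out model of contrast-agent echo power."""
    t = np.clip(t - t0, 1e-6, None)
    return A / (t * sigma * np.sqrt(2 * np.pi)) * np.exp(-(np.log(t) - mu) ** 2 / (2 * sigma ** 2))

def dvp_difference(t, pixel_power, parenchyma_power):
    """Fit both kinetics and return the fitted pixel curve minus the parenchyma curve."""
    p0 = [pixel_power.max(), np.log(t.mean()), 0.5, 0.0]
    px_params, _ = curve_fit(lognormal_bolus, t, pixel_power, p0=p0, maxfev=10000)
    ref_params, _ = curve_fit(lognormal_bolus, t, parenchyma_power, p0=p0, maxfev=10000)
    return lognormal_bolus(t, *px_params) - lognormal_bolus(t, *ref_params)

def classify(diff, threshold=0.1):
    """Crude label from the area of the difference signal: hyper-, hypo- or iso-enhanced."""
    score = np.trapz(diff)
    if score > threshold:
        return "hyper"
    if score < -threshold:
        return "hypo"
    return "iso"
```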
Abstract:
Maturation of the arenavirus GP precursor (GPC) involves proteolytic processing by cellular signal peptidase and the proprotein convertase subtilisin kexin isozyme 1 (SKI-1)/site 1 protease (S1P), yielding a tripartite complex composed of a stable signal peptide (SSP), the receptor-binding GP1, and the fusion-active transmembrane GP2. Here we investigated the roles of SKI-1/S1P processing and SSP in the biosynthesis of the recombinant GP ectodomains of lymphocytic choriomeningitis virus (LCMV) and Lassa virus (LASV). When expressed in mammalian cells, the LCMV and LASV GP ectodomains underwent processing by SKI-1/S1P, followed by dissociation of GP1 from GP2. The GP2 ectodomain spontaneously formed trimers, as revealed by chemical cross-linking. The endogenous SSP, known to be crucial for maturation and transport of full-length arenavirus GPC, was dispensable for processing and secretion of the soluble GP ectodomain, suggesting a specific role for SSP in the stable prefusion conformation and transport of full-length GPC.
Abstract:
Geobiota are defined by taxic assemblages (i.e., biota) and their defining abiotic breaks, which are mapped in cross-section to reveal past and future biotic boundaries. We term this approach Temporal Geobiotic Mapping (TGM) and offer it as a conceptual framework for biogeography. TGM is based on geological cross-sectioning, which creates maps from the distribution of biota and the known abiotic factors that drive their distribution, such as climate, topography, soil chemistry and underlying geology. The availability of abiotic data, however, is limited for many areas; unlike other approaches, TGM can be used when only minimal data are available. To demonstrate TGM, we use a well-known area in the Blue Mountains, New South Wales (NSW), south-eastern Australia, and show how surface processes such as weathering and erosion affect the future distribution of a Moist Basalt Forest taxic assemblage. Biotic areas are best represented visually as maps, which can show transgressions and regressions of biota and abiota over time. Using such maps, a biogeographer can directly compare animal and plant distributions with features of the abiotic environment and may identify significant geographical barriers or pathways that explain biotic distributions.
Abstract:
Totally extraperitoneal laparoscopic hernia repair is an efficient but technically demanding procedure. As mechanisms of hernia recurrence may be related to these technical difficulties, we modified a previously described double-mesh technique in an effort to simplify the procedure. Extraperitoneal laparoscopic hernia repairs were performed in 82 male and 17 female patients with inguinal, femoral, and recurrent bilateral hernias. A standard polypropylene mesh measuring 15 x 15 cm was cut into two pieces of 4 x 15 cm and 11 x 15 cm. The smaller mesh was placed over both inguinal rings without splitting. The larger mesh was then inserted over the first mesh and stapled to low-risk zones, reinforcing the large-vessel area and the nerve transition zone. The mean procedure duration was 60 minutes for unilateral and 100 minutes for bilateral hernia repair. Patients were discharged from the hospital within 48 hours. The mean postoperative follow-up was 22 months, with no recurrences, neuralgia, or bleeding complications. Over a 2-year period, this technique proved satisfactory, without recurrences or significant complications. In our hands, it was easier to perform: it tolerates less-than-perfect positioning of the meshes and avoids most of the stapling in critical zones.
Abstract:
Toxorhynchites mosquitoes play important ecological roles in aquatic microenvironments and are frequently investigated as potential biological control agents of mosquito disease vectors. Establishment of Toxorhynchites laboratory colonies can be challenging because, for some species, mating and insemination either do not occur or require a prohibitive amount of laboratory space to succeed. Consequently, artificial insemination techniques have been developed to assist with mass rearing of these species. Herein we describe an adapted protocol for colony establishment of T. theobaldi, a species with a broad distribution in the Neotropics. The success of the technique and its implications are discussed.
Abstract:
Lean is not just a practice. It is a revolution in Information Technology (IT), providing greater and better use of resources and seeking to achieve lower costs than currently exist. It is much more than a list of tools and methodologies, and establishing it requires changing cultural behaviours and encouraging all organizations to think differently about the power of information versus business value. Lean is usually associated with creating value for the organization. But value is significant when it is delivered efficiently and results in the elimination of processes that consume unnecessary time, resources and space. Lean principles can help organizations improve quality, reduce costs and achieve efficiency through better productivity. Several Lean concepts can be associated with different problem-solving objectives. In particular, this work is a dissertation designed to analyse a new Lean paradigm that has emerged recently: Lean for Information Technology (Lean IT). The dissertation presents an approach to Lean IT (framework, objectives and methodology) for carrying out the work and uses a single case study, applying the 5S/6S technique (up to the third assessment level) in a Small or Medium-sized Enterprise (SME), in order to demonstrate the value added and the advantages of eliminating waste in its processes. The technique also shows the evolution of the assessment before and after its application. This single case study evaluates an IT Department (with a team of five employees and a department head) through direct observation, documentation and record files; the equipment analysed comprises computers, workstations and projects (developed code, portals and other IT services). As a guide, the methodology includes preparing the assessment together with the head of the IT Department, following the unfolding of operations, identifying the value stream for each activity, developing a communication plan and analysing each step of the processing-flow assessment. The main results are reflected in the new work tools (Microsoft SharePoint and Microsoft Project instead of Microsoft Excel), which provide remote communication and project control for all stakeholders, such as top management, partners and customers (some organizations include outsourcing in the development of specific functionalities). The results are also reflected in work quality, meeting deadlines, physical and logical security, employee motivation and customer satisfaction. The 5S/6S technique helps clarify Lean concepts, principles and feasibility, and raises interest in implementing the technique in other departments such as Finance and/or Human Resources. As a consolidation of the work, it became possible to organize the assessment so that the organization can apply for certification under ISO/IEC 25010:2011, the software quality model (software is the core business of this SME). But this will only be possible if the entire organization standardizes its processes. This case study shows that Lean concepts and the application of one or more of its techniques (in this particular case, the 5S/6S technique) help achieve better results through the management and improvement of the organization's main services.
Abstract:
Biosynthesis of active endothelin-1 (ET-1) implies enzymatic processing of the inactive precursor Big ET-1 (1-39) into the mature, 21-amino-acid peptide. The aim of this study was to characterize, in airway and alveolar epithelial cells, the enzymes responsible for this activation. BEAS-2B and A549 cells, which both produce ET-1, were studied in vitro as models for bronchiolar and alveolar cells, respectively. Both cell lines were able to convert exogenously added Big ET-1 (0.1 microM) into ET-1, suggesting cell-surface or extracellular processing. The conversion was inhibited by phosphoramidon in both cell lines with an IC50 of approximately 1 microM, but not by thiorphan, a specific inhibitor of neutral endopeptidase 24.11 (NEP). The endogenous ET-1 production of serum-stimulated BEAS-2B and A549 cells was not inhibited by thiorphan, and phosphoramidon showed inhibition only at high concentrations (>100 microM). Western blotting following electrophoresis under reducing conditions demonstrated a protein of Mr 110 corresponding to the ECE-1 monomer in both BEAS-2B and A549 cells, as well as in whole-lung extracts. By RT-PCR we detected the mRNA encoding the ECE-1b and/or ECE-1c subtype, but not ECE-1a, in both cell lines. We conclude that BEAS-2B and A549 cells are able to process either endogenous or exogenous Big ET-1 via ECE-1 and that isoforms 1b and 1c could be involved in this processing, with no significant role for NEP.
Abstract:
The sustainability of marine resources and their ecosystem requires responsible management of fisheries. Knowing the spatial distribution of fishing effort, and in particular of fishing operations, is indispensable for improving fisheries monitoring and analysing the vulnerability of species to fishing. Currently, in the Peruvian anchoveta fishery, information on effort and catches is collected through an on-board observer programme, but this covers a sample of only 2% of all fishing trips. On the other hand, the position of each vessel in the fleet is available roughly every hour (on average) through the vessel monitoring system (VMS), although these data do not indicate when or where the fishing sets occurred. Artificial neural networks (ANN) could provide a statistical method capable of inferring that information: trained on a sample for which the positions of the fishing sets are known (the 2% mentioned above), they can establish analytical relationships between the sets and certain geometric characteristics of the trajectories observed by the VMS and thus, from the latter, identify the positions of the fishing operations. Applying the neural network requires a prior analysis that examines the sensitivity of the network to variations in its parameters and training databases, and that allows us to develop criteria for defining the network structure and interpreting its results appropriately. The problem described above, applied specifically to anchoveta (Engraulis ringens), is detailed in the first chapter, while the second provides a theoretical review of neural networks. We then describe the construction and pre-processing of the database and the definition of the network structure prior to the sensitivity analysis. Next, we present the results of the analysis, in which we obtain an estimate of 100% of the fishing sets, of which approximately 80% are correctly located and 20% have a location error. Finally, we discuss the strengths and weaknesses of the technique employed, potential alternative methods, and the perspectives opened by this work.
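As a rough illustration of the approach, the sketch below derives simple geometric features (speed and turning angle) from VMS positions and feeds them to a small feed-forward network. The feature set, the network size, and the scikit-learn pipeline are assumptions for illustration only; the thesis's actual architecture and training setup are defined through the sensitivity analysis described above:

```python
# Illustrative sketch: classifying VMS positions as fishing set / not set from
# simple trajectory geometry. Features and network size are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def trajectory_features(lon, lat, t_hours):
    """Per-segment speed (approx. km/h) and absolute turning angle (rad) from a VMS track."""
    dx = np.diff(lon) * 111.0 * np.cos(np.radians(lat[:-1]))
    dy = np.diff(lat) * 111.0
    dt = np.diff(t_hours)
    speed = np.hypot(dx, dy) / np.maximum(dt, 1e-6)
    heading = np.arctan2(dy, dx)
    turn = np.abs(np.diff(heading, prepend=heading[0]))
    return np.column_stack([speed, turn])   # one row per segment end point

# a small feed-forward ANN with one hidden layer (sizes are illustrative)
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)

# usage (with hypothetical arrays built from the observed 2% of trips):
# X = trajectory_features(lon, lat, t_hours)
# model.fit(X, set_labels)            # set_labels: 1 = fishing set, 0 = steaming
# predicted = model.predict(trajectory_features(lon_new, lat_new, t_new))
```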
Abstract:
Iowa Grain Facilities Map
Abstract:
Three-dimensional analysis of the entire sequence in ski jumping is recommended when studying the kinematics or evaluating performance. Camera-based systems, which allow three-dimensional kinematic measurement, are complex to set up and require extensive post-processing, usually limiting ski jumping analyses to small numbers of jumps. In this study, a simple method using a wearable inertial-sensor-based system is described to measure the orientation of the lower-body segments (sacrum, thighs, shanks) and skis during the entire jump sequence. The new method combines the fusion of inertial signals with biomechanical constraints of ski jumping. Its performance was evaluated in terms of validity and sensitivity to different performances, based on 22 athletes monitored during daily training. The validity of the method was assessed by comparing the inclination of the ski with the slope at the landing point, yielding an error of -0.2±4.8°. Validity was also assessed by comparing characteristic angles obtained with the proposed system against reference values in the literature; the differences were smaller than 6° for 75% of the angles and smaller than 15° for 90% of the angles. The sensitivity to different performances was evaluated by comparing the angles between two groups of athletes with different jump lengths and by assessing the association between angles and jump lengths. The differences in technique observed between the athletes and the associations with jump length agreed with the literature. In conclusion, these results suggest that this system is a promising tool for generalizing three-dimensional kinematic analysis in ski jumping.
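As a hedged illustration of what fusing inertial signals for segment orientation can look like, the sketch below uses a basic complementary filter to estimate a segment's pitch (inclination) from gyroscope and accelerometer data. The filter, its gain, and the axis conventions are assumptions; they are not the fusion algorithm or the biomechanical constraints used in the study:

```python
# Minimal sketch of inertial-signal fusion for segment inclination (pitch).
# A complementary filter stands in for the paper's fusion algorithm; the gain
# alpha and the sensor axes are illustrative assumptions.
import numpy as np

def estimate_pitch(gyro_y, acc_x, acc_z, fs, alpha=0.98):
    """Fuse pitch-rate gyro (rad/s) with the accelerometer gravity direction (m/s^2)."""
    dt = 1.0 / fs
    pitch = np.zeros(len(gyro_y))
    pitch[0] = np.arctan2(acc_x[0], acc_z[0])           # initial tilt from gravity
    for k in range(1, len(gyro_y)):
        gyro_pitch = pitch[k - 1] + gyro_y[k] * dt       # integrate angular rate
        acc_pitch = np.arctan2(acc_x[k], acc_z[k])       # gravity-based tilt
        pitch[k] = alpha * gyro_pitch + (1 - alpha) * acc_pitch
    return np.degrees(pitch)

# usage with hypothetical 500 Hz shank-sensor signals:
# shank_pitch_deg = estimate_pitch(gyro_y, acc_x, acc_z, fs=500)
```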
Abstract:
The electroencephalogram (EEG) is one of the most widely used techniques for studying the brain. In this technique, the electrical signals produced in the human cortex are recorded through electrodes placed on the head. The technique, however, has some limitations when making recordings; the main one is known as artifacts, which are unwanted signals that mix with the EEG signals. The objective of this master's thesis is to present three new artifact-cleaning methods that can be applied to EEG. They are based on the application of Multivariate Empirical Mode Decomposition, a new signal-processing technique. The proposed cleaning methods are applied to simulated EEG data containing artifacts (eye blinks), and once the cleaning procedures have been applied the results are compared with blink-free EEG data to assess the improvement they provide. Subsequently, two of the three proposed cleaning methods are applied to real EEG data. The conclusion drawn from this work is that two of the proposed cleaning procedures can be used to pre-process real data to remove eye blinks.
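As a simplified illustration of the idea, the sketch below removes blink-like activity from a single EEG channel with standard (single-channel) EMD from the PyEMD package, discarding the slowest intrinsic mode functions before reconstruction. Both the single-channel simplification and the discard-the-slowest-IMFs criterion are assumptions; the thesis's methods rely on Multivariate EMD applied across channels:

```python
# Simplified sketch of EMD-based blink removal on a single EEG channel.
# Single-channel EMD (PyEMD) stands in for the Multivariate EMD used in the
# thesis; discarding the slowest IMFs is an illustrative blink criterion.
import numpy as np
from PyEMD import EMD

def remove_blinks(eeg_channel, n_slow_imfs=2):
    """Decompose the channel into IMFs and rebuild it without the slowest ones,
    where large, low-frequency blink deflections tend to concentrate."""
    emd = EMD()
    imfs = emd.emd(eeg_channel)                 # shape: (n_imfs, n_samples)
    keep = imfs[:-n_slow_imfs] if len(imfs) > n_slow_imfs else imfs
    return keep.sum(axis=0)

# usage with a hypothetical frontal-channel signal:
# cleaned = remove_blinks(fp1_signal)
```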