983 results for Scanline sampling technique


Relevance: 20.00%

Abstract:

Toxorhynchites mosquitoes play important ecological roles in aquatic microenvironments and are frequently investigated as potential biological control agents of mosquito disease vectors. Establishing Toxorhynchites laboratory colonies can be challenging because, for some species, mating and insemination either do not occur or require a prohibitive amount of laboratory space to succeed. Consequently, artificial insemination techniques have been developed to assist with mass rearing of these species. Herein we describe an adapted protocol for colony establishment of T. theobaldi, a species with broad distribution in the Neotropics. The success of the technique and its implications are discussed.

Relevance: 20.00%

Abstract:

Lean is not just a practice. It is a revolution in Information Technology (IT), enabling greater and better use of resources and pursuing costs lower than those that currently exist. It is much more than a list of tools and methodologies; establishing it requires changing cultural behaviours and encouraging all organizations to think differently about the power of information versus the value of the business. Lean is usually associated with creating value for the organization, but value is significant when it is delivered efficiently and results in the elimination of processes that consume unnecessary time, resources and space. Lean principles can help organizations improve quality, reduce costs and achieve efficiency through better productivity. Several Lean concepts can be associated with different problem-solving goals. In particular, this work is a dissertation designed to analyse a new paradigm on Lean that has emerged recently: Lean for Information Technology (Lean IT). The dissertation presents an approach to Lean IT (framework, objectives and methodology) and uses a single case study, applying the 5S/6S technique (up to the third assessment level) in a Small and Medium-sized Enterprise (SME), in order to demonstrate the value added and the advantages of eliminating waste in its processes. The technique also shows the evolution of the assessment before and after its application. This single case study evaluates an IT Department (a team of five employees and a department head) through direct observation, documentation and record files; the equipment analysed comprises computers, workstations and projects (developed code, portals and other IT services).
As a guide, the methodology includes preparing the assessment together with the head of the IT Department, carrying out the operations, identifying the value stream for each activity, developing a communication plan and analysing each step of the process-flow assessment. The main results are reflected in the new work tools (Microsoft SharePoint and Microsoft Project instead of Microsoft Excel), which provide remote communication and project control for all stakeholders, such as top management, partners and customers (some organizations include outsourcing in the development of specific features). The results are also reflected in the quality of work, compliance with deadlines, physical and logical security, employee motivation and customer satisfaction. The 5S/6S technique helps clarify Lean concepts and principles, demonstrates their feasibility and raises interest in implementing the technique in other departments, such as Finance or Human Resources. As a consolidation of the work, it became possible to organize the assessment so that the organization can apply for certification under ISO/IEC 25010:2011, the software quality model (software is the core business of this SME). However, this will only be possible if the whole organization standardizes its processes. This case study shows that Lean concepts and the application of one or more of their techniques (in this particular case, 5S/6S) help achieve better results through the management and improvement of the organization's main services.

Relevance: 20.00%

Abstract:

The precise sampling of soil, biological or microclimatic attributes in tropical forests, which are characterized by a high diversity of species and complex spatial variability, is a difficult task. We found few basic studies to guide sampling procedures. The objective of this study was to define a sampling strategy and data analysis for some parameters frequently used in nutrient cycling studies, i.e., litter amount, total nutrient amounts in litter and its composition (Ca, Mg, K, N and P), and soil attributes at three depths (organic matter, P content, cation exchange capacity and base saturation). A natural remnant forest in the west of São Paulo State (Brazil) was selected as the study area and samples were collected in July 1989. The total amount of litter and its total nutrient amounts had a high spatially independent variance. Conversely, the variance of litter composition was lower and the spatial dependency was peculiar to each nutrient. The sampling strategy for the estimation of litter amounts and the amount of nutrients in litter should be different from the sampling strategy for nutrient composition. For the estimation of litter amounts and the amounts of nutrients in litter (related to quantity), a large number of randomly distributed determinations is needed. Conversely, for the estimation of litter nutrient composition (related to quality), a smaller number of spatially located samples should be analyzed. The appropriate sampling of soil attributes differed with depth. Overall, surface samples (0-5 cm) showed high short-distance spatially dependent variance, whereas subsurface samples exhibited spatial dependency at longer distances. Short transects with a sampling interval of 5-10 m are recommended for surface sampling. Subsurface samples must also be spatially located, but with transects or grids with longer distances between sampling points over the entire area.
Composite soil samples would not provide a complete understanding of the relation between soil properties and surface dynamic processes or landscape aspects. The precise distribution of P was difficult to estimate.
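The transect-spacing recommendations above follow from how semivariance grows with lag distance. As an illustrative sketch (not the study's own code), an empirical semivariogram for measurements taken at regular intervals along a transect can be computed as:

```python
def semivariogram(values, spacing, max_lag):
    """Empirical semivariogram gamma(h) for measurements taken at
    equally spaced points along a transect (spacing in metres).

    Returns {lag distance: semivariance} for lags of 1..max_lag steps.
    """
    gamma = {}
    n = len(values)
    for h in range(1, max_lag + 1):
        # Average squared difference over all pairs separated by h steps.
        sq_diffs = [(values[i] - values[i + h]) ** 2 for i in range(n - h)]
        gamma[h * spacing] = sum(sq_diffs) / (2 * len(sq_diffs))
    return gamma
```

A variable whose semivariance rises steeply over the first few lags is spatially dependent at short range, which is the pattern that motivates dense 5-10 m surface transects; a flat semivariogram indicates spatially independent variance, calling for many randomly placed samples instead.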

Relevance: 20.00%

Abstract:

Designing an efficient sampling strategy is of crucial importance for habitat suitability modelling. This paper compares four such strategies, namely 'random', 'regular', 'proportional-stratified' and 'equal-stratified', to investigate (1) how they affect prediction accuracy and (2) how sensitive they are to sample size. To compare them, a virtual species approach (Ecol. Model. 145 (2001) 111) in a real landscape, based on reliable data, was chosen. The distribution of the virtual species was sampled 300 times using each of the four strategies at four sample sizes. The sampled data were then fed into a GLM to make two types of prediction: (1) habitat suitability and (2) presence/absence. Comparing the predictions to the known distribution of the virtual species allows model accuracy to be assessed. Habitat suitability predictions were assessed by Pearson's correlation coefficient and presence/absence predictions by Cohen's K agreement coefficient. The results show the 'regular' and 'equal-stratified' sampling strategies to be the most accurate and most robust. We propose the following characteristics to improve sample design: (1) increase sample size, (2) prefer systematic to random sampling and (3) include environmental information in the design.
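Presence/absence predictions in the study are scored with Cohen's K agreement coefficient. A minimal self-contained implementation for 0/1 vectors (an assumed form for illustration; the paper does not publish code) looks like:

```python
def cohens_kappa(observed, predicted):
    """Cohen's K agreement coefficient for 0/1 presence/absence vectors."""
    n = len(observed)
    # Observed proportion of agreement.
    po = sum(o == p for o, p in zip(observed, predicted)) / n
    # Chance agreement expected from the marginal presence frequencies.
    p_obs = sum(observed) / n
    p_pred = sum(predicted) / n
    pe = p_obs * p_pred + (1 - p_obs) * (1 - p_pred)
    return (po - pe) / (1 - pe)
```

K equals 1 for perfect agreement and 0 when agreement is no better than chance, which is why it is preferred over raw accuracy when presences are rare.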

Relevance: 20.00%

Abstract:

PURPOSE: To investigate the ability of inversion recovery ON-resonant water suppression (IRON) in conjunction with P904 (superparamagnetic nanoparticles consisting of a maghemite core coated with a low-molecular-weight amino-alcohol derivative of glucose) to perform steady-state equilibrium-phase MR angiography (MRA) over a wide dose range. MATERIALS AND METHODS: Experiments were approved by the institutional animal care committee. Rabbits (n = 12) were imaged at baseline and serially after the administration of 10 incremental dosages of 0.57-5.7 mgFe/kg P904. Conventional T1-weighted and IRON MRA were obtained on a clinical 1.5 Tesla (T) scanner to image the thoracic and abdominal aorta and peripheral vessels. Contrast-to-noise ratios (CNR) and vessel sharpness were quantified. RESULTS: Using IRON MRA, CNR and vessel sharpness progressively increased with incremental dosages of the contrast agent P904, exhibiting consistently higher contrast values than T1-weighted MRA over a very wide range of contrast agent doses (CNR of 18.8 ± 5.6 for IRON versus 11.1 ± 2.8 for T1-weighted MRA at 1.71 mgFe/kg, P = 0.02, and 19.8 ± 5.9 for IRON versus -0.8 ± 1.4 for T1-weighted MRA at 3.99 mgFe/kg, P = 0.0002). Similar results were obtained for vessel sharpness in peripheral vessels (vessel sharpness of 46.76 ± 6.48% for IRON versus 33.20 ± 3.53% for T1-weighted MRA at 1.71 mgFe/kg, P = 0.002, and of 48.66 ± 5.50% for IRON versus 19.00 ± 7.41% for T1-weighted MRA at 3.99 mgFe/kg, P = 0.003). CONCLUSION: Our study suggests that quantitative CNR and vessel sharpness after the injection of P904 are consistently higher for IRON MRA than for conventional T1-weighted MRA. These findings apply over a wide range of contrast agent dosages.
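CNR in MRA is conventionally the difference of mean signals in a vessel and a background region of interest, divided by the standard deviation of a signal-free noise region. A hedged sketch of that computation (the study's exact ROI definitions are not reproduced here):

```python
from statistics import mean, stdev

def cnr(vessel_roi, background_roi, noise_roi):
    """Contrast-to-noise ratio: difference of mean signal intensity in a
    vessel ROI and a background ROI, divided by the sample standard
    deviation of pixel values in a signal-free noise ROI."""
    return (mean(vessel_roi) - mean(background_roi)) / stdev(noise_roi)
```

A negative CNR, as reported for T1-weighted MRA at the highest doses, simply means the vessel lumen became darker than the surrounding background.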

Relevance: 20.00%

Abstract:

Weather radar observations are currently the most reliable method for remote sensing of precipitation. However, a number of factors affect the quality of radar observations and may seriously limit automated quantitative applications of radar precipitation estimates, such as those required in Numerical Weather Prediction (NWP) data assimilation or in hydrological models. In this paper, a technique to correct two different problems typically present in radar data is presented and evaluated. The aspects dealt with are non-precipitating echoes, caused either by permanent ground clutter or by anomalous propagation of the radar beam (anaprop echoes), and topographical beam blockage. The correction technique is based on the computation of realistic beam propagation trajectories from recent radiosonde observations instead of assuming standard radio propagation conditions. The correction consists of three steps: 1) calculation of a Dynamic Elevation Map, which provides the minimum clutter-free antenna elevation for each pixel within the radar coverage; 2) correction for residual anaprop, checking the vertical reflectivity gradients within the radar volume; and 3) topographical beam blockage estimation and correction using a geometric optics approach. The technique is evaluated with four case studies in the region of the Po Valley (N Italy) using a C-band Doppler radar and a network of raingauges providing hourly precipitation measurements. The case studies cover different seasons, different radio propagation conditions, and both stratiform and convective precipitation events. After applying the proposed correction, a comparison of the radar precipitation estimates with raingauges indicates a general reduction in both the root mean squared error and the fractional error variance, indicating the efficiency and robustness of the procedure.
Moreover, the technique presented is not computationally expensive, so it seems well suited to implementation in an operational environment.
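The paper replaces standard-refraction beam trajectories with radiosonde-based ones; for reference, the standard baseline it improves upon is the textbook 4/3 effective earth radius model, sketched below (a generic formula, not the authors' code):

```python
import math

EARTH_RADIUS_KM = 6371.0
KE = 4.0 / 3.0  # effective earth radius factor under standard refraction

def beam_height_km(range_km, elev_deg, antenna_alt_km=0.0):
    """Height of the radar beam centre above the antenna datum at a given
    slant range and elevation angle, assuming the standard 4/3 effective
    earth radius propagation model."""
    re = KE * EARTH_RADIUS_KM
    theta = math.radians(elev_deg)
    return (math.sqrt(range_km ** 2 + re ** 2
                      + 2.0 * range_km * re * math.sin(theta))
            - re + antenna_alt_km)
```

Anaprop occurs precisely when the real refractivity profile departs from this standard model, bending the beam toward the ground; hence the value of recomputing trajectories from radiosonde data.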

Relevance: 20.00%

Abstract:

The aim of this paper is to quantitatively characterize the climatology of daily precipitation indices in Catalonia (northeastern Iberian Peninsula) from 1951 to 2003. This work analyses a subset of the ETCCDI (Expert Team on Climate Change Detection and Indices) precipitation indices calculated from a new interpolated dataset of daily precipitation, namely SPAIN02, regular at 0.2° horizontal resolution (around 20 km), and from two high-quality stations: the Ebro and Fabra observatories. Using a jack-knife technique, we have found that the sampling error of the SPAIN02 regional average is relatively low. The trend analysis was implemented using a Circular Block Bootstrap procedure applicable to non-normal distributions and autocorrelated series. A running trend analysis was applied to analyse trend persistence. No general trends at a regional scale are observed in the annual or seasonal regional averaged series of any of the indices for any of the time windows considered. Only the consecutive dry days index (CDD) at the annual scale shows a locally coherent spatial trend pattern; around 30% of the Catalonia area has experienced an increase of around 2-3 days per decade. The Ebro and Fabra observatories show a similar CDD trend, mainly due to the summer contribution. Besides this, a significant decrease in total precipitation (around -10 mm per decade) and in the index "highest precipitation amount in a five-day period" (RX5DAY, around -5 mm per decade) has been found in summer for the Ebro observatory.
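The CDD index used above is, per the ETCCDI definitions, the longest run of days with daily precipitation below the 1 mm wet-day threshold. A minimal sketch of its computation from a daily series:

```python
def consecutive_dry_days(daily_precip_mm, wet_threshold_mm=1.0):
    """ETCCDI CDD index: length of the longest run of consecutive days
    with daily precipitation below the wet-day threshold (1 mm)."""
    longest = run = 0
    for p in daily_precip_mm:
        if p < wet_threshold_mm:
            run += 1
            longest = max(longest, run)
        else:
            run = 0  # a wet day breaks the dry spell
    return longest
```

An increase of 2-3 days per decade in this index therefore reflects lengthening dry spells rather than any change in total rainfall.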

Relevance: 20.00%

Abstract:

To test whether quantitative traits are under directional or homogenizing selection, it is common practice to compare population differentiation estimates at molecular markers (F(ST)) and quantitative traits (Q(ST)). If the trait is neutral and its determinism is additive, then theory predicts that Q(ST) = F(ST), while Q(ST) > F(ST) is predicted under directional selection for different local optima, and Q(ST) < F(ST) is predicted under homogenizing selection. However, nonadditive effects can alter these predictions. Here, we investigate the influence of dominance on the relation between Q(ST) and F(ST) for neutral traits. Using analytical results and computer simulations, we show that dominance generally deflates Q(ST) relative to F(ST). Under inbreeding, the effect of dominance vanishes, and we show that for selfing species, a better estimate of Q(ST) is obtained from selfed families than from half-sib families. We also compare several sampling designs and find that it is always best to sample many populations (>20) with few families (five) rather than few populations with many families. Provided that estimates of Q(ST) are derived from individuals originating from many populations, we conclude that the pattern Q(ST) > F(ST), and hence the inference of directional selection for different local optima, is robust to the effect of nonadditive gene actions.
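The Q(ST)/F(ST) comparison rests on estimating Q(ST) from variance components of the trait. Under neutrality and additivity the standard expression for outcrossers is Vb / (Vb + 2Vw); the factor of 1 shown for complete selfing is a commonly used simplification consistent with the selfed-family result above, included here as an assumption for illustration:

```python
def qst(var_between, var_within, selfing=False):
    """Q_ST from additive variance components: Vb / (Vb + 2*Vw) for
    outcrossing populations; under complete selfing the within-population
    factor drops from 2 to 1."""
    k = 1.0 if selfing else 2.0
    return var_between / (var_between + k * var_within)
```

The factor of 2 reflects that, with random mating, only half the additive variance segregates between populations at drift equilibrium; dominance, as the abstract shows, deflates this estimate further.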

Relevance: 20.00%

Abstract:

In arson cases, the collection and detection of traces of ignitable liquids on a suspect's hands can provide information to a forensic investigation. Police forces currently lack a simple, robust, efficient and reliable solution to perform this type of swabbing. In this article, we describe a study undertaken to develop a procedure for the collection of ignitable liquid residues on the hands of arson suspects. Sixteen different collection supports were considered, and their applicability to the collection of gasoline traces present on hands and their subsequent analysis in a laboratory was evaluated. Background contamination, consisting of volatiles emanating from the collection supports, and the collection efficiencies of the different sampling materials were assessed by passive headspace extraction with an activated charcoal strip (DFLEX device) followed by gas chromatography-mass spectrometry (GC-MS) analysis. After statistical treatment of the results, non-powdered latex gloves were retained as the most suitable sampling method. On the basis of the obtained results, a prototype sampling kit was designed and tested. This kit is made of a three-compartment multilayer bag enclosed in a sealed metal can and containing three pairs of non-powdered latex gloves: one to be worn by the sampler, one serving as a blank sample and the last one to be worn by the person suspected of having been in contact with ignitable liquids. The design of the kit was developed to be effective in preventing external contamination and cross-contamination.

Relevance: 20.00%

Abstract:

Quantitative assessment of the hazards of and exposures to nanomaterials faces many uncertainties that will only be resolved as scientific knowledge of their properties advances. One consequence of these uncertainties is that the occupational exposure limit values currently defined for dusts are not necessarily relevant to nanomaterials. In the absence of a quantitative reference framework, and at the request of the DGS to inform the work of AFNOR and ISO on the subject, a graduated risk-management approach (control banding) was developed within Anses. This development was carried out with a group of expert rapporteurs attached to the specialised expert committee on the assessment of risks related to physical agents, new technologies and major infrastructure. The proposed graduated risk-management approach comprises four main steps: 1. Information gathering. This step consists of collecting the available information on the hazards of the manufactured nanomaterial under consideration, as well as on the potential exposure of people at their workstations (field observation, measurements, etc.). 2. Assignment of a hazard band. The potential hazard of the manufactured nanomaterial present, whether raw or incorporated into a (liquid or solid) matrix, is assessed at this step. The assigned hazard band takes into account the hazard of the bulk product or its non-nanoscale analogue, the biopersistence of the material (for fibrous materials), its solubility and its possible reactivity. 3. Assignment of an exposure band. The exposure band of the manufactured nanomaterial under consideration, or of a product containing it, is defined by the product's emission potential. It takes into account the product's physical form (solid, liquid, powder, aerosol), its dustiness and its volatility.
The number of workers, the frequency and duration of exposure and the quantity handled are not taken into account, unlike in a conventional chemical risk assessment. 4. Derivation of a risk-control band. Crossing the previously assigned hazard and exposure bands defines the risk-control level. This level maps to the technical and organisational measures to be implemented to keep the risk as low as possible. An action plan is then defined to ensure the effectiveness of the prevention recommended by the determined control level; it takes existing prevention measures into account and strengthens them where necessary. If the measures indicated by the risk-control level are not feasible, for example for technical or budgetary reasons, an in-depth risk assessment must be carried out by an expert. Graduated risk management is an alternative method for performing a qualitative risk assessment and implementing prevention measures without resorting to a quantitative risk assessment. Its use seems particularly well suited to the context of manufactured nanomaterials, for which the choice of reference values (occupational exposure limit values) and of appropriate measurement techniques suffers from great uncertainty. The proposed approach relies on simple criteria that are available in the scientific literature or via the technical data for the products used. Nevertheless, its implementation requires minimum competence in chemical risk prevention (chemistry, toxicology, etc.), nanoscience and nanotechnology.
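Step 4 crosses the hazard band with the exposure band to obtain a risk-control level. The additive rule below is purely illustrative of how such a lookup works; the actual correspondence matrix must be taken from the published Anses guidance:

```python
def control_band(hazard_band, exposure_band, max_level=5):
    """Cross a hazard band with an exposure band to obtain a risk-control
    level. Illustrative additive rule only -- NOT the Anses matrix:
    higher hazard or higher exposure yields a stricter control level,
    capped at max_level."""
    return min(hazard_band + exposure_band - 1, max_level)
```

Whatever the exact matrix, the key property is monotonicity: raising either band never lowers the required control level, which is what lets the method recommend prevention measures without a quantitative exposure estimate.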