Abstract:
In this work, an LED (light-emitting diode) based photometer for solid-phase photometry is described. The photometer was designed to permit direct coupling of a light source (LED) and a photodiode to a flow cell with an optical pathlength of 4 mm. The flow cell was filled with adsorbing solid-phase material (C-18), which was used to immobilize the chromogenic reagent 1-(2-thiazolylazo)-2-naphthol (TAN). To allow accuracy assessment, samples were also analyzed by ICP OES (inductively coupled plasma optical emission spectrometry). Applying the paired t-test at the 95% confidence level, no significant difference was observed. Other useful features were also achieved: a linear response ranging from 0.05 to 0.85 mg L-1 Zn, a limit of detection of 9 µg L-1 Zn (3σ criterion), a standard deviation of 1.4% (n = 10), a sampling throughput of 36 determinations per hour, and waste generation and reagent consumption of 1.7 mL and 0.03 µg per determination, respectively.
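The accuracy assessment described above can be reproduced in a few lines. A minimal sketch of the paired t-test at the 95% confidence level, using hypothetical paired Zn determinations (the values are illustrative, not the paper's data):

```python
import math

# Hypothetical Zn concentrations (mg L-1) for the same samples measured
# by the LED photometer and by ICP OES (illustrative values only).
photometer = [0.12, 0.34, 0.55, 0.28, 0.71, 0.46]
icp_oes    = [0.11, 0.36, 0.53, 0.29, 0.73, 0.44]

# Paired t-test: H0 says the mean of the per-sample differences is zero.
d = [a - b for a, b in zip(photometer, icp_oes)]
n = len(d)
mean_d = sum(d) / n
sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))
t_stat = mean_d / (sd_d / math.sqrt(n))

# Compare |t| with the two-sided critical value t(0.975, n-1 = 5) = 2.571;
# |t| below the threshold means no significant difference at the 95% level.
significant = abs(t_stat) > 2.571
```

With these illustrative numbers the differences cancel almost exactly, so the test does not reject H0, mirroring the agreement reported between the two methods.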
Abstract:
Dimensionality reduction is employed for visual data analysis as a way of obtaining reduced spaces for high-dimensional data or of mapping data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach to handling both dimensionality reduction and visualization of high-dimensional data that takes the user's input into account. It employs Partial Least Squares (PLS), a statistical tool for retrieving latent spaces that focus on the discriminability of the data. The method employs a training set to build a highly precise model that can then be applied to a much larger data set very effectively. The reduced data set can be exhibited using various existing visualization techniques. The training data is important for encoding the user's knowledge into the loop. However, this work also devises a strategy for calculating PLS reduced spaces when no training data is available. The approach produces increasingly precise visual mappings as the user feeds back his or her knowledge, and it is capable of working with small and unbalanced training sets.
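To illustrate the core idea, the leading PLS components maximize covariance between features and labels, yielding a label-aware 2D embedding that any scatter-plot visualization can consume. A minimal NIPALS-style PLS1 sketch on synthetic labeled data (not the authors' implementation; all names and values are illustrative):

```python
import numpy as np

def pls_scores(X, y, n_components=2):
    """Project rows of X onto PLS latent components that maximize
    covariance with the labels y (NIPALS-style PLS1 sketch)."""
    X = X - X.mean(axis=0)              # center features
    y = y - y.mean()                    # center labels
    T = []
    for _ in range(n_components):
        w = X.T @ y
        w = w / np.linalg.norm(w)       # weight vector
        t = X @ w                       # scores (latent coordinates)
        p = X.T @ t / (t @ t)           # loadings
        X = X - np.outer(t, p)          # deflate X
        y = y - t * (t @ y) / (t @ t)   # deflate y
        T.append(t)
    return np.column_stack(T)

# Two noisy classes in 10 dimensions; the 2-D scores separate them.
rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0, (20, 10)) + 2.0
B = rng.normal(0.0, 1.0, (20, 10)) - 2.0
X = np.vstack([A, B])
y = np.array([1.0] * 20 + [-1.0] * 20)
scores = pls_scores(X, y)
print(scores.shape)  # (40, 2)
```

The first component aligns with the direction that best discriminates the two classes, which is exactly the property the paper exploits for visual mappings.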
Abstract:
Organic hydroperoxides are oxidants generated during bacterial-host interactions. Here, we demonstrate that the peroxidase OhrA and its negative regulator OhrR comprise a major pathway for sensing and detoxifying organic hydroperoxides in the opportunistic pathogen Chromobacterium violaceum. Initially, we found that an ohrA mutant was hypersensitive to organic hydroperoxides and that it displayed a low efficiency for decomposing these molecules. Expression of ohrA and ohrR was specifically induced by organic hydroperoxides. These genes were expressed as monocistronic transcripts and also as a bicistronic ohrR-ohrA mRNA, generating the abundantly detected ohrA mRNA and the barely detected ohrR transcript. The bicistronic transcript appears to be processed. OhrR repressed both the ohrA and ohrR genes by binding directly to inverted repeat sequences within their promoters in a redox-dependent manner. Site-directed mutagenesis of each of the four OhrR cysteine residues indicated that the conserved Cys21 is critical to organic hydroperoxide sensing, whereas Cys126 is required for disulfide bond formation. Taken together, these phenotypic, genetic and biochemical data indicate that the response of C. violaceum to organic hydroperoxides is mediated by OhrA and OhrR. Finally, we demonstrated that oxidized OhrR, inactivated by intermolecular disulfide bond formation, is specifically regenerated via thiol-disulfide exchange by thioredoxin (but not other thiol reducing agents such as glutaredoxin, glutathione and lipoamide), providing a physiological reducing system for this thiol-based redox switch.
Abstract:
Background: Identification of nontuberculous mycobacteria (NTM) based on phenotypic tests is time-consuming, labor-intensive, expensive and often provides erroneous or inconclusive results. In the molecular method referred to as PRA-hsp65, a fragment of the hsp65 gene is amplified by PCR and then analyzed by restriction digest; this rapid approach offers the promise of accurate, cost-effective species identification. The aim of this study was to determine whether species identification of NTM using PRA-hsp65 is sufficiently reliable to serve as the routine methodology in a reference laboratory. Results: A total of 434 NTM isolates were obtained from 5019 cultures submitted to the Instituto Adolfo Lutz, São Paulo, Brazil, between January 2000 and January 2001. Species identification was performed for all isolates using conventional phenotypic methods and PRA-hsp65. For isolates for which these methods gave discordant results, definitive species identification was obtained by sequencing a 441 bp fragment of hsp65. Phenotypic evaluation and PRA-hsp65 were concordant for 321 (74%) isolates; these assignments were presumed to be correct. For the remaining 113 discordant isolates, definitive identification was based on sequencing the 441 bp hsp65 fragment. PRA-hsp65 identified 30 isolates with hsp65 alleles representing 13 previously unreported PRA-hsp65 patterns. Overall, species identification by PRA-hsp65 was significantly more accurate than by phenotypic methods (392 (90.3%) vs. 338 (77.9%), respectively; p < .0001, Fisher's test). Among the 333 isolates representing the most common pathogenic species, PRA-hsp65 provided an incorrect result for only 1.2%. Conclusion: PRA-hsp65 is a rapid and highly reliable method and deserves consideration by any clinical microbiology laboratory charged with performing species identification of NTM.
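The reported accuracy comparison (392/434 vs. 338/434 correct) can be checked with a two-proportion test. The sketch below uses a normal approximation rather than the exact Fisher test the authors applied, but it reaches the same conclusion:

```python
import math

n = 434
correct_pra = 392   # correct identifications by PRA-hsp65 (90.3%)
correct_phe = 338   # correct identifications by phenotypic methods (77.9%)

p1, p2 = correct_pra / n, correct_phe / n
p_pool = (correct_pra + correct_phe) / (2 * n)        # pooled proportion
se = math.sqrt(p_pool * (1 - p_pool) * (2 / n))       # pooled standard error
z = (p1 - p2) / se

# Two-sided p-value from the normal approximation (via erfc).
p_value = math.erfc(abs(z) / math.sqrt(2))
```

Here z is around 5, giving a p-value far below 0.0001, consistent with the abstract's Fisher test result.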
Abstract:
Background: Papaya (Carica papaya L.) is a commercially important crop that produces climacteric fruits with a soft and sweet pulp that contain a wide range of health-promoting phytochemicals. Despite its importance, little is known about transcriptional modifications during papaya fruit ripening and their control. In this study we report the analysis of the ripe papaya transcriptome using a cross-species (XSpecies) microarray technique based on the phylogenetic proximity between papaya and Arabidopsis thaliana. Results: Papaya transcriptome analyses resulted in the identification of 414 ripening-related genes, some of which had their expression validated by qPCR. The transcription profile was compared with those of ripening tomato and grape. There were many similarities between papaya and tomato, especially with respect to the expression of genes encoding proteins involved in primary metabolism, regulation of transcription, biotic and abiotic stress, and cell wall metabolism. XSpecies microarray data indicated that transcription factors (TFs) of the MADS-box, NAC and AP2/ERF gene families were involved in the control of papaya ripening, and revealed that cell wall-related gene expression in papaya had similarities to the expression profiles seen in Arabidopsis during hypocotyl development. Conclusion: The cross-species array experiment identified a ripening-related set of genes in papaya, allowing the comparison of transcription control between papaya and other fruit-bearing taxa during the ripening process.
Abstract:
Rare variants are becoming the new candidates in the search for genetic variants that predispose individuals to a phenotype of interest. Their low prevalence in a population requires the development of dedicated detection and analytical methods. A family-based approach could greatly enhance their detection and interpretation because rare variants are nearly family specific. In this report, we test several distinct approaches for analyzing the information provided by rare and common variants and for using them effectively to pinpoint putative candidate genes for follow-up studies. The analyses were performed on the mini-exome data set provided by Genetic Analysis Workshop 17. Eight approaches were tested, four using the trait's heritability estimates and four using QTDT models. The sensitivity, specificity, and positive and negative predictive values of these methods were compared in light of the simulation parameters. Our results highlight important limitations of current methods for dealing with rare and common variants: all methods presented reduced specificity and, consequently, were prone to false-positive associations. Methods analyzing common-variant information showed enhanced sensitivity compared to rare-variant methods. Furthermore, our limited knowledge of the use of biological databases for gene annotations, possibly for use as covariates in regression models, imposes a barrier to further research.
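The four screening metrics compared above are simple functions of a confusion matrix. A sketch with hypothetical counts (illustrating how a large number of false positives depresses specificity and positive predictive value):

```python
def screening_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, positive and negative predictive values
    from a confusion matrix of candidate-gene calls."""
    return {
        "sensitivity": tp / (tp + fn),   # true genes correctly flagged
        "specificity": tn / (tn + fp),   # non-genes correctly passed over
        "ppv": tp / (tp + fp),           # flagged genes that are true
        "npv": tn / (tn + fn),           # passed-over genes that are non-genes
    }

# Hypothetical counts for one gene-ranking method on simulated data:
m = screening_metrics(tp=12, fp=40, tn=140, fn=8)
```

With these illustrative counts, many false positives drive the PPV well below the sensitivity, which is the pattern of false-positive-prone behavior the abstract reports.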
Abstract:
In this paper, a procedure for the on-line process control of variables is proposed. The procedure consists of inspecting the m-th item of every m items produced and deciding, at each inspection, whether the process is out of control. Two sets of limits, warning (µ0 ± W) and control (µ0 ± C), are used. If the value of the monitored statistic falls beyond the control limits, or if a sequence of h observations falls between the warning limits and the control limits, production is stopped for adjustment; otherwise, production goes on. The properties of an ergodic Markov chain are used to obtain an expression for the average cost per item. The parameters (the sampling interval m; the widths of the warning and control limits, W and C, with W < C; and the sequence length h) are optimized by minimizing the cost function. A numerical example illustrates the proposed procedure.
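The long-run average cost per item follows from the stationary distribution of the ergodic chain. A minimal sketch with a hypothetical 3-state chain and hypothetical per-state costs (none of these numbers come from the paper):

```python
import numpy as np

# Illustrative 3-state ergodic chain (e.g., in-control, warning run,
# stopped for adjustment); transition probabilities are hypothetical.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.60, 0.30, 0.10],
    [1.00, 0.00, 0.00],
])
cost = np.array([0.5, 0.5, 25.0])   # hypothetical cost per item in each state

# The stationary distribution pi solves pi P = pi, with pi summing to 1;
# it is the eigenvector of P^T for the eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

average_cost = float(pi @ cost)     # long-run average cost per item
```

Minimizing this quantity over the design parameters (m, W, C, h) is the optimization the paper performs; here the chain is fixed purely for illustration.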
Abstract:
In this work, an LED (light-emitting diode) based photometer for solid-phase photometry is proposed. The photometer was designed to allow the radiation source (LED) and the photodetector to be coupled directly to the flow cell, with an optical pathlength of 4 mm. The flow cell was filled with solid material (C18), which was used to immobilize the chromogenic reagent 1-(2-thiazolylazo)-2-naphthol (TAN). Accuracy was assessed against data obtained by ICP OES (inductively coupled plasma optical emission spectrometry). Applying the paired t-test, no significant difference was observed at the 95% confidence level. Other important parameters found were a linear response range of 0.05 to 0.85 mg L-1 Zn, a limit of detection of 9 µg L-1 Zn (n = 3), a standard deviation of 1.4% (n = 10), a sampling frequency of 36 determinations per hour, and effluent generation and reagent consumption of 1.7 mL and 0.03 µg per determination, respectively.
Abstract:
OBJECTIVE: The frequent occurrence of inconclusive serology in blood banks and the absence of a gold-standard test for Chagas disease led us to examine the efficacy of the blood culture test and five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, as well as to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive serology screening with respect to some epidemiological variables. METHODS: To obtain the estimates of interest, we considered a Bayesian latent class model with inclusion of covariates through the logit link. RESULTS: A better performance was observed with some categories of epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) proved to be good alternatives both for screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology suggests that most of these donors probably have non-reactive serology. In addition, depending on the level of specific epidemiological variables, the absence of infection can be predicted with a probability of 100% in this group from the pairs of tests using parallel testing. CONCLUSION: The epidemiological variables can improve test results and thus help clarify inconclusive serology screening results. Moreover, all combinations of pairs of the five commercial tests are good alternatives for confirming results.
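Under a conditional-independence assumption, the combined sensitivity and specificity of a pair of tests in parallel (positive if either test is positive) and in serial (positive only if both are positive) follow from elementary probability. A sketch with hypothetical per-test values (not the paper's estimates):

```python
def parallel(se1, sp1, se2, sp2):
    """Parallel scheme: declare positive if EITHER test is positive."""
    se = 1 - (1 - se1) * (1 - se2)   # sensitivity increases
    sp = sp1 * sp2                   # specificity decreases
    return se, sp

def serial(se1, sp1, se2, sp2):
    """Serial scheme: declare positive only if BOTH tests are positive."""
    se = se1 * se2                   # sensitivity decreases
    sp = 1 - (1 - sp1) * (1 - sp2)   # specificity increases
    return se, sp

# Hypothetical per-test sensitivities and specificities:
se_par, sp_par = parallel(0.995, 0.990, 0.992, 0.985)
se_ser, sp_ser = serial(0.995, 0.990, 0.992, 0.985)
```

This is why the abstract pairs parallel testing with screening (very high combined sensitivity) and serial testing with confirmation (very high combined specificity).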
Abstract:
Oil spills are potential threats to the integrity of highly productive coastal wetlands, such as mangrove forests. In October 1983, a mangrove area of nearly 300 ha located on the southeastern coast of Brazil was impacted by a 3.5 million liter crude oil spill released by a broken pipeline. In order to assess the long-term effects of oil pollution on mangrove vegetation, we carried out a GIS-based multitemporal analysis of aerial photographs of the years 1962, 1994, 2000 and 2003. Photointerpretation, visual classification, class quantification, ground-truth and vegetation structure data were combined to evaluate the oil impact. Before the spill, the mangroves exhibited a homogeneous canopy and well-developed stands. More than ten years after the spill, the mangrove vegetation exhibited three distinct zones reflecting the long-term effects of the oil pollution. The most impacted zone (10.5 ha) presented dead trees, exposed substrate and recovering stands with reduced structural development. We suggest that the distinct impact and recovery zones reflect the spatial variability of oil removal rates in the mangrove forest. This study identifies the multitemporal analysis of aerial photographs as a useful tool for assessing a system's capacity for recovery and monitoring the long-term residual effects of pollutants on vegetation dynamics, thus giving support to mangrove forest management and conservation.
Abstract:
Reinforced concrete beam elements are subjected, over their life cycle, to loads that cause shear and torsion. These elements may be subjected to shear alone, pure torsion, or combined torsion and shear. The Brazilian standard ABNT NBR 6118:2007 [1] establishes the conditions for calculating the transverse reinforcement area in reinforced concrete beam elements, using two design models based on the strut-and-tie analogy first studied by Mörsch [2]. The strut angle θ (theta) can be considered constant and equal to 45° (Model I), or varying between 30° and 45° (Model II). For transverse ties (stirrups), the angle α (alpha) varies between 45° and 90°. When equilibrium torsion is required, a resistance model based on a space truss with a hollow section is considered. The space truss admits an inclination angle θ between 30° and 45°, consistent with beam elements subjected to shear. This paper presents a theoretical study of Models I and II for combined shear and torsion, varying the geometry and load intensity in reinforced concrete beams, in order to compare the consumption of transverse reinforcement under each calculation model. As the strut angle in Model II decreases from 45° to 30°, the transverse reinforcement area (Asw) decreases, while the total reinforcement area, which includes the longitudinal torsion reinforcement (Asℓ), increases. It also appears that, when Model II is used with a strut angle above 40° under shear alone, the transverse reinforcement area increases by 22% compared with the values obtained using Model I.
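In the truss analogy, the stirrup area per unit length for a given shear force follows from Asw/s = Vsw / (0.9·d·fywd·(cot θ + cot α)·sin α), so a flatter strut (smaller θ) requires less transverse steel. A sketch with hypothetical beam values (Vsw, d and fywd are illustrative, not from the paper):

```python
import math

def asw_per_meter(Vsw, d, fywd, theta_deg, alpha_deg=90.0):
    """Stirrup area per meter (cm^2/m) from the truss analogy:
    Asw/s = Vsw / (0.9*d*fywd*(cot(theta)+cot(alpha))*sin(alpha)).
    Vsw in kN, d in m, fywd in MPa. Illustrative sketch only,
    not a full NBR 6118 design check (no Vc, no strut verification)."""
    th = math.radians(theta_deg)
    al = math.radians(alpha_deg)
    geom = (1.0 / math.tan(th) + 1.0 / math.tan(al)) * math.sin(al)
    asw = Vsw / (0.9 * d * (fywd * 1e3) * geom)    # m^2 per m
    return asw * 1e4                               # cm^2 per m

# Hypothetical beam: Vsw = 300 kN, d = 0.55 m, fywd = 435 MPa, vertical stirrups.
asw_45 = asw_per_meter(300, 0.55, 435, 45)   # theta = 45 deg (Model I value)
asw_30 = asw_per_meter(300, 0.55, 435, 30)   # theta = 30 deg (Model II lower bound)
```

For the same shear force, the 30° strut demands noticeably less stirrup area than the 45° strut, which is the trend the paper quantifies.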
Abstract:
In Performance-Based Earthquake Engineering (PBEE), evaluating the seismic performance (or seismic risk) of a structure at a given site has gained major attention, especially in the past decade. One of the objectives of PBEE is to quantify the seismic reliability of a structure (under future random earthquakes) at a site. For that purpose, Probabilistic Seismic Demand Analysis (PSDA) is utilized as a tool to estimate the Mean Annual Frequency (MAF) of exceeding a specified value of a structural Engineering Demand Parameter (EDP). This dissertation focuses mainly on applying the average of a number of spectral acceleration ordinates over an interval of periods, Sa,avg(T1,…,Tn), as a scalar ground motion Intensity Measure (IM) when assessing the seismic performance of inelastic structures. Since the interval of periods over which Sa,avg is computed reflects the greater or lesser influence of higher vibration modes on the inelastic response, it is appropriate to speak of improved IMs. The results using these improved IMs are compared with conventional elastic-based scalar IMs (e.g., pseudo-spectral acceleration, Sa(T1), or peak ground acceleration, PGA) and with an advanced inelastic-based scalar IM (i.e., inelastic spectral displacement, Sdi). The advantages of applying improved IMs are: (i) "computability" of the seismic hazard according to traditional Probabilistic Seismic Hazard Analysis (PSHA), because ground motion prediction models are already available for Sa(Ti), and hence it is possible to employ existing models to assess hazard in terms of Sa,avg; and (ii) "efficiency", i.e., smaller variability of structural response, which was minimized in order to identify the optimal period range for computing Sa,avg. More work is needed to also assess the desirable properties of "sufficiency" and "scaling robustness", which are disregarded in this dissertation.
However, for ordinary records (i.e., with no pulse-like effects), using the improved IMs is found to be more accurate than using the elastic- and inelastic-based IMs. For structural demands dominated by the first mode of vibration, the advantage of using Sa,avg relative to the conventionally used Sa(T1) and the advanced Sdi can be negligible. For structural demands with significant higher-mode contribution, an improved scalar IM that incorporates higher modes needs to be utilized. In order to fully understand the influence of the IM on the seismic risk, a simplified closed-form expression for the probability of exceeding a limit-state capacity was chosen as a reliability measure under seismic excitations and applied to Reinforced Concrete (RC) frame structures. This closed-form expression is particularly useful for the seismic assessment and design of structures, taking into account the uncertainty in the generic variables, structural "demand" and "capacity", as well as the uncertainty in the seismic excitations. The framework employs nonlinear Incremental Dynamic Analysis (IDA) procedures to estimate the variability of the structural response (demand) to seismic excitations, conditioned on the IM. The estimate of the seismic risk obtained from the simplified closed-form expression is affected by the choice of IM: the resulting risk is not constant across IMs, although it remains within the same order of magnitude. Possible reasons concern the nonlinear model assumed or the insufficiency of the selected IM. Since it is impossible to state what the "real" probability of exceeding a limit state is by looking at the total risk alone, the only way forward is the optimization of the desirable properties of an IM.
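A common definition of this improved IM is the geometric mean of spectral ordinates over the chosen period range. A sketch assuming that definition, with a hypothetical response spectrum (the spectrum shape and period range are illustrative only):

```python
import numpy as np

def sa_avg(periods, sa, t1, tn, n=10):
    """Sa,avg(T1,...,Tn): geometric mean of n pseudo-spectral-acceleration
    ordinates equally spaced in [t1, tn], interpolated from a response
    spectrum given as (periods, sa). Assumes the common geometric-mean
    definition of Sa,avg; the spectrum below is hypothetical."""
    ts = np.linspace(t1, tn, n)
    sa_i = np.interp(ts, periods, sa)
    return float(np.exp(np.mean(np.log(sa_i))))

# Hypothetical smooth spectrum decaying with period:
T = np.linspace(0.05, 4.0, 200)
Sa = 1.0 / (1.0 + T)             # in g, illustrative only
im = sa_avg(T, Sa, 0.5, 2.0)     # averages ordinates between 0.5 s and 2.0 s
```

Widening the period interval toward longer periods lets the IM absorb the influence of lengthened effective periods and higher modes, which is what makes the optimal range a tuning choice.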
Abstract:
The aim of this thesis was to describe the development of motion analysis protocols for applications on the upper and lower limbs, using inertial sensor-based systems. Inertial sensor-based systems are relatively recent; knowledge and development of methods and algorithms for using such systems for clinical purposes are therefore limited compared with stereophotogrammetry. However, their advantages in terms of low cost, portability and small size are a valid reason to follow this direction. When developing motion analysis protocols based on inertial sensors, attention must be given to several aspects, such as the accuracy and reliability of inertial sensor-based systems. The need to develop specific algorithms/methods and software for using these systems in specific applications is as important as the development of the motion analysis protocols themselves. For this reason, the goal of the 3-year research project described in this thesis was pursued first of all by carefully designing the protocols based on inertial sensors, exploring which features were suitable for each protocol's specific application. The use of optoelectronic systems was necessary because they provide a gold-standard, accurate measurement, which was used as a reference for the validation of the protocols based on inertial sensors. The protocols described in this thesis can be particularly helpful for rehabilitation centers in which the high cost of instrumentation or the limited working areas do not allow the use of stereophotogrammetry. Moreover, many applications requiring upper and lower limb motion analysis to be performed outside the laboratory will benefit from these protocols, for example gait analysis along corridors. Outdoors, steady-state walking conditions or the behavior of prosthetic devices when encountering slopes or obstacles during walking can also be assessed.
The application of inertial sensors to lower limb amputees presents conditions that are challenging for magnetometer-based systems, due to the ferromagnetic materials commonly adopted in the construction of hydraulic components and motors. The INAIL Prostheses Centre stimulated and, together with Xsens Technologies B.V., supported the development of additional methods for improving the accuracy of the MTx in measuring the 3D kinematics of lower limb prostheses, with the results provided in this thesis. In the author's opinion, this thesis and the motion analysis protocols based on inertial sensors described here demonstrate how close collaboration between industry, clinical centers and research laboratories can improve knowledge and exchange know-how, with the common goal of developing new application-oriented systems.
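As a generic illustration of inertial sensor fusion (not the Xsens/MTx algorithm used in the thesis), a one-axis complementary filter blends the integrated gyroscope rate, which drifts, with the gravity-based angle from the accelerometer, which is noisy but drift-free:

```python
import math

def complementary_filter(gyro, accel, dt, alpha=0.98):
    """1-D tilt-angle estimate fusing gyroscope rate (rad/s) with the
    angle implied by the accelerometer gravity components (ax, az).
    A textbook sketch; gains and signal model are illustrative."""
    angle = 0.0
    out = []
    for w, (ax, az) in zip(gyro, accel):
        acc_angle = math.atan2(ax, az)                        # gravity-based angle
        angle = alpha * (angle + w * dt) + (1 - alpha) * acc_angle
        out.append(angle)
    return out

# Static segment: no rotation, gravity tilted by 0.1 rad; the estimate
# starts at 0 and converges toward the true 0.1 rad tilt.
gyro = [0.0] * 500
accel = [(math.sin(0.1), math.cos(0.1))] * 500
angles = complementary_filter(gyro, accel, dt=0.01)
```

Magnetometer-free fusion of this kind is one way to sidestep the ferromagnetic disturbances near prosthetic components mentioned above, at the cost of losing an absolute heading reference.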
Abstract:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and eventually correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curves production and analysis.
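As a minimal illustration of the aperture photometry step (a textbook sketch, not the actual pipeline code), the flux inside a circular aperture can be summed after subtracting a sky level estimated in a surrounding annulus:

```python
import numpy as np

def aperture_flux(image, cx, cy, r_ap, r_in, r_out):
    """Simple aperture photometry: sum the pixels within r_ap of (cx, cy),
    after subtracting the median sky level estimated in the annulus
    [r_in, r_out]. Illustrative sketch only (no centroiding, no PSF)."""
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - cx, yy - cy)
    sky = np.median(image[(r >= r_in) & (r <= r_out)])   # per-pixel sky estimate
    ap = r <= r_ap
    return float((image[ap] - sky).sum())

# Synthetic frame: flat sky of 100 counts plus a 500-count "star" pixel.
img = np.full((64, 64), 100.0)
img[32, 32] += 500.0
flux = aperture_flux(img, 32, 32, r_ap=5, r_in=10, r_out=15)
print(flux)  # 500.0
```

Repeating such a measurement frame by frame, relative to nearby comparison stars, is what turns aperture photometry catalogues into the short-term light curves mentioned above.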