595 results for Rec A Recombinases
Abstract:
National Council for Scientific and Technological Development (CNPq)
Abstract:
Graduate Program in Genetics and Animal Breeding - FCAV
Abstract:
Coordination for the Improvement of Higher Education Personnel (CAPES)
Abstract:
Order Cetacea, Suborder Odontoceti, Superfamily Delphinoidea, Family Phocoenidae. The genus Phocoena now includes four species. No subspecies are recognized in P. spinipinnis.
Abstract:
Coordination for the Improvement of Higher Education Personnel (CAPES)
Abstract:
Outer mitochondrial membrane (OMM) rupture was first noted in isolated mitochondria in which the inner mitochondrial membrane (IMM) had lost its selective permeability. This phenomenon, referred to as the mitochondrial permeability transition (MPT), consists of a permeabilized inner membrane that causes extensive swelling of the mitochondrial matrix, distending the outer membrane until it ruptures. Here, we have expanded previous electron microscopic observations showing that in apoptotic cells, OMM rupture is not caused by membrane stretching from a markedly swollen matrix. We show that the widths of the ruptured regions of the OMM vary from 6 to 250 nm. Independent of the perforation size, herniation of the mitochondrial matrix appeared to push the IMM through the perforation. A large, elongated focal herniation of the mitochondrial matrix, covered by the IMM, was associated with an OMM rupture as small as 6 nm. In this context, the collapse of the selective permeability of the IMM may precede or follow the release of the mitochondrial intermembrane-space proteins into the cytoplasm. When the MPT is a late event, exit of the intermembrane-space proteins to the cytoplasm is unimpeded and occurs through channels that traverse the outer membrane, because the inner membrane is still impermeable at that point. No channel within the outer membrane can expose a permeable inner membrane to the cytoplasm, because it would serve as a conduit for local herniation of the mitochondrial matrix. Anat Rec, 2012. (c) 2012 Wiley Periodicals, Inc.
Abstract:
We study a model of fast magnetic reconnection in the presence of weak turbulence proposed by Lazarian and Vishniac (1999) using three-dimensional direct numerical simulations. The model has already been successfully tested in Kowal et al. (2009), confirming the dependencies of the reconnection speed V_rec on the turbulence injection power P_inj and the injection scale l_inj, expressed by the relation V_rec ∝ P_inj^(1/2) l_inj^(3/4), with no observed dependence on Ohmic resistivity. In Kowal et al. (2009), in order to drive turbulence, we injected velocity fluctuations in Fourier space with wavenumbers concentrated around k_inj = 1/l_inj, as described in Alvelius (1999). In this paper, we extend our previous studies by comparing fast magnetic reconnection under different mechanisms of turbulence injection, introducing a new way of driving turbulence. The new method injects velocity or magnetic eddies with a specified amplitude and scale at random locations directly in real space. We provide exact relations between the eddy parameters and the turbulent power and injection scale. We performed simulations with the new forcing in order to study the dependencies on turbulent power and injection scale. The results show no discrepancy between models with the two different methods of turbulence driving, which exhibit the same scalings in both cases. This is in agreement with the predictions of Lazarian and Vishniac (1999). In addition, we performed a series of models with varying viscosity ν. Although Lazarian and Vishniac (1999) do not provide any prediction for this dependence, we report a weak dependence of the reconnection speed on viscosity, V_rec ∝ ν^(-1/4).
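The reported scaling can be illustrated with a short numerical check; the proportionality constant and the sample values below are hypothetical, not taken from the simulations:

```python
# Illustrative check of the scaling V_rec ∝ P_inj^(1/2) * l_inj^(3/4).
# The constant c and all sample values are made up for demonstration.

def v_rec(p_inj, l_inj, c=1.0):
    """Predicted reconnection speed under the stated scaling."""
    return c * p_inj ** 0.5 * l_inj ** 0.75

# Doubling the injection power should raise V_rec by a factor of sqrt(2).
ratio = v_rec(2.0, 1.0) / v_rec(1.0, 1.0)
print(ratio)  # ≈ 1.414

# Scaling the injection scale by 16 should raise V_rec by 16^(3/4) = 8.
print(v_rec(1.0, 16.0) / v_rec(1.0, 1.0))  # ≈ 8.0
```

This is only a restatement of the exponents quoted in the abstract, useful for sanity-checking how strongly each parameter influences the predicted speed.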
Abstract:
We propose a new general Bayesian latent class model, based on a computationally intensive approach, for evaluating the performance of multiple diagnostic tests in situations in which no gold-standard test exists. The modeling represents an interesting and suitable alternative to models with complex structures that involve the general case of several conditionally independent diagnostic tests, covariates, and strata with different disease prevalences. Stratifying the population according to different disease prevalence rates does not add marked complexity to the modeling, but it makes the model more flexible and interpretable. To illustrate the proposed general model, we evaluate the performance of six diagnostic screening tests for Chagas disease, considering some epidemiological variables. Serology at the time of donation (negative, positive, inconclusive) was treated as a stratification factor in the model. The general model with population stratification performed better than its counterparts without stratification. The pair formed by the Biomanguinhos FIOCRUZ-kit laboratory tests (c-ELISA and rec-ELISA) is the best option for the confirmation process, presenting a false-negative rate of 0.0002% under the serial scheme: when both tests are negative the donor can be declared healthy, and when both are positive the donor can be declared chagasic, with essentially 100% certainty.
Abstract:
OBJECTIVE: The frequent occurrence of inconclusive serology in blood banks and the absence of a gold-standard test for Chagas disease led us to examine the efficacy of the blood culture test and five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, as well as to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive screening serology with respect to some epidemiological variables. METHODS: To obtain the estimates of interest, we considered a Bayesian latent class model with covariates included through a logit link. RESULTS: Better performance was observed for some categories of the epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) proved to be good alternatives both for screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology means that most of these donors are probably seronegative. Moreover, depending on the level of specific epidemiological variables, the absence of infection can be predicted with a probability of 100% in this group from the pairs of tests using parallel testing. CONCLUSION: The epidemiological variables can improve test results and thus help clarify inconclusive serology screening results. Moreover, all pairs formed from the five commercial tests are good alternatives for confirming results.
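The parallel and serial figures quoted above follow from the standard rule for combining two conditionally independent tests; a minimal sketch, with hypothetical per-test values (the study's actual estimates are not reproduced here):

```python
def parallel_sensitivity(se1, se2):
    # Parallel scheme: a subject is positive if EITHER test is positive,
    # so a true case is missed only if both tests miss it.
    return 1 - (1 - se1) * (1 - se2)

def serial_specificity(sp1, sp2):
    # Serial scheme: a subject is positive only if BOTH tests are positive,
    # so a healthy subject is flagged only if both tests err.
    return 1 - (1 - sp1) * (1 - sp2)

# Hypothetical single-test sensitivities and specificities.
se = parallel_sensitivity(0.98, 0.985)
sp = serial_specificity(0.97, 0.98)
print(se, sp)  # ≈ 0.9997, ≈ 0.9994
```

This shows why even moderately sensitive tests can exceed 99.9% combined sensitivity in parallel, and likewise for specificity in series, assuming conditional independence as in the latent class model.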
Abstract:
A regional envelope curve (REC) of flood flows summarises the current bound on our experience of extreme floods in a region. RECs are available for most regions of the world. Recent scientific papers introduced a probabilistic interpretation of these curves and formulated an empirical estimator of the recurrence interval T associated with a REC, which, in principle, enables us to use RECs for design purposes in ungauged basins. The main aim of this work is twofold. First, it extends the REC concept to extreme rainstorm events by introducing Depth-Duration Envelope Curves (DDECs), defined as the regional upper bound on all record rainfall depths observed to date for various rainfall durations. Second, it adapts the probabilistic interpretation proposed for RECs to DDECs and assesses the suitability of these curves for estimating the T-year rainfall event associated with a given duration and large values of T. Probabilistic DDECs are complementary to regional frequency analysis of rainstorms, and their use in combination with a suitable rainfall-runoff model can provide useful indications of the magnitude of extreme floods in gauged and ungauged basins. The study focuses on two national datasets: the peak-over-threshold (POT) series of rainfall depths with durations of 30 min and 1, 3, 9, and 24 h obtained for 700 Austrian rain gauges, and the annual maximum series (AMS) of rainfall depths with durations spanning from 5 min to 24 h collected at 220 rain gauges located in northern-central Italy. Estimating the recurrence interval of a DDEC requires quantifying the equivalent number of independent data, which, in turn, is a function of the cross-correlation among sequences. While quantifying and modelling intersite dependence is a straightforward task for AMS series, it may be cumbersome for POT series. This paper proposes a possible approach to address this problem.
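A depth-duration envelope curve as defined above is simply the regional record depth at each duration; a minimal sketch with invented data (the durations mirror the Austrian POT set, the depths are made up):

```python
# Minimal DDEC sketch: the envelope at each duration is the maximum
# record rainfall depth across all gauges. All depth values are invented.

durations_h = [0.5, 1, 3, 9, 24]
# rows: gauges; columns: record depth (mm) at each duration
records = [
    [22, 30, 48, 70, 95],
    [18, 35, 52, 66, 110],
    [25, 28, 44, 81, 102],
]

ddec = [max(col) for col in zip(*records)]
print(ddec)  # [25, 35, 52, 81, 110]
```

The probabilistic step in the paper — attaching a recurrence interval T to this envelope — additionally requires the equivalent number of independent records, which this sketch does not attempt.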
Abstract:
An adaptive optics (AO) system is a fundamental requirement for 8m-class telescopes. To obtain the maximum resolution these telescopes allow, we need to correct for atmospheric turbulence. Thanks to adaptive optics systems we can exploit the full potential of these instruments, drawing as much information as possible from astronomical sources. An AO system has two main components: the wavefront sensor (WFS), which measures the aberrations of the wavefront entering the telescope, and the deformable mirror (DM), which assumes a shape opposite to the one measured by the sensor. The two subsystems are connected by the reconstructor (REC). To do its job, the REC requires a “common language” between these two main AO components: a mapping between sensor space and mirror space, called the interaction matrix (IM). Therefore, in order to operate correctly, an AO system has a key requirement: the measurement of an IM to calibrate the whole AO system. The IM measurement is a milestone for an AO system and must be done regardless of the telescope's size or class. Usually, this calibration step is done by adding to the telescope an auxiliary artificial light source (e.g., a fiber) that illuminates both the deformable mirror and the sensor, permitting calibration of the AO system. For larger telescopes (more than 8 m, such as the Extremely Large Telescopes, ELTs), fiber-based IM measurement requires challenging optical setups that in some cases are impractical to build. In these cases, new techniques to measure the IM are needed. In this PhD work we investigate a different calibration method that can be applied directly on sky, at the telescope, without any auxiliary source. Such a technique can be used to calibrate an AO system on a telescope of any size.
We want to test the new calibration technique, called the “sinusoidal modulation technique”, on the Large Binocular Telescope (LBT) AO system, which is already a complete AO system with the two main components: a secondary deformable mirror with 672 actuators, and a pyramid wavefront sensor. The first phase of my PhD work was helping to implement the WFS board (containing the pyramid sensor and all the auxiliary optical components), working on both the optical alignment and tests of some optical components. Thanks to the “solar tower” facility of the Astrophysical Observatory of Arcetri (Firenze), we were able to reproduce an environment very similar to that of the telescope, testing the main LBT AO components: the pyramid sensor and the secondary deformable mirror. This enabled the second phase of my PhD thesis: measuring the IM by applying the sinusoidal modulation technique. First, we measured the IM using an auxiliary fiber source to calibrate the system, without any injected disturbance. Then we applied the calibration technique to measure the IM directly “on sky”, that is, with an atmospheric disturbance added to the AO system. The results obtained in this PhD work, measuring the IM directly in the Arcetri solar tower system, are crucial for future development: the ability to acquire the IM directly on sky means that we can calibrate an AO system even for the extremely large telescope class, where classic IM measurement techniques are problematic and sometimes impossible. Finally, we should not forget why we need all this: the main aim is to observe the universe. Thanks to this new class of large telescopes, and only by using their full capabilities, we will be able to deepen our knowledge of the objects we observe, because we will resolve more detailed features, discovering, analyzing, and understanding the behavior of the universe's components.
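The roles of the interaction matrix and the reconstructor described above can be sketched with a least-squares pseudo-inverse, which is one common choice; the dimensions here are toy values (not LBT's 672 actuators), and the actual LBT reconstructor may be built differently:

```python
import numpy as np

rng = np.random.default_rng(0)

n_slopes, n_actuators = 8, 4  # toy dimensions for illustration

# Interaction matrix: sensor response (slopes) to a unit poke of each actuator.
IM = rng.standard_normal((n_slopes, n_actuators))

# Reconstructor: maps sensor measurements back to mirror commands.
REC = np.linalg.pinv(IM)

# Sanity check: slopes produced by known commands should be inverted back.
commands = rng.standard_normal(n_actuators)
slopes = IM @ commands
recovered = REC @ slopes
print(np.allclose(recovered, commands))  # True
```

The point of any IM calibration scheme, fiber-based or on-sky, is to measure IM accurately enough that REC maps real sensor signals to corrective mirror shapes.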
Abstract:
The presence of mycotoxins in raw materials is of great concern because of its important implications for food and feed safety. The aim of this work was to develop and validate a rapid and simple analytical method, based on ultra-performance liquid chromatography coupled with tandem mass spectrometry (UPLC-MS/MS), for the simultaneous determination of different mycotoxins — aflatoxins (B1, B2, G1, G2), ochratoxin A, fumonisins (B1, B2), deoxynivalenol, and zearalenone — in biological matrices. The method developed for the analysis of dry dog-food samples showed adequate performance and was applied to 49 commercially available samples, in order to assess its effectiveness and to obtain preliminary data on mycotoxin contamination in dog food available on the Italian market. The study revealed a high percentage of positive samples, mainly containing fumonisins, deoxynivalenol, and ochratoxin A; all levels were below the applicable legal limit (Commission Recommendation 2006/576/EC). A second method was developed and validated for the identification and quantification of mycotoxins in cheese samples; for this matrix, aflatoxin M1, specific to dairy products, was also included. The differing physico-chemical properties of the analytes and the complexity of the matrix posed some difficulties in method development. Nevertheless, the validated method proved rapid, simple, and reliable, and was applied to different types of cheese to verify its versatility. Preliminary results showed no contamination by the mycotoxins under study. Both methods proved useful for monitoring contaminants in complex matrices that remain little studied to date.