815 results for Comparative Genomics, Non-coding RNAs, Conservation, Segmentation, Change-points, Sliding Window Analysis, Markov Chain Monte Carlo, Bayesian modeling
Abstract:
Physalis mottle virus (PhMV) belongs to the tymovirus group of positive-strand RNA viruses and has a genome size of 6 kb. Crude membrane preparations from PhMV-infected Nicotiana glutinosa plants catalyzed the synthesis of PhMV genomic RNA from endogenously bound template. Addition of exogenous genomic RNA enhanced this synthesis, which was specifically inhibited by the addition of sense and antisense transcripts corresponding to the 3'-terminal 242 nucleotides as well as the 5'-terminal 458 nucleotides of PhMV genomic RNA, whereas yeast tRNA and ribosomal RNA failed to inhibit the synthesis. This specific inhibition suggested that the 5' and 3' non-coding regions of PhMV RNA might play an important role in viral replication.
Abstract:
A methodology termed the “filtered density function” (FDF) is developed and implemented for large eddy simulation (LES) of chemically reacting turbulent flows. In this methodology, the effects of the unresolved scalar fluctuations are taken into account by considering the probability density function (PDF) of subgrid scale (SGS) scalar quantities. A transport equation is derived for the FDF in which the effect of chemical reactions appears in a closed form. The influences of scalar mixing and convection within the subgrid are modeled. The FDF transport equation is solved numerically via a Lagrangian Monte Carlo scheme in which the solutions of the equivalent stochastic differential equations (SDEs) are obtained. These solutions preserve the Itô-Gikhman nature of the SDEs. The consistency of the FDF approach, the convergence of its Monte Carlo solution and the performance of the closures employed in the FDF transport equation are assessed by comparisons with results obtained by direct numerical simulation (DNS) and by conventional LES procedures in which the first two SGS scalar moments are obtained by a finite difference method (LES-FD). These comparative assessments are conducted by implementations of all three schemes (FDF, DNS and LES-FD) in a temporally developing mixing layer and a spatially developing planar jet under both non-reacting and reacting conditions. In non-reacting flows, the Monte Carlo solution of the FDF yields results similar to those via LES-FD. The advantage of the FDF is demonstrated by its use in reacting flows. In the absence of a closure for the SGS scalar fluctuations, the LES-FD results are significantly different from those based on DNS. The FDF results show a much closer agreement with filtered DNS results. © 1998 American Institute of Physics.
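For readers who want a concrete picture of the Lagrangian Monte Carlo step described above, here is a minimal, self-contained sketch in Python: notional particles carry a position and a scalar value, positions follow an Euler-Maruyama discretization of an SDE (convection plus diffusion), the scalar relaxes toward a binned local mean (an IEM-type mixing model), and a reaction source acts on each particle in closed form. The velocity field, coefficients and reaction term are illustrative placeholders, not the closures or configurations used in the paper.

```python
# Illustrative Lagrangian Monte Carlo sketch for an FDF-style scalar PDF transport:
# notional particles carry a position and a scalar value, positions follow an
# Euler-Maruyama step of an SDE (convection + diffusion), the scalar relaxes
# toward the local (bin-wise) mean (IEM-type mixing), and the reaction source
# acts in closed form on each particle. All coefficients are made-up placeholders.
import numpy as np

rng = np.random.default_rng(0)

n_particles, n_steps, dt = 20_000, 200, 1e-3
L, n_bins = 1.0, 50                      # periodic domain and binning for local means
D, omega, c_phi, k_reac = 1e-3, 50.0, 2.0, 5.0

x = rng.uniform(0.0, L, n_particles)             # particle positions
phi = (x < 0.5 * L).astype(float)                # initial scalar: step profile

def velocity(x):
    """Placeholder resolved velocity field (would come from the LES solver)."""
    return 0.2 * np.sin(2.0 * np.pi * x / L)

for _ in range(n_steps):
    # Euler-Maruyama step for particle positions: dx = u dt + sqrt(2 D) dW
    x += velocity(x) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_particles)
    x %= L                                       # periodic boundary

    # Local (bin-wise) mean scalar, used by the IEM mixing model
    bins = np.minimum((x / L * n_bins).astype(int), n_bins - 1)
    mean_phi = np.bincount(bins, weights=phi, minlength=n_bins) / \
               np.maximum(np.bincount(bins, minlength=n_bins), 1)

    # Mixing (relaxation to the local mean) plus a simple first-order reaction sink
    phi += -0.5 * c_phi * omega * (phi - mean_phi[bins]) * dt - k_reac * phi * dt

print("filtered mean of the scalar per bin:", np.round(mean_phi, 3))
```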
Abstract:
Takifugu rubripes is a teleost fish widely used in comparative genomics to better understand the human system, owing to its similarity to humans in both the number and the structure of its genes. In this work we survey the fugu genome and, using sensitive computational approaches, identify its repertoire of putative protein kinases and classify them into groups and subfamilies. The fugu genome encodes 519 protein kinase-like sequences, a number closely comparable to that of human. However, despite the similarity to human kinases at the group level, there are differences at the subfamily level, as noted for the KIS and DYRK subfamilies, which may reflect adaptations specific to the organism. In addition, a unique domain combination of the galectin domain and the YkA domain suggests alternative mechanisms for immune response and for binding to lipoproteins. Lastly, an overall similarity with the MAPK pathway of humans suggests its importance for understanding signaling mechanisms in humans. Overall, fugu serves as a good model organism for understanding the roles of human kinases, such as LRRK and IRAK, and their associated pathways.
Abstract:
The problem of time-variant reliability analysis of randomly parametered and randomly driven nonlinear vibrating systems is considered. The study combines two Monte Carlo variance-reduction strategies into a single framework to tackle the problem. The first strategy is based on the application of the Girsanov transformation to account for randomness in the dynamic excitations, and the second is fashioned after the subset simulation method to deal with randomness in the system parameters. Illustrative examples include studies of single- and multi-degree-of-freedom, linear and nonlinear, inelastic, randomly parametered building frame models driven by stationary/non-stationary, white/filtered white noise support acceleration. The estimated reliability measures are demonstrated to compare well with results from direct Monte Carlo simulations. (C) 2014 Elsevier Ltd. All rights reserved.
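To make the second ingredient concrete, the following is a minimal subset simulation sketch (in the spirit of Au and Beck's method, which the abstract's subset-simulation component is fashioned after) for estimating a small failure probability of a toy static limit-state function with standard normal inputs. The limit state, sample sizes and tuning constants are hypothetical, and the Girsanov-based treatment of the random excitation is omitted entirely.

```python
# Subset simulation sketch for a small failure probability P[g(X) <= 0] with
# independent standard normal inputs. The limit-state function g is a toy
# placeholder, not the building-frame models of the abstract.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def g(x):
    """Toy limit state: failure when the scaled sum of inputs exceeds beta."""
    beta = 3.5
    return beta - x.sum(axis=-1) / np.sqrt(x.shape[-1])

def subset_simulation(dim=2, n_per_level=2000, p0=0.1, spread=1.0, max_levels=10):
    x = rng.standard_normal((n_per_level, dim))
    gx = g(x)
    prob = 1.0
    for _ in range(max_levels):
        if np.mean(gx <= 0.0) >= p0:             # final level reached
            return prob * np.mean(gx <= 0.0)
        # Intermediate threshold: the p0-quantile of the current g-values
        idx = np.argsort(gx)
        n_seed = int(p0 * n_per_level)
        c = gx[idx[n_seed - 1]]
        seeds, gseeds = x[idx[:n_seed]], gx[idx[:n_seed]]
        # Modified Metropolis: grow each seed into a chain conditional on g <= c
        chain_len = n_per_level // n_seed
        xs, gs = [seeds], [gseeds]
        cur_x, cur_g = seeds.copy(), gseeds.copy()
        for _ in range(chain_len - 1):
            cand = cur_x + spread * rng.standard_normal(cur_x.shape)
            # Component-wise accept/reject against the standard normal prior
            ratio = norm.pdf(cand) / norm.pdf(cur_x)
            keep = rng.uniform(size=cur_x.shape) < ratio
            cand = np.where(keep, cand, cur_x)
            gcand = g(cand)
            ok = gcand <= c                      # stay inside the intermediate failure domain
            cur_x = np.where(ok[:, None], cand, cur_x)
            cur_g = np.where(ok, gcand, cur_g)
            xs.append(cur_x.copy()); gs.append(cur_g.copy())
        x, gx = np.vstack(xs), np.concatenate(gs)
        prob *= p0
    return prob * np.mean(gx <= 0.0)

est = subset_simulation()
print(f"subset simulation estimate: {est:.2e}  (exact: {norm.cdf(-3.5):.2e})")
```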
Abstract:
Numerical simulations were conducted to study thermocapillary flows in short half-zone liquid bridges of molten tin with Prandtl number Pr = 0.009 under a ramped temperature difference. The spatio-temporal structures of the thermocapillary flows in short half-zone liquid bridges with aspect ratios As = 0.6, 0.8, and 1.0 were investigated. The first critical Marangoni numbers were compared with those predicted by linear stability analyses (LSA). The second critical Marangoni numbers for As = 0.6 and 0.8 were found to be larger than that for As = 1.0. The time evolution of the thermocapillary flows exhibited unusual features, such as a change in the azimuthal wave number during the three-dimensional stationary (non-oscillating) flow regime, a change in the oscillation mode during the three-dimensional oscillatory flow regime, and an amplitude that decreased and then increased within a single oscillation mode. The effects of the ramping rate of the temperature difference on the flow modes and critical conditions were studied as well. The experimental observability of the critical conditions is also discussed. (C) 2008 Elsevier Inc. All rights reserved.
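For reference, the non-dimensional groups quoted above are conventionally defined as follows, assuming the usual half-zone scaling in which the driving temperature difference is ΔT and the characteristic length is the height H of the liquid bridge (the precise length-scale convention varies between studies):

\[
  \mathrm{Pr} = \frac{\nu}{\alpha},
  \qquad
  \mathrm{Ma} = \frac{\left|\partial\sigma/\partial T\right|\,\Delta T\,H}{\mu\,\alpha},
\]

where \(\nu\) is the kinematic viscosity, \(\alpha\) the thermal diffusivity, \(\mu\) the dynamic viscosity, and \(\sigma\) the surface tension of the melt.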
Abstract:
An investigation has been made into the effect of microstructural parameters on the propensity for shear localization during high-speed torsional testing in a split Hopkinson bar, at average strain rates of 610, 650 and 1500 s⁻¹, in low-carbon steels. The steels were given quenched, quenched-and-tempered, and normalized treatments, providing a wide range of microstructural parameters and mechanical properties. The results indicate that the occurrence of shear localization is sensitive to the strength of the steels; in other words, the tendency of the quenched steel to form a shear band is higher than that of the other two steels. It is also found that there is a critical strain at which shear localization occurs, and its value depends strongly on the strength of the steel. Before reaching this point the material undergoes slow work-hardening; after it, the material suffers work-softening, corresponding to a process in which the deformation gradually localizes and eventually becomes spatially correlated to form a macroscopic shear band. SEM examinations reveal that shear localization within the band involves a series of sequential crystallographic and non-crystallographic events, including changes in crystal orientation and misorientation, the generation of new microstructures, and perhaps even damage such as the initiation, growth and coalescence of microcracks. It is expected that the sharp drop in load-carrying capacity is associated with the growth and coalescence of the microcracks rather than with the onset of shear localization, although shear localization is seen to accelerate their growth and coalescence. Thin-foil TEM observations reveal that the dislocation density within the band is extremely high and that the tangled arrangements and cell structures of dislocations tend to align along the shear direction. The multiplication and interaction of dislocations seem to be responsible for the work-hardening of the steels, and the avalanche of dislocation cells corresponds to the sharp drop in shear stress at which the deformed specimen breaks. Double shear bands and kink bands are also observed in the present study; the principal band develops first and is narrower than the secondary band.
Abstract:
The work presented in this thesis revolves around erasure correction coding, as applied to distributed data storage and real-time streaming communications.
First, we examine the problem of allocating a given storage budget over a set of nodes for maximum reliability. The objective is to find an allocation of the budget that maximizes the probability of successful recovery by a data collector accessing a random subset of the nodes. This optimization problem is challenging in general because of its combinatorial nature, despite its simple formulation. We study several variations of the problem, assuming different allocation models and access models, and determine the optimal allocation and the optimal symmetric allocation (in which all nonempty nodes store the same amount of data) for a variety of cases. Although the optimal allocation can have nonintuitive structure and can be difficult to find in general, our results suggest that, as a simple heuristic, reliable storage can be achieved by spreading the budget maximally over all nodes when the budget is large, and spreading it minimally over a few nodes when it is small. Coding would therefore be beneficial in the former case, while uncoded replication would suffice in the latter case.
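As a concrete companion to the symmetric-allocation results mentioned above, here is a small sketch, assuming a probabilistic access model in which each node is accessed independently with probability p: spreading a budget T evenly over m nonempty nodes (each storing T/m of a unit-size object) allows recovery, under an ideal MDS-style code, exactly when at least ceil(m/T) of those nodes are accessed. The parameter values are arbitrary illustration numbers, not results from the thesis.

```python
# Recovery probability of a symmetric storage allocation under a probabilistic
# access model: a budget T is spread evenly over m nodes, each node is accessed
# independently with probability p, and recovery (via an ideal MDS-style code)
# succeeds when the accessed nodes jointly store at least one unit of data.
import math
from scipy.stats import binom

def recovery_probability(budget_T, m, p):
    """P[successful recovery] when T is spread over m nonempty nodes."""
    need = math.ceil(m / budget_T)          # accessed nodes needed: r * T/m >= 1
    if need > m:
        return 0.0
    return binom.sf(need - 1, m, p)         # P[Binomial(m, p) >= need]

n, budget_T, p = 20, 4.0, 0.4
for m in range(1, n + 1):
    print(f"m = {m:2d} nonempty nodes -> P(recovery) = {recovery_probability(budget_T, m, p):.4f}")
```

Scanning m for different (T, p) pairs reproduces the qualitative heuristic stated above: a large budget favors spreading over many nodes, while a small budget favors concentrating it on a few.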
Second, we study how distributed storage allocations affect the recovery delay in a mobile setting. Specifically, two recovery delay optimization problems are considered for a network of mobile storage nodes: the maximization of the probability of successful recovery by a given deadline, and the minimization of the expected recovery delay. We show that the first problem is closely related to the earlier allocation problem, and solve the second problem completely for the case of symmetric allocations. It turns out that the optimal allocations for the two problems can be quite different. In a simulation study, we evaluated the performance of a simple data dissemination and storage protocol for mobile delay-tolerant networks, and observed that the choice of allocation can have a significant impact on the recovery delay under a variety of scenarios.
Third, we consider a real-time streaming system where messages created at regular time intervals at a source are encoded for transmission to a receiver over a packet erasure link; the receiver must subsequently decode each message within a given delay from its creation time. For erasure models containing a limited number of erasures per coding window, per sliding window, and containing erasure bursts whose maximum length is sufficiently short or long, we show that a time-invariant intrasession code asymptotically achieves the maximum message size among all codes that allow decoding under all admissible erasure patterns. For the bursty erasure model, we also show that diagonally interleaved codes derived from specific systematic block codes are asymptotically optimal over all codes in certain cases. We also study an i.i.d. erasure model in which each transmitted packet is erased independently with the same probability; the objective is to maximize the decoding probability for a given message size. We derive an upper bound on the decoding probability for any time-invariant code, and show that the gap between this bound and the performance of a family of time-invariant intrasession codes is small when the message size and packet erasure probability are small. In a simulation study, these codes performed well against a family of random time-invariant convolutional codes under a number of scenarios.
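As a toy companion to the i.i.d. erasure model above, the following sketch computes and simulates the decoding probability of an idealized per-message (intrasession, MDS-style) scheme in which each message occupies its own window of n packets and is decodable whenever any k of them arrive before the deadline; this simplification is for illustration only and is not the specific time-invariant construction analyzed in the thesis.

```python
# Decoding probability of an idealized per-message (intrasession, MDS-style)
# code over an i.i.d. packet erasure link: a message of k symbols is carried
# by n packets in its window, each erased independently with probability eps,
# and decoding succeeds when at least k packets arrive. Closed form vs. a
# Monte Carlo check. Parameters are illustrative only.
import numpy as np
from scipy.stats import binom

def decoding_probability(n, k, eps):
    """P[at least k of n packets survive] = P[Binomial(n, 1 - eps) >= k]."""
    return binom.sf(k - 1, n, 1.0 - eps)

def simulate(n, k, eps, trials=200_000, seed=0):
    rng = np.random.default_rng(seed)
    received = rng.random((trials, n)) >= eps   # True where the packet arrives
    return np.mean(received.sum(axis=1) >= k)

n, k, eps = 12, 8, 0.1
print(f"closed form : {decoding_probability(n, k, eps):.4f}")
print(f"simulation  : {simulate(n, k, eps):.4f}")
```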
Finally, we consider the joint problems of routing and caching for named data networking. We propose a backpressure-based policy that employs virtual interest packets to make routing and caching decisions. In a packet-level simulation, the proposed policy outperformed a basic protocol that combines shortest-path routing with least-recently-used (LRU) cache replacement.
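A rough sketch of how a backpressure rule driven by virtual interest packets (VIPs) can make both decisions is given below, assuming per-node, per-object VIP counters: an interest for an object is forwarded over the link with the largest positive VIP backlog differential, and a node caches the objects with the largest local VIP counts up to its cache size. The topology, counters and tie-breaking are hypothetical simplifications rather than the proposed policy itself.

```python
# Sketch of backpressure-style forwarding and caching driven by virtual
# interest packet (VIP) counters. vip[node][obj] is a hypothetical per-node,
# per-object counter; the topology, counters and cache size are made up.
from collections import defaultdict

links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}   # toy topology
vip = defaultdict(lambda: defaultdict(float))
vip["A"].update({"obj1": 9.0, "obj2": 2.0})
vip["B"].update({"obj1": 3.0, "obj2": 5.0})
vip["C"].update({"obj1": 7.0, "obj2": 1.0})

def forward(node, obj):
    """Backpressure rule: pick the neighbor with the largest positive VIP differential."""
    best, best_diff = None, 0.0
    for nbr in links[node]:
        diff = vip[node][obj] - vip[nbr][obj]
        if diff > best_diff:
            best, best_diff = nbr, diff
    return best                                  # None means: hold the interest

def cache_decision(node, cache_size):
    """Cache the objects with the highest local VIP counts."""
    ranked = sorted(vip[node].items(), key=lambda kv: kv[1], reverse=True)
    return [obj for obj, _ in ranked[:cache_size]]

print(forward("A", "obj1"))        # -> "B" (largest backlog differential)
print(cache_decision("A", 1))      # -> ["obj1"]
```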
Abstract:
A preliminary survey was conducted among fishermen in five selected villages in Edozhigi L.G.A. of Niger State. One hundred and fifty fishermen were randomly selected and interviewed to assess the impact of Niger State fisheries legislation on the conservation of fisheries resources in the area. Analysis of the data, using descriptive statistics, indicated that undersized-mesh gill nets, beach seines and traps are being used unabated. Fenced barriers are also set across the entrances of floodplain ponds and ox-bow lakes off the main stream in the area. Those charged with implementing the fisheries rules and regulations are rarely, if ever, seen in the area, and a decline in fish catches was detected. It is observed that government policy on fish conservation is neglected owing to inadequate or absent funding for meaningful extension work and implementation of the fisheries rules and regulations.
Abstract:
The study qualitatively assessed the threat status of all Nigerian freshwater fishes using criteria such as rarity, size at maturity, mode of reproduction, human population density, habitat degradation, pollution and the range of each species, among others. The biology of 48% (n = 129) of Nigerian freshwater species is not well known. Of the 266 known freshwater fishes, 47 species (17%) are critically endangered, 15 (5%) are endangered, 8 (3%) are vulnerable and 23 (8%) are near threatened. The paper suggests increasing basic knowledge of threatened species and pursuing conservation policy along three lines, namely public awareness, legislation, and the creation of national parks, aquaria and reserves, as the measures needed to ensure the conservation of these fishes.
Abstract:
Solomon Islands has recently developed substantial policy aiming to support inshore fisheries management, conservation, climate change adaptation and ecosystem approaches to resource management. A large body of experience in community-based approaches to management has developed, but "upscaling", and particularly the implementation of nation-wide approaches, has received little attention so far. With the emerging challenges posed by climate change, and with the need for ecosystem-wide and integrated approaches attracting serious donor attention, a national debate on the most effective approaches to implementation is urgently needed. This report discusses potential implementation of "a cost-effective and integrated approach to resource management that is consistent with national policy and needs", based on a review of current policy and institutional structures and an examination of a recent case study from Lau, Malaita, using stakeholder, transaction and financial cost analyses.
Abstract:
Genome-wide association studies (GWAS) have identified several low-penetrance susceptibility alleles in chronic lymphocytic leukemia (CLL). Nevertheless, these studies have rarely examined regions implicated in non-coding molecules such as microRNAs (miRNAs). Abnormalities in miRNAs, such as altered expression patterns and mutations, have been described in CLL, suggesting their implication in the development of the disease. Genetic variation in miRNAs can affect miRNA expression levels if present in pre-miRNAs or in miRNA biogenesis genes, or alter miRNA function if present in either the target mRNA or the miRNA sequence. Therefore, the present study aimed to evaluate whether polymorphisms in pre-miRNAs and/or miRNA-processing genes contribute to predisposition to CLL. A total of 91 SNPs in 107 CLL patients and 350 cancer-free controls were successfully analyzed using TaqMan OpenArray technology. We found nine statistically significant associations with CLL risk after FDR correction, seven in miRNA-processing genes (rs3805500 and rs6877842 in DROSHA, rs1057035 in DICER1, rs17676986 in SND1, rs9611280 in TNRC6B, rs784567 in TRBP and rs11866002 in CNOT1) and two in pre-miRNAs (rs11614913 in miR196a2 and rs2114358 in miR1206). These findings suggest that polymorphisms in genes involved in the miRNA biogenesis pathway, as well as in pre-miRNAs, contribute to the risk of CLL. Large-scale studies are needed to validate the current findings.
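For readers unfamiliar with the statistics involved, here is a small, generic sketch of a per-SNP case-control association test (Fisher's exact test on 2x2 allele-count tables) followed by Benjamini-Hochberg FDR correction. It is not the study's actual analysis pipeline, and while the SNP identifiers are taken from the abstract, the allele counts are arbitrary illustrative numbers, not data from the study.

```python
# Generic per-SNP case-control association sketch: a Fisher exact test on the
# 2x2 allele-count table of each SNP, followed by Benjamini-Hochberg FDR
# correction. SNP names come from the abstract; the counts are arbitrary
# illustrative numbers, not data from the study.
from scipy.stats import fisher_exact

# table format: [[minor_cases, major_cases], [minor_controls, major_controls]]
toy_tables = {
    "rs3805500":  [[60, 154], [140, 560]],
    "rs11614913": [[75, 139], [180, 520]],
    "rs784567":   [[50, 164], [160, 540]],
}

results = []
for snp, table in toy_tables.items():
    odds_ratio, p_value = fisher_exact(table)
    results.append((snp, odds_ratio, p_value))

# Benjamini-Hochberg step-up: find the largest k with p_(k) <= (k/m) * q and
# declare the k smallest p-values significant.
m, q = len(results), 0.05
ordered = sorted(results, key=lambda r: r[2])
k_max = max((k for k in range(1, m + 1) if ordered[k - 1][2] <= k / m * q), default=0)
for rank, (snp, or_, p) in enumerate(ordered, start=1):
    print(f"{snp}: OR = {or_:.2f}, p = {p:.3g}, significant at FDR {q}: {rank <= k_max}")
```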
Abstract:
The thesis analyzes judicial decisions handed down in clinical bioethics cases, specifically: requests for authorization to terminate the pregnancy of an anencephalic fetus; the freedom of a Jehovah's Witness patient to refuse, on grounds of religious belief, an imposed blood transfusion; and the change of name and sex of transsexual persons with or without sex reassignment surgery. These three types of cases brought before the Judiciary were chosen because they are questions characteristic of existential rights, with repercussions on the very being of the individual and on his or her most personal rights. To this end, 84 judicial decisions were analyzed by applying the principlist theory of Beauchamp & Childress, examining each decision for its application of the four principles the theory develops: respect for autonomy, non-maleficence, beneficence and justice. The analysis showed that when the judge applies the four principles, with their specification and balancing, the resulting decisions are liberal in character; when the judge does not apply the principles, or exceeds the limits of their application, the resulting decisions are conservative in character. Liberal judicial decisions are free of prejudice and moralism and allow individual rights to be respected without neglecting the rights of the other members of society. Conservative decisions are based on the literal text of the law and violate individual rights without adding security to society. The appropriation of this theory of biomedical ethics by biolaw presents itself as a safe and effective method for rendering judicial decisions in clinical bioethics cases, and it leads the judge to fairer decisions because they are supported by good reasons.
Abstract:
Shrimp fishermen trawling in the Gulf of Mexico and the South Atlantic inadvertently capture and kill sea turtles, which are classified as endangered species. Recent legislation requires the use of a Turtle Excluder Device (TED) which, when in place in the shrimp trawl, reduces sea turtle mortality. The impact of the TED on shrimp production is not known. This intermediate analysis of the TED regulations, using an annual firm-level simulation model, indicated that the average Texas shrimp vessel had a low probability of being an economic success before the regulations were enacted. An assumption that the TED regulations resulted in decreased production aggravated this condition, and the change in Ending Net Worth and Net Present Value of Ending Net Worth before and after a TED was placed in the net was significant at the 5 percent level. However, the difference in the Internal Rate of Return between the TED and non-TED simulations was not significant unless the TED caused a substantial change in catch. This analysis did not allow for interactions among fishermen in the shrimp industry, an assumption which could significantly alter the impact of TED use on the catch and earnings of the individual shrimp vessel.