952 results for r-functions
Abstract:
The basic requirements for an autopilot are fast response and minimum steady-state error for better guidance performance. The highly nonlinear nature of missile dynamics, arising from the severe kinematic and inertial coupling of the missile airframe as well as the aerodynamics, has been a challenge for an autopilot that must perform satisfactorily over all flight conditions in probable engagements. Dynamic inversion is a very popular nonlinear controller for this kind of scenario, but its drawback is sensitivity to parameter perturbation. To overcome this problem, a neural network is used to capture the parameter uncertainty online. The choice of basis function plays the major role in capturing the unknown dynamics. In this paper, several basis functions are studied for the approximation of the unknown dynamics. The cosine basis function yields the best response of any basis function considered for capturing the unknown dynamics. A neural network with cosine basis functions improves both the autopilot performance and the robustness compared with dynamic inversion without a neural network.
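As a rough sketch of the basis-function idea described above (not the paper's controller), the R snippet below fits a linear-in-parameters cosine expansion to a hypothetical unknown-dynamics term by ordinary least squares; in the adaptive autopilot the weights would instead be updated online by an adaptation law. The stand-in function f_true, the noise level and the basis size are arbitrary choices for illustration.

```r
# Minimal sketch (not the paper's implementation): approximate an unknown
# scalar inversion-error term with a linear-in-parameters cosine expansion,
# fitted here offline by least squares purely for illustration.
set.seed(1)
f_true  <- function(x) 0.3 * sin(2 * x) + 0.1 * x^2   # hypothetical "unknown dynamics"
x       <- seq(-pi, pi, length.out = 200)
y       <- f_true(x) + rnorm(length(x), sd = 0.01)    # noisy samples of the unknown term

n_basis <- 8
phi     <- sapply(0:(n_basis - 1), function(k) cos(k * x))  # cosine regressors cos(k*x)
w_hat   <- qr.solve(phi, y)                                 # least-squares basis weights

y_hat   <- phi %*% w_hat
cat("RMS approximation error:", sqrt(mean((y_hat - f_true(x))^2)), "\n")
```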
Abstract:
The GW approximation to the electron self-energy has become a standard method for ab initio calculation of excited-state properties of condensed-matter systems. In many calculations, the GW self-energy operator, Σ, is taken to be diagonal in the density functional theory (DFT) Kohn-Sham basis within the G0W0 scheme. However, there are known situations in which this diagonal G0W0 approximation starting from DFT is inadequate. We present two schemes to resolve such problems. The first, which we call sc-COHSEX+GW, involves construction of an improved mean field using the static limit of GW, known as COHSEX (Coulomb hole and screened exchange), which is significantly simpler to treat than GW. In this scheme, the frequency-dependent self-energy Σ(ω) is constructed and taken to be diagonal in the COHSEX orbitals after the system is solved self-consistently within this formalism. The second method is called off-diagonal-COHSEX GW (od-COHSEX+GW). In this method, one does not self-consistently change the mean-field starting point but diagonalizes the COHSEX Hamiltonian within the Kohn-Sham basis to obtain quasiparticle wave functions and uses the resulting orbitals to construct the GW Σ in diagonal form. We apply both methods to a molecular system, silane, and to two bulk systems, Si and Ge under pressure. For silane, both methods give good quasiparticle wave functions and energies. Both methods give good band gaps for bulk silicon and maintain good agreement with experiment. Further, the sc-COHSEX+GW method corrects the qualitatively incorrect DFT mean-field starting point (which has a band overlap) in bulk Ge under pressure.
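For reference, the static COHSEX self-energy referred to above is the sum of a screened-exchange term and a Coulomb-hole term; in the standard notation (a textbook form, not quoted from this paper) it reads:

```latex
% Static COHSEX self-energy: screened exchange (SEX) plus Coulomb hole (COH),
% with the screened interaction W taken at zero frequency (standard form).
\begin{align}
\Sigma_{\mathrm{SEX}}(\mathbf{r},\mathbf{r}')
  &= -\sum_{v}^{\mathrm{occ}} \psi_{v}(\mathbf{r})\,\psi_{v}^{*}(\mathbf{r}')\,
     W(\mathbf{r},\mathbf{r}';\omega=0), \\
\Sigma_{\mathrm{COH}}(\mathbf{r},\mathbf{r}')
  &= \tfrac{1}{2}\,\delta(\mathbf{r}-\mathbf{r}')
     \left[ W(\mathbf{r},\mathbf{r}';\omega=0) - v(\mathbf{r},\mathbf{r}') \right], \\
\Sigma_{\mathrm{COHSEX}} &= \Sigma_{\mathrm{SEX}} + \Sigma_{\mathrm{COH}}.
\end{align}
```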
Abstract:
We develop a new method to study the thermalization of the time-dependent retarded Green function in conformal field theories holographically dual to thin-shell AdS-Vaidya spacetimes. The method relies on using the information of all time derivatives of the Green function at the shell and then evolving it to later times. The time derivatives of the Green function at the shell are given in terms of a recursion formula. Using this method we obtain analytic results for the short-time thermalization of the Green function. We show that the late-time behaviour of the Green function is determined by the first quasinormal mode. We then implement the method numerically. As applications of this method we study the thermalization of the retarded time-dependent Green function corresponding to a minimally coupled scalar in the AdS_3 and AdS_5 thin Vaidya shells. We see that, as expected, the late-time behaviour is determined by the first quasinormal mode. We apply the method to study the late-time behaviour of the shear vector mode in the AdS_5 Vaidya shell. At small momentum the corresponding time-dependent Green function is expected to relax to equilibrium via the shear hydrodynamic mode. Using this we obtain the universal ratio of the shear viscosity to entropy density from a time-dependent process.
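Schematically, and as standard background rather than equations quoted from the paper, the late-time relaxation and the universal hydrodynamic ratio referred to above take the form:

```latex
% Late-time decay set by the lowest quasinormal frequency omega_0, and the
% universal shear viscosity to entropy density ratio (standard results).
\begin{align}
G_R(t) &\sim e^{-i\,\omega_0\, t} \quad \text{as } t \to \infty, \\
\frac{\eta}{s} &= \frac{1}{4\pi} \qquad (\hbar = k_B = 1).
\end{align}
```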
Abstract:
Human provisioning of wildlife with food is a widespread global practice that occurs in multiple socio-cultural circumstances. Provisioning may indirectly alter ecosystem functioning through changes in the eco-ethology of animals, but few studies have quantified this aspect. Provisioning of primates by humans is known to affect their activity budgets, diets and ranging patterns. Primates are also keystone species in tropical forests through their role as seed dispersers, yet there is no information on how provisioning might affect primate ecological functions. The rhesus macaque is a major human-commensal species but is also an important seed disperser in the wild. In this study, we investigated the potential impacts of provisioning on the role of rhesus macaques as seed dispersers in the Buxa Tiger Reserve, India. We studied a troop of macaques that was provisioned for part of the year and depended on natural resources for the rest. We observed the feeding behaviour, seed handling techniques and ranging patterns of the macaques and monitored the availability of wild fruits. Irrespective of fruit availability, frugivory and seed dispersal activities decreased when the macaques were provisioned. Provisioned macaques also had shortened daily ranges, implying shorter dispersal distances. Finally, during provisioning periods, seeds were deposited on tarmac roads, where conditions were not conducive to germination. Provisioning promotes human-primate conflict, as commensal primates are often involved in aggressive encounters with humans over resources, leading to negative consequences for both parties. Preventing or curbing provisioning is not an easy task, as feeding wild animals is a socio-cultural tradition across much of South and South-East Asia, including India. We recommend the initiation of literacy programmes that educate lay citizens about the ill effects of provisioning and strongly caution them against the practice.
Abstract:
Background: Gene expression technologies have opened up new ways to diagnose and treat cancer and other diseases. Clustering algorithms are a useful approach with which to analyze genome expression data. They attempt to partition the genes into groups exhibiting similar patterns of variation in expression level. An important problem associated with gene classification is to discern whether the clustering process can find a relevant partition, as well as the identification of new gene classes. There are two key aspects to classification: the estimation of the number of clusters, and the decision as to whether a new unit (gene, tumor sample, ...) belongs to one of these previously identified clusters or to a new group. Results: ICGE is a user-friendly R package which provides many functions related to this problem: identifying the number of clusters using mixed variables (as usually encountered by applied biomedical researchers); detecting whether the data have a cluster structure; determining whether a new unit belongs to one of the pre-identified clusters or to a novel group; and classifying new units into the corresponding cluster. The functions in the ICGE package are accompanied by help files and simple examples to facilitate its use. Conclusions: We demonstrate the utility of ICGE by analyzing simulated and real data sets. The results show that ICGE could be very useful to a broad research community.
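The ICGE functions themselves are not reproduced here; as a generic sketch of the two tasks the package addresses (estimating the number of clusters and assigning a new unit), the R example below uses hierarchical clustering with the average silhouette width and a nearest-centroid rule on simulated data. All names and data are illustrative and do not represent the package's API.

```r
# Generic illustration (not the ICGE API): estimate a plausible number of
# clusters via average silhouette width, then assign a new unit to the
# cluster with the nearest centroid.
library(cluster)                       # for silhouette(); shipped with R
set.seed(42)
expr <- rbind(matrix(rnorm(100, 0), 20), matrix(rnorm(100, 3), 20))  # 40 "genes" x 5 features

d  <- dist(expr)
hc <- hclust(d, method = "average")
sil_width <- sapply(2:6, function(k) mean(silhouette(cutree(hc, k), d)[, "sil_width"]))
k_best   <- (2:6)[which.max(sil_width)]   # estimated number of clusters
clusters <- cutree(hc, k_best)

# Nearest-centroid assignment of a new unit
new_unit  <- rnorm(5, 3)
centroids <- apply(expr, 2, function(col) tapply(col, clusters, mean))
dists     <- apply(centroids, 1, function(ctr) sqrt(sum((new_unit - ctr)^2)))
cat("Estimated k:", k_best, "- new unit assigned to cluster", which.min(dists), "\n")
```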
Abstract:
Data were taken in 1979-80 by the CCFRR high-energy neutrino experiment at Fermilab. A total of 150,000 neutrino and 23,000 antineutrino charged-current events in the approximate energy range 25 < E_ν < 250 GeV are measured and analyzed. The structure functions F_2 and xF_3 are extracted for three assumptions about R = σ_L/σ_T: R = 0, R = 0.1, and R given by a QCD-based expression. Systematic errors are estimated and their significance is discussed. Comparisons of the x and Q^2 behaviour of the structure functions with results from other experiments are made.
We find that statistical errors currently dominate our knowledge of the valence quark distribution, which is studied in this thesis. xF_3 from different experiments has, within errors and apart from level differences, the same dependence on x and Q^2, except for the HPWF results. The CDHS F_2 shows a clear fall-off at low x relative to the CCFRR and EMC results, again apart from level differences, which are calculable from the cross-sections.
The result for the GLS sum rule is found to be 2.83 ± 0.15 ± 0.09 ± 0.10, where the first error is statistical, the second is an overall level error, and the third covers the rest of the systematic errors. QCD studies of xF_3 to leading and second order have been performed. The QCD evolution of xF_3, which is independent of R and the strange sea, does not depend on the gluon distribution, and fits yield
Λ_(LO) = 88^(+163)_(-78) ^(+113)_(-70) MeV
The systematic errors are smaller than the statistical errors. Second-order fits give somewhat different values of Λ, although α_s (at Q^2_0 = 12.6 GeV^2) is not very different.
A fit using the better-determined F_2 in place of xF_3 for x > 0.4, i.e., assuming q̄ = 0 in that region, gives
Λ_(LO) = 266^(+114)_(-104) ^(+85)_(-79) MeV
Again, the statistical errors are larger than the systematic errors. An attempt to measure R was made, and the measurements are described. Utilizing the inequality q̄(x) ≥ 0, we find that in the region x > 0.4, R is less than 0.55 at the 90% confidence level.
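For context, the GLS (Gross-Llewellyn Smith) sum rule quoted above is the integral of F_3 over x, which counts the valence quarks; its standard form with the leading QCD correction (stated for orientation, not taken from the thesis) is:

```latex
% Gross-Llewellyn Smith sum rule with the leading perturbative QCD correction.
\begin{equation}
\int_0^1 \frac{xF_3(x,Q^2)}{x}\,dx
  = 3\left[\, 1 - \frac{\alpha_s(Q^2)}{\pi} + \mathcal{O}\!\left(\alpha_s^2\right) \right].
\end{equation}
```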
Abstract:
A locally integrable function is said to be of vanishing mean oscillation (VMO) if its mean oscillation over cubes in R^d converges to zero with the volume of the cubes. We establish necessary and sufficient conditions for a locally integrable function defined on a bounded measurable set of positive measure to be the restriction to that set of a VMO function.
We consider the similar extension problem pertaining to BMO(ρ) functions; that is, those VMO functions whose mean oscillation over any cube Q is O(ρ(l(Q))), where l(Q) is the side length of Q and ρ is a positive, non-decreasing function with ρ(0+) = 0.
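In symbols, writing O(f,Q) for the mean oscillation of f over a cube Q, the two conditions above read as follows (standard definitions, stated here for convenience):

```latex
% Mean oscillation over a cube Q, and the VMO and BMO(rho) conditions.
\begin{align}
\mathcal{O}(f,Q) &= \frac{1}{|Q|}\int_Q \bigl|f(x) - f_Q\bigr|\,dx,
  \qquad f_Q = \frac{1}{|Q|}\int_Q f(y)\,dy, \\
f \in \mathrm{VMO} &\iff \lim_{|Q|\to 0}\, \mathcal{O}(f,Q) = 0, \\
f \in \mathrm{BMO}(\rho) &\iff \mathcal{O}(f,Q) = O\bigl(\rho(l(Q))\bigr) \text{ over all cubes } Q.
\end{align}
```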
We apply these results to obtain sufficient conditions for a Blaschke sequence to be the zeros of an analytic BMO(ρ) function on the unit disc.
Abstract:
Recent technological advances have raised the level of qualification required of researchers in epidemiology. The strategic role of education cannot be ignored. Nevertheless, the Brazilian Association of Graduate Studies in Collective Health (ABRASCO), in its latest strategic plan (2005-2009), points to the limited value placed on the production of didactic-pedagogical material and, moreover, to the lack of a policy for the development and use of free software in the teaching of epidemiology. It is therefore opportune to invest in a relational perspective, along the lines proposed by the constructivist school, since this theory has been recognized as the most appropriate for the development of computer-based teaching materials. In this sense, promoting interactive courses and, within them, developing related teaching material is both timely and fruitful. With respect to the policy of developing and using free software in the teaching of epidemiology, particularly in applied statistics, R has emerged as software of growing interest: not only because it avoids possible penalties for the unlicensed use of commercial software, but also because open access to its code and programming makes it an excellent tool for producing teaching material in the form of hyperdocuments, an important foundation for the much-desired teacher-student interaction in the classroom. The main objective is to develop teaching material in R for courses on biostatistics applied to epidemiological analysis. Because certain statistical functions are not implemented in R, the programming of additional functions was also included. The courses used in developing this material were based on the disciplines Uma Introdução à Plataforma R para Modelagem Estatística de Dados (An Introduction to the R Platform for Statistical Data Modelling) and Instrumento de Aferição em Epidemiologia I: Teoria Clássica de Medidas (Análise) (Measurement Instruments in Epidemiology I: Classical Measurement Theory (Analysis)), offered by the Department of Epidemiology, Institute of Social Medicine (IMS) of the State University of Rio de Janeiro (UERJ). The theoretical-pedagogical basis was defined from constructivist principles, in which the individual is an active and critical agent of his or her own knowledge, constructing meaning from personal experience. In line with this constructivist outlook, the problematization teaching methodology was followed, covering problems drawn from real situations and systematized in writing. The computational methods, in turn, were based on the New Information and Communication Technologies (NICT). NICTs pursue the consolidation of more flexible curricula, adapted to students' differing learning characteristics. The NICTs were implemented through hypertext, a structure of texts interconnected by nodes or links, forming a network of related information. During the design of the teaching material, changes were made to the basic interface of the R help system to ensure student-material interactivity. The instructional text itself is composed of blocks that encourage discussion and the exchange of information between teacher and students.
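The abstract notes that some statistical functions had to be programmed because they are not implemented in R. As a purely hypothetical illustration of that kind of helper (the function name, arguments and counts below are ours, not the thesis's), the sketch computes a risk ratio with a Wald confidence interval from the four cells of a 2x2 table:

```r
# Hypothetical example (not taken from the thesis) of an "additional function":
# risk ratio with a Wald confidence interval from a 2x2 exposure-disease table.
risk_ratio <- function(a, b, c, d, conf_level = 0.95) {
  # a: exposed & diseased, b: exposed & healthy,
  # c: unexposed & diseased, d: unexposed & healthy
  r1 <- a / (a + b)                                   # risk among the exposed
  r0 <- c / (c + d)                                   # risk among the unexposed
  rr <- r1 / r0
  se_log <- sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))   # standard error of log(RR)
  z  <- qnorm(1 - (1 - conf_level) / 2)
  list(risk_ratio = rr,
       lower = exp(log(rr) - z * se_log),
       upper = exp(log(rr) + z * se_log))
}

risk_ratio(30, 70, 10, 90)   # illustrative counts only
```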
Abstract:
There are around 27 species of the amphibian genus Amolops distributed in South-east Asia. Seven antimicrobial peptides (AMPs) belonging to two different families were purified from the skin of the rufous-spotted torrent frog, Amolops loloensis, and designated brevinins
Abstract:
Bombesin-like peptides (BLPs) are a family of neuroendocrine peptides that mediate a variety of biological activities. Three mature BLPs were purified from the skin secretions of the frog Odorrana grahami. Several bombesin-like peptide cDNA sequences enc
Abstract:
Kolmogorov's two-thirds law, ⟨(Δv)^2⟩ ∼ ε^(2/3) r^(2/3), and five-thirds law, E(k) ∼ ε^(2/3) k^(-5/3), are formally equivalent in the limit of vanishing viscosity, ν → 0. However, for most Reynolds numbers encountered in laboratory-scale experiments or numerical simulations, it is invariably easier to observe the five-thirds law. By creating artificial fields of isotropic turbulence composed of a random sea of Gaussian eddies whose size and energy distribution can be controlled, we show why this is the case. The energy of eddies of scale s is shown to vary as s^(2/3), in accordance with Kolmogorov's 1941 law, and we vary the range of scales, γ = s_max/s_min, in any one realisation from γ = 25 to γ = 800. This is equivalent to varying the Reynolds number in an experiment from R_λ = 60 to R_λ = 600. While there is some evidence of a five-thirds law for γ > 50 (R_λ > 100), the two-thirds law only starts to become apparent when γ approaches 200 (R_λ ∼ 240). The reason for this discrepancy is that the second-order structure function is a poor filter, mixing information about energy and enstrophy, and from scales larger and smaller than r. In particular, in the inertial range, ⟨(Δv)^2⟩ takes the form of a mixed power law, a_1 + a_2 r^2 + a_3 r^(2/3), where a_2 r^2 tracks the variation in enstrophy and a_3 r^(2/3) the variation in energy. These findings are shown to be consistent with experimental data, where the pollution of the r^(2/3) law by the enstrophy contribution, a_2 r^2, is clearly evident. We show that higher-order structure functions (of even order) suffer from a similar deficiency.