37 results for "The issue of autonomy"


Relevance:

100.00%

Publisher:

Abstract:

It is the policy of Solid State Communications' Executive Editorial Board to organize special issues from time to time on topics of current interest. The present issue focuses on soft condensed matter, a rapidly developing and diverse area of importance not only for basic science but also for its potential applications. The ten articles in this issue are intended to give readers a snapshot of some of the latest developments in soft condensed matter, mainly from the point of view of basic science. As the special issues are intended for a broad audience, most articles are short reviews that introduce readers to the relevant topics. This special issue should therefore be especially helpful to readers who are not specialists in the area but would like a quick grasp of some of its interesting research directions.


The method proposed here considers the mean flow in the transition zone as a linear combination of the laminar and turbulent boundary layer in proportions determined by the transitional intermittency, the component flows being calculated by approximate integral methods. The intermittency distribution adopted takes into account the possibility of subtransitions within the zone in the presence of strong pressure gradients. A new nondimensional spot formation rate, whose value depends on the pressure gradient, is utilized to estimate the extent of the transition zone. Onset location is determined by a correlation that takes into account freestream turbulence and facility-specific residual disturbances in test data. Extensive comparisons with available experimental results in strong pressure gradients show that the proposed method performs at least as well as differential models, in many cases better, and is always faster.
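The linear-combination idea can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: the intermittency form below is the Dhawan-Narasimha universal distribution, assumed here for concreteness, and `lam` is an assumed length scale for the extent of the transition zone.

```python
import math

def intermittency(x, x_onset, lam):
    """Universal intermittency distribution gamma(x) = 1 - exp(-0.412 * xi**2),
    with xi = (x - x_onset) / lam (assumed form; lam sets the zone extent)."""
    if x <= x_onset:
        return 0.0
    xi = (x - x_onset) / lam
    return 1.0 - math.exp(-0.412 * xi * xi)

def transitional_mean(q_lam, q_turb, gamma):
    """Mean transitional quantity as a gamma-weighted linear combination of
    the separately computed laminar and turbulent boundary-layer values."""
    return (1.0 - gamma) * q_lam + gamma * q_turb
```

In a full calculation, `q_lam` and `q_turb` would come from the approximate integral methods mentioned above, evaluated at the same station.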


The modes of binding of Gp(2',5')A, Gp(2',5')C, Gp(2',5')G and Gp(2',5')U to RNase T1 have been determined by computer modelling. All these dinucleoside phosphates adopt extended conformations in the active site, leading to better interactions with the enzyme. The 5'-terminal guanine of each ligand is placed in the primary base-binding site of the enzyme in an orientation similar to that of 2'-GMP in the RNase T1-2'-GMP complex. The 2'-terminal purines lie close to the hydrophobic pocket formed by the residues Gly71, Ser72, Pro73 and Gly74, which occur in a loop region. The orientation of the 2'-terminal pyrimidines, however, differs from that of the 2'-terminal purines, which perhaps explains why the 2',5'-linked guanine dinucleoside phosphates with 2'-terminal purines bind more strongly than those with 2'-terminal pyrimidines. A comparison of the binding of the guanine dinucleoside phosphates with 2',5'- and 3',5'-linkages suggests significant differences in ribose pucker and in the hydrogen-bonding interactions between the catalytic residues and the bound nucleoside phosphate, implying that 2',5'-linked dinucleoside phosphates may not be ideal ligands for probing the role of the catalytic amino acid residues. A change in the amino acid sequence of the surface loop formed by residues Gly71 to Gly74 drastically affects the conformation of the base-binding subsite; this may account for the inactivity of the enzyme with the altered sequence, i.e. with Pro, Gly and Ser at positions 71 to 73, respectively. These results suggest that, in addition to the recognition and catalytic sites, interactions at the loop regions constituting the base-binding subsite are also crucial in determining substrate specificity.


An analysis of gas-particle nozzle flow is carried out, with attention to the effect of dust particles on vibrational relaxation and the consequent effect on the gain of a gasdynamic laser. Phase nonequilibrium between the gas mixture and the particles during the nozzle expansion is taken into account throughout. The governing equations of the two-phase nozzle flow are transformed into similar form, and general correlating parameters are obtained. The analysis shows that particles present in the mixture adversely affect the optimum gain obtainable from a gasdynamic laser, and that the effect depends on the size and loading of the particles in the mixture.


Indian logic has a long history. It broadly covers the domains of two of the six schools (darsanas) of Indian philosophy, namely Nyaya and Vaisesika. The generally accepted definition of Indian logic over the ages is the science which ascertains valid knowledge either by means of the six senses or by means of the five members of the syllogism; in other words, perception and inference constitute the subject matter of logic. The science of logic evolved in India through three ages: the ancient, the medieval and the modern, spanning almost thirty centuries. Advances in Computer Science, and in Artificial Intelligence in particular, have in the past three decades drawn researchers in these areas to the basic problems of language, logic and cognition. In the 1980s, Artificial Intelligence evolved into knowledge-based and intelligent system design, and the knowledge base and inference engine became standard subsystems of an intelligent system. One important issue in the design of such systems is acquiring knowledge from humans who are experts in a branch of learning (such as medicine or law) and transferring that knowledge to a computing system. A second important issue is validating the knowledge base of the system, i.e. ensuring that the knowledge is complete and consistent. It is in this context that a comparative study of Indian logic with recent theories of logic, language and knowledge engineering will help computer scientists understand the deeper implications of the terms and concepts they are currently using and attempting to develop.
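The two system-design issues just mentioned can be illustrated with a toy example (all names below are mine, not from the article): a minimal forward-chaining inference engine over Horn-style rules, plus a consistency check that flags a knowledge base deriving both a fact and its negation, encoded here as `not_<fact>`.

```python
def forward_chain(facts, rules):
    """Compute the closure of a fact set under rules of the form
    (set_of_premises, conclusion), by repeated forward chaining."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def is_consistent(derived):
    """A knowledge base is inconsistent if it derives both X and not_X."""
    return not any(f.startswith("not_") and f[4:] in derived for f in derived)

# Toy medical rule base (fabricated for illustration)
kb = forward_chain({"fever", "rash"},
                   [({"fever", "rash"}, "measles"),
                    ({"measles"}, "contagious")])
```

Completeness checking, the other half of validation, would ask whether every expected conclusion is reachable from some admissible fact set; it is omitted here for brevity.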


Bacteriorhodopsin has been the subject of intense study aimed at understanding its photochemical function. The recent atomic model proposed by Henderson and coworkers, based on electron cryo-microscopy, has helped explain many structural and functional aspects of bacteriorhodopsin. However, the positions of the side chains are not known to high accuracy, since the model is based on low-resolution data. In this study, we have energy-minimized this structure of bacteriorhodopsin and analyzed various types of interactions, such as intrahelical and interhelical hydrogen bonds and the retinal environment. Understanding the photochemical action requires information on the structures adopted in the intermediate states. To this end, we generated some intermediate structures by computer modeling, taking certain experimental data into account. Various isomers of retinal with 13-cis and/or 15-cis conformations, and all possible staggered orientations of the Lys-216 side chain, were generated. The resulting structures were examined for the distance between the Lys-216 Schiff-base nitrogen and the carboxylate oxygen atoms of Asp-96, a residue known to reprotonate the Schiff base at later stages of the photocycle. Some structures were selected on the basis of suitable retinal orientation, and their stability was tested by energy minimization. The minimized structures were further analyzed for hydrogen-bond interactions and the retinal environment, and the results were compared with those of the minimized resting-state structure. The importance of functional groups in stabilizing the structure of bacteriorhodopsin and in participating dynamically in the photocycle is discussed.
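The screening step described above can be sketched as a simple distance filter. This is a hypothetical illustration with fabricated coordinates; a real run would rebuild atomic positions for each retinal isomer and Lys-216 rotamer from the structural model.

```python
import math

def min_n_to_o_distance(n_pos, carboxylate_oxygens):
    """Shortest distance (angstroms) from the Schiff-base nitrogen to any
    carboxylate oxygen of Asp-96."""
    return min(math.dist(n_pos, o) for o in carboxylate_oxygens)

def screen_rotamers(candidates, carboxylate_oxygens, cutoff=4.0):
    """Keep candidate conformers (label -> Schiff-base N position) whose
    N...O distance falls below the cutoff."""
    return [label for label, n_pos in candidates.items()
            if min_n_to_o_distance(n_pos, carboxylate_oxygens) < cutoff]

# Fabricated example geometry, NOT taken from the bacteriorhodopsin model
asp96_oxygens = [(0.0, 0.0, 0.0), (1.2, 0.8, 0.0)]
candidates = {"13-cis/t": (2.0, 1.0, 2.5),
              "13-cis,15-cis/g+": (8.0, 6.0, 4.0)}
kept = screen_rotamers(candidates, asp96_oxygens)
```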


We present an extensive study of the magnetic and transport properties of La0.85Sr0.15CoO3 single crystals grown by the float-zone method, to address the issue of phase separation versus spin-glass (SG) behavior. The dc magnetization study reveals a kink in the field-cooled magnetization, and the peak in the zero-field-cooled curve shifts to lower temperature at modest dc fields, indicating an SG magnetic phase. The ac susceptibility study exhibits a considerable frequency-dependent peak shift (~4 K) and a time-dependent memory effect below the freezing temperature. In addition, the characteristic time scale tau_0 estimated from the frequency-dependent ac susceptibility measurement is found to be ~10^-13 s, which matches well with typical values observed in canonical SG systems. The transport relaxation study clearly demonstrates time-dependent glassy phenomena. In essence, all our experimental results corroborate the existence of SG behavior in La0.85Sr0.15CoO3 single crystals.
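Estimates of this kind are conventionally obtained by fitting the frequency-dependent freezing temperatures to the critical slowing-down law tau = tau_0 * ((T_f - T_g) / T_g)^(-z*nu). A minimal sketch of such a fit on synthetic data (the numbers below are fabricated, not the paper's measurements):

```python
import numpy as np

def fit_critical_slowing(Tf, tau, Tg):
    """Fit tau = tau0 * ((Tf - Tg) / Tg) ** (-z_nu) linearly in log-log space;
    returns (tau0, z_nu)."""
    t = (Tf - Tg) / Tg                        # reduced temperature
    slope, intercept = np.polyfit(np.log(t), np.log(tau), 1)
    return np.exp(intercept), -slope

# Synthetic freezing data generated with tau0 = 1e-13 s, z_nu = 8, Tg = 60 K
Tg = 60.0
Tf = np.array([61.0, 61.5, 62.0, 63.0, 64.0])
tau = 1e-13 * ((Tf - Tg) / Tg) ** (-8.0)
tau0, z_nu = fit_critical_slowing(Tf, tau, Tg)
```

In practice T_f comes from the ac susceptibility peak at each drive frequency (tau = 1/omega), and T_g and the exponent are fitted alongside tau_0.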


A "plan diagram" is a pictorial enumeration of the execution plan choices of a database query optimizer over the relational selectivity space. We have shown recently that, for industrial-strength database engines, these diagrams are often remarkably complex and dense, with a large number of plans covering the space. However, they can often be reduced to much simpler pictures, featuring significantly fewer plans, without materially affecting the query processing quality. Plan reduction has useful implications for the design and usage of query optimizers, including quantifying redundancy in the plan search space, enhancing usability of parametric query optimization, identifying error-resistant and least-expected-cost plans, and minimizing the overheads of multi-plan approaches. We investigate here the plan reduction issue from theoretical, statistical and empirical perspectives. Our analysis shows that optimal plan reduction, w.r.t. minimizing the number of plans, is an NP-hard problem in general, and remains so even for a storage-constrained variant. We then present a greedy reduction algorithm with tight and optimal performance guarantees, whose complexity scales linearly with the number of plans in the diagram for a given resolution. Next, we devise fast estimators for locating the best tradeoff between the reduction in plan cardinality and the impact on query processing quality. Finally, extensive experimentation with a suite of multi-dimensional TPC-H-based query templates on industrial-strength optimizers demonstrates that complex plan diagrams easily reduce to "anorexic" (small absolute number of plans) levels incurring only marginal increases in the estimated query processing costs.
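The greedy idea can be sketched as follows. This is a hypothetical illustration, not the paper's exact algorithm or its guarantees: a plan is swallowed by a surviving plan if, at every selectivity point it covers, the replacement's estimated cost stays within a relative cost-increase threshold `lam`.

```python
def reduce_plans(cost, assignment, lam=0.1):
    """Greedily merge plans in a plan diagram.

    cost:       {plan: {point: estimated cost}} over the full selectivity grid
    assignment: {point: optimizer-chosen plan}
    lam:        allowed relative cost increase when a point is reassigned
    """
    plans = set(assignment.values())
    changed = True
    while changed:
        changed = False
        for p in sorted(plans):
            pts = [q for q, a in assignment.items() if a == p]
            for r in sorted(plans - {p}):
                # r swallows p only if it is near-optimal everywhere p was chosen
                if all(cost[r][q] <= (1 + lam) * cost[p][q] for q in pts):
                    for q in pts:
                        assignment[q] = r
                    plans.discard(p)
                    changed = True
                    break
            if changed:
                break
    return assignment

# Two-point toy diagram: plan B is within 10% of plan A everywhere
costs = {"A": {1: 10.0, 2: 10.0}, "B": {1: 10.5, 2: 10.5}}
reduced = reduce_plans(costs, {1: "A", 2: "B"}, lam=0.1)
```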


Biochemical pathways involving chemical kinetics at medium concentrations of the reacting molecules (i.e., at mesoscale) can be approximated as systems of chemical Langevin equations (CLEs). We address the physically consistent, non-negative simulation of CLE sample paths, the issue of non-Lipschitz diffusion coefficients when a species approaches depletion, and any stiffness due to faster reactions. We propose and analyse the non-negative Fully Implicit Stochastic alpha (FIS alpha) method, in which reaction channels stopped by depleted reactants are deleted until the reactant concentration rises again, preserving non-negativity, and a positive-definite Jacobian is maintained to deal with possible stiffness. The method is illustrated by computing the active Protein Kinase C response in the Protein Kinase C pathway. (C) 2011 Elsevier Inc. All rights reserved.
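For orientation, the CLE form itself can be illustrated with a plain explicit Euler-Maruyama step for a single degradation reaction X -> 0. This is deliberately not the paper's implicit FIS alpha scheme; it shows only the drift-plus-sqrt(propensity)-noise structure of the CLE and the naive clamp whose shortcomings motivate a properly non-negative method.

```python
import math
import random

def cle_degradation(x0, k, dt, steps, rng):
    """Explicit Euler-Maruyama on dX = -a(X) dt + sqrt(a(X)) dW for the
    degradation reaction X -> 0 with mass-action propensity a(x) = k * x."""
    x = x0
    path = [x]
    for _ in range(steps):
        a = k * x                               # propensity (assumed mass action)
        dw = rng.gauss(0.0, math.sqrt(dt))      # Wiener increment
        x = x - a * dt + math.sqrt(max(a, 0.0)) * dw
        x = max(x, 0.0)                         # naive non-negativity clamp
        path.append(x)
    return path

rng = random.Random(0)
path = cle_degradation(1000.0, 0.1, 0.01, 500, rng)
```

Note that near depletion sqrt(a) is non-Lipschitz and the clamp distorts the statistics, which is precisely the regime the implicit, channel-stopping treatment above is designed to handle.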


This article compares the land use in solar energy technologies with conventional energy sources. This has been done by introducing two parameters called land transformation and land occupation. It has been shown that the land area transformed by solar energy power generation is small compared to hydroelectric power generation, and is comparable with coal and nuclear energy power generation when life-cycle transformations are considered. We estimate that 0.97% of total land area or 3.1% of the total uncultivable land area of India would be required to generate 3400 TWh/yr from solar energy power systems in conjunction with other renewable energy sources.
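The order of magnitude of the land requirement can be checked with a back-of-envelope calculation. The insolation and land-to-electricity efficiency below are my assumed round numbers, not the article's parameters.

```python
def land_area_km2(annual_twh, insolation_kwh_m2=2000.0, land_efficiency=0.05):
    """Land area needed to generate annual_twh per year, given annual
    insolation (kWh/m^2/yr) and an overall land-to-electricity conversion
    factor (module efficiency x packing fraction x system losses)."""
    energy_kwh = annual_twh * 1e9                 # 1 TWh = 1e9 kWh
    yield_kwh_per_m2 = insolation_kwh_m2 * land_efficiency
    return energy_kwh / yield_kwh_per_m2 / 1e6    # m^2 -> km^2

area = land_area_km2(3400.0)
india_total_km2 = 3.287e6                         # India's land area (approx.)
fraction = area / india_total_km2
```

With these assumptions the result is roughly 34,000 km^2, about 1% of India's land area, consistent with the article's 0.97% figure.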


We revisit the issue of considering stochasticity of Grassmannian coordinates in N = 1 superspace, analyzed previously by Kobakhidze et al. In this stochastic supersymmetry (SUSY) framework, the soft SUSY-breaking terms of the minimal supersymmetric Standard Model (MSSM), such as the bilinear Higgs mixing, the trilinear coupling, and the gaugino mass parameters, are all proportional to a single mass parameter xi, a measure of the supersymmetry breaking arising from stochasticity. While a nonvanishing trilinear coupling at the high scale is a natural outcome of the framework, and a favorable signature for obtaining the lighter Higgs boson mass m_h at 125 GeV, the model produces tachyonic sleptons, or staus that turn out too light. Previous analyses took Lambda, the scale at which the input parameters are given, to be larger than the gauge coupling unification scale M_G in order to generate acceptable scalar masses radiatively at the electroweak scale; even so, this was inadequate for obtaining m_h at 125 GeV. We find that a Higgs at 125 GeV is readily achievable, provided we are prepared to accommodate a nonvanishing scalar-mass soft SUSY-breaking term, similar to what is done in minimal anomaly-mediated SUSY breaking (AMSB), in contrast to a pure AMSB setup. The model can then easily accommodate the Higgs data, LHC limits on squark masses, WMAP data for the dark matter relic density, flavor physics constraints, and XENON100 data. In contrast to the previous analyses, we take Lambda = M_G, thus avoiding any ambiguities of post-grand-unified-theory physics. The idea of stochastic superspace can easily be generalized to various scenarios beyond the MSSM. DOI: 10.1103/PhysRevD.87.035022


The last decade has witnessed two unusually large tsunamigenic earthquakes. The devastation from the 2004 Sumatra-Andaman and the 2011 Tohoku-Oki earthquakes (both of moment magnitude >= 9.0) and their ensuing tsunamis is a harsh reminder of the need to assess and mitigate coastal hazards due to earthquakes and tsunamis worldwide. Along any given subduction zone, megathrust tsunamigenic earthquakes recur over intervals considerably longer than their documented histories, and thus 2004-type events may appear totally 'out of the blue'. To understand and assess the risk from tsunamis, we need to know their long-term frequency and magnitude, going beyond documented history to recent geological records. The ability to do this depends on our knowledge of the processes that govern subduction zones and their responses to interseismic and coseismic deformation, and on our expertise in identifying tsunami deposits and relating them to earthquake sources. In this article, we review the current state of understanding of the recurrence of great thrust earthquakes along global subduction zones.