924 results for Critical value


Relevance:

60.00%

Publisher:

Abstract:

In order to quantify quantum entanglement in two-impurity Kondo systems, we calculate the concurrence, negativity, and von Neumann entropy. The entanglement of the two Kondo impurities is shown to be determined by two competing many-body effects, namely the Kondo effect and the Ruderman-Kittel-Kasuya-Yosida (RKKY) interaction, I. Due to the spin-rotational invariance of the ground state, the concurrence and negativity are uniquely determined by the spin-spin correlation between the impurities. It is found that there exists a critical minimum value of the antiferromagnetic correlation between the impurity spins which is necessary for entanglement of the two impurity spins. The critical value is discussed in relation to the unstable fixed point in the two-impurity Kondo problem. Specifically, at the fixed point there is no entanglement between the impurity spins. Entanglement will only be created [and quantum information processing (QIP) will only be possible] if the RKKY exchange energy, I, is at least several times larger than the Kondo temperature, T_K. Quantitative criteria for QIP are given in terms of the impurity spin-spin correlation.
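As a companion to the abstract above, the sketch below (not the authors' code) evaluates the standard closed-form relations for a spin-rotationally invariant two-qubit state written as a Werner-type mixture: concurrence and negativity then depend only on the spin-spin correlation ⟨S1·S2⟩, and entanglement appears only once the antiferromagnetic correlation passes -1/4, consistent with the critical minimum value the abstract refers to.

```python
import numpy as np

def entanglement_from_correlation(s12):
    """Concurrence and negativity of a spin-rotationally invariant
    two-qubit (two-impurity) state parameterized by s12 = <S1 . S2>,
    which ranges from -3/4 (singlet) to +1/4 (triplet)."""
    F = 0.25 - s12                          # singlet fidelity of the Werner-type state
    concurrence = max(0.0, 2.0 * F - 1.0)   # = max(0, -2*s12 - 1/2)
    negativity = max(0.0, F - 0.5)          # = concurrence / 2 for this family
    return concurrence, negativity

# Entanglement appears only for antiferromagnetic correlations below -1/4:
for s in (-0.75, -0.5, -0.25, 0.0, 0.25):
    C, N = entanglement_from_correlation(s)
    print(f"<S1.S2> = {s:+.2f}  ->  C = {C:.3f}, N = {N:.3f}")
```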

Relevance:

60.00%

Publisher:

Abstract:

The main objective of this research is to demonstrate the contribution of Franz Hinkelammert's concept of the subject to the study of religion. It aims to show the critical epistemological value of this concept, which is intelligible in light of the transcendental dialectical method discovered by Marx and developed by Hinkelammert, making it applicable to the study of the sciences of religion. The work seeks to answer the question posed by Boaventura de Sousa Santos about the possibility of valuing the emancipatory potential of rebellious subjectivities, with a view to overcoming the abstract conception of the subject held by the empirical sciences, whose scientific methodology is grounded in a Weberian-style neutral objectivity. To this end, the relationship between this conception and the human sacrifices that follow from it is analyzed. The invisibility, or resigned acceptance, of these sacrifices points to the epistemological need to adopt the concept of the subject as a scientific criterion of analysis and discernment, leading to the discovery and critique of the unconscious relational dynamics that govern societies abandoned to the inertia of their structures. This is a concept that implies a subjective theology in which God becomes present as an accomplice in the victims' resistance against their oppressors, as well as a non-religious criterion that leads to an autonomous ethics oriented toward a humanizing religious praxis.

Relevance:

60.00%

Publisher:

Abstract:

We study a variation of the graph coloring problem on random graphs of finite average connectivity. Given the number of colors, we aim to maximize the number of different colors at neighboring vertices (i.e., at one-edge distance) of any vertex. Two efficient algorithms, belief propagation and Walksat, are adapted to carry out this task. We present experimental results based on two types of random graphs for different system sizes and identify the critical value of the connectivity for the algorithms to find a perfect solution. The problem and the suggested algorithms have practical relevance, since various applications, such as distributed storage, can be mapped onto this problem.
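For illustration, here is a minimal Walksat-style local search for the neighborhood-color-diversity problem described above. It is a sketch, not the authors' implementation: the function name `walksat_diversity`, its parameters, and the reading of a "perfect solution" as every vertex seeing min(q, degree) distinct neighbor colors are assumptions made for this example.

```python
import random

def neighbor_color_count(graph, colors, v):
    """Number of distinct colors among the neighbors of v."""
    return len({colors[u] for u in graph[v]})

def walksat_diversity(graph, q, max_steps=100000, p_random=0.2, seed=0):
    """Walksat-style local search: repeatedly pick an 'unsatisfied' vertex
    (one whose neighborhood shows fewer than min(q, degree) distinct colors)
    and recolor one of its neighbors, either at random or greedily."""
    rng = random.Random(seed)
    colors = {v: rng.randrange(q) for v in graph}
    for _ in range(max_steps):
        unsat = [v for v in graph
                 if neighbor_color_count(graph, colors, v) < min(q, len(graph[v]))]
        if not unsat:
            return colors, True           # perfect solution found
        v = rng.choice(unsat)
        u = rng.choice(list(graph[v]))    # neighbor to recolor
        if rng.random() < p_random:
            colors[u] = rng.randrange(q)  # random walk move
        else:                             # greedy move: pick a color v is missing
            missing = set(range(q)) - {colors[w] for w in graph[v] if w != u}
            colors[u] = rng.choice(sorted(missing)) if missing else rng.randrange(q)
    return colors, False

# Small example: a 4-cycle with 2 colors.
g = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(walksat_diversity(g, q=2))
```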

Relevance:

60.00%

Publisher:

Abstract:

The flash-pattern evoked potential difference (F - P) in man increases with age (93 subjects), correlates with decreasing cognitive ability, and, when it exceeds a unique critical level, the subject is clinically diagnosed as having Alzheimer's disease. Aluminium accumulates in the human brain with age, increases the F - P value close to the critical value in a dose-dependent manner, and does so at such a rate that normal environmental exposure to aluminium accounts for all or nearly all of the F - P increase in man. Aluminium neurotoxicity is therefore a major cause of sporadic Alzheimer's disease.

Relevance:

60.00%

Publisher:

Abstract:

The problem of learning by examples in ultrametric committee machines (UCMs) is studied within the framework of statistical mechanics. Using the replica formalism we calculate the average generalization error in UCMs with L hidden layers and for a large enough number of units. In most of the regimes studied we find that the generalization error, as a function of the number of examples presented, develops a discontinuous drop at a critical value of the load parameter. We also find that when L>1 a number of teacher networks with the same number of hidden layers and different overlaps induce learning processes with the same critical points.
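The replica calculation itself is not reproduced here, but the following sketch illustrates the quantity being tracked: a Monte Carlo estimate of the generalization error between a teacher and a student committee machine. It uses an ordinary single-hidden-layer (non-ultrametric) committee machine purely as an illustration; all names, sizes, and the random teacher/student weights are hypothetical.

```python
import numpy as np

def committee_output(W, x):
    """Output of a committee machine: majority vote (sign of the sum)
    of K perceptron hidden units whose weights are the rows of W."""
    return np.sign(np.sum(np.sign(W @ x)))

def generalization_error(W_teacher, W_student, n_samples=20000, seed=0):
    """Monte Carlo estimate of the probability that teacher and student
    disagree on a random Gaussian input."""
    rng = np.random.default_rng(seed)
    N = W_teacher.shape[1]
    errors = 0
    for _ in range(n_samples):
        x = rng.standard_normal(N)
        if committee_output(W_teacher, x) != committee_output(W_student, x):
            errors += 1
    return errors / n_samples

# Example: K = 3 hidden units, N = 50 inputs, random teacher and student.
rng = np.random.default_rng(1)
K, N = 3, 50
W_t = rng.standard_normal((K, N))
W_s = rng.standard_normal((K, N))
print(f"estimated generalization error: {generalization_error(W_t, W_s):.3f}")
```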

Relevance:

60.00%

Publisher:

Abstract:

Bistability and hysteresis of magnetohydrodynamic dipolar dynamos generated by turbulent convection in rotating spherical fluid shells are demonstrated. Hysteresis appears as a transition between two distinct regimes of dipolar dynamos with rather different properties, including a pronounced difference in the amplitude of the axisymmetric poloidal field component and in the form of the differential rotation. The bistability occurs from the onset of dynamo action up to about 9 times the critical value of the Rayleigh number for the onset of convection, and over a wide range of values of the ordinary and the magnetic Prandtl numbers, including the value unity. Copyright © EPLA, 2009.
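The dynamo simulations themselves are far beyond a short snippet, but the notions of bistability and hysteresis can be illustrated with a generic toy model: an overdamped particle in a double-well potential whose tilt is swept quasi-statically up and then back down. The model, parameters, and sweep protocol below are assumptions of this illustration and are unrelated to the magnetohydrodynamic system of the abstract.

```python
import numpy as np

def settle(x0, h, steps=2000, dt=0.01):
    """Relax the overdamped dynamics dx/dt = -dV/dx for the tilted
    double-well potential V(x) = x**4/4 - x**2/2 - h*x."""
    x = x0
    for _ in range(steps):
        x += dt * (-(x**3) + x + h)
    return x

# Sweep the control parameter h up and then back down, always starting each
# step from the previously settled state (quasi-static sweep).
hs = np.linspace(-0.6, 0.6, 25)
x = settle(-1.0, hs[0])
up = []
for h in hs:
    x = settle(x, h)
    up.append(x)
down = []
for h in hs[::-1]:
    x = settle(x, h)
    down.append(x)

# The two branches disagree over an interval of h: bistability and hysteresis.
for h, xu, xd in zip(hs, up, down[::-1]):
    print(f"h = {h:+.2f}  up-branch x = {xu:+.2f}  down-branch x = {xd:+.2f}")
```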

Relevance:

60.00%

Publisher:

Abstract:

We present a data-based statistical study of the effects of seasonal variations on the growth rates of gastro-intestinal (GI) parasitic infection in livestock. The alluded growth rate is estimated through the variation in the number of eggs per gram (EPG) of faeces in animals. In accordance with earlier studies, our analysis too shows that rainfall is the dominant variable in determining EPG infection rates compared to other macro-parameters like temperature and humidity. Our statistical analysis clearly indicates an oscillatory dependence of EPG levels on rainfall fluctuations. Monsoon recorded the highest infection, an increase of at least 2.5 times over the next most infected period (summer). A least-squares fit of the EPG versus rainfall data indicates an approach towards a super-diffusive infection growth pattern (i.e., root-mean-square displacement growing faster than the square root of the elapsed time, as obtained for simple diffusion) in low-rainfall regimes (technically defined as zeroth-level dependence), which becomes remarkably augmented in large-rainfall zones. Our analysis further indicates that, for low fluctuations in temperature (true of the bulk data), the EPG level saturates beyond a critical value of the rainfall, a threshold that is expected to indicate the onset of the nonlinear regime. The probability density functions (PDFs) of the EPG data show oscillatory behavior in the large-rainfall regime (greater than 500 mm), the frequency of oscillation, once again, being determined by the ambient wetness (rainfall and humidity). Data recorded over three pilot projects spanning three measures of rainfall and humidity bear testimony to the universality of this statistical argument. © 2013 Chattopadhyay and Bandyopadhyay.
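As an illustration of the least-squares fit mentioned above, the sketch below fits a power law EPG ∝ rainfall^α on log-log axes. The data arrays are hypothetical placeholders, not the study's field data, and reading α > 0.5 as "super-diffusive-like" growth is an interpretive assumption of this example.

```python
import numpy as np

# Hypothetical (rainfall in mm, mean EPG) pairs standing in for field data.
rainfall = np.array([50., 120., 200., 310., 420., 560., 700.])
epg      = np.array([180., 320., 450., 640., 820., 1150., 1400.])

# Least-squares fit of a power law EPG ~ a * rainfall**alpha on log-log axes.
alpha, log_a = np.polyfit(np.log(rainfall), np.log(epg), 1)
print(f"fitted exponent alpha = {alpha:.2f}, prefactor a = {np.exp(log_a):.1f}")

# By analogy with diffusion, alpha > 0.5 corresponds to growth of infection
# levels with rainfall that is faster than a square-root dependence.
```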

Relevance:

60.00%

Publisher:

Abstract:

Using a modified deprivation (or poverty) function, in this paper we theoretically study the changes in poverty with respect to the 'global' mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219], whereas the log-normal distribution falls short of this requirement. Following these results, we make quantitative predictions to correlate a developing with a developed economy. © 2006 Elsevier B.V. All rights reserved.
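The paper's modified deprivation function is not reproduced here; instead, the sketch below uses a simple headcount ratio (fraction of the population below a fixed poverty line) under a log-normal income distribution to illustrate the directional claims: poverty falls as the mean income rises and rises as the variance grows. The poverty line and the mean/variance values are arbitrary choices made for this example.

```python
import numpy as np
from scipy.stats import norm

def headcount_poverty(mean_income, var_income, poverty_line):
    """Fraction of the population below the poverty line when income is
    log-normal with the given (arithmetic) mean and variance."""
    sigma2 = np.log(1.0 + var_income / mean_income**2)
    mu = np.log(mean_income) - 0.5 * sigma2
    return norm.cdf((np.log(poverty_line) - mu) / np.sqrt(sigma2))

z = 1.0  # poverty line (arbitrary units)
print("rising mean, fixed variance:",
      [round(headcount_poverty(m, 4.0, z), 3) for m in (2.0, 3.0, 4.0)])
print("rising variance, fixed mean:",
      [round(headcount_poverty(3.0, v, z), 3) for v in (1.0, 4.0, 9.0)])
```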

Relevance:

60.00%

Publisher:

Abstract:

Potential applications of high-damping and high-stiffness composites have motivated extensive research on the effects of negative-stiffness inclusions on the overall properties of composites. Recent theoretical advances have been based on the Hashin-Shtrikman composite models, one-dimensional discrete viscoelastic systems and a two-dimensional nested triangular viscoelastic network. In this paper, we further analyze the two-dimensional triangular structure containing pre-selected negative-stiffness components to study its underlying deformation mechanisms and stability. Major new findings are structure-deformation evolution with respect to the magnitude of negative stiffness under shear loading and the phenomena related to dissipation-induced destabilization and inertia-induced stabilization, according to Lyapunov stability analysis. The evolution shows strong correlations between stiffness anomalies and deformation modes. Our stability results reveal that stable damping peaks, i.e. stably extreme effective damping properties, are achievable under hydrostatic loading when the inertia is greater than a critical value. Moreover, destabilization induced by elemental damping is observed with the critical inertia. Regardless of elemental damping, when the inertia is less than the critical value, a weaker system instability is identified.
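The nested triangular network itself is not modeled here, but the stability criterion the abstract invokes can be illustrated generically: Lyapunov's indirect method judges stability from the real parts of the eigenvalues of the linearized state matrix. The two-mass spring-damper chain with one negative-stiffness element below is a hypothetical stand-in, not the paper's structure.

```python
import numpy as np

def state_matrix(M, C, K):
    """First-order state matrix for M q'' + C q' + K q = 0."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    return np.block([[np.zeros((n, n)), np.eye(n)],
                     [-Minv @ K,        -Minv @ C]])

def is_stable(M, C, K, tol=1e-10):
    """Lyapunov's indirect method: asymptotically stable if all eigenvalues
    of the linearized system have negative real parts (within a tolerance)."""
    eig = np.linalg.eigvals(state_matrix(M, C, K))
    return bool(np.all(eig.real < -tol)), eig

# Two-mass chain with one negative-stiffness spring (k2 < 0) in the middle.
# Making k2 more negative (e.g. k2 = -2.0) renders the chain unstable.
m, c = 1.0, 0.1
k1, k2, k3 = 2.0, -0.8, 2.0
M = np.diag([m, m])
C = np.diag([c, c])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2 + k3]])
stable, eig = is_stable(M, C, K)
print("stable:", stable)
print("eigenvalue real parts:", np.round(eig.real, 4))
```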

Relevance:

60.00%

Publisher:

Abstract:

Interactions between the wakes in a flow past a row of square bars are investigated by numerical simulations, linear stability analysis, and bifurcation analysis. It is assumed that the row of square bars is placed across a uniform flow; a two-dimensional, incompressible flow field is also assumed. The flow is steady and symmetric about a streamwise centerline through the center of each square bar at low Reynolds numbers. However, it becomes unsteady and periodic in time at Reynolds numbers larger than a critical value, and then the wakes behind the square bars become oscillatory. It is found by numerical simulations that vortices are shed synchronously from every pair of adjacent square bars, either in the same phase or in anti-phase depending upon the distance between the bars. The linear stability analysis clarifies that the synchronous shedding of vortices occurs due to an instability of the steady symmetric flow. The bifurcation diagram of the flow is obtained and the critical Reynolds number of the instability is evaluated numerically.
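The linear stability analysis itself requires discretizing and linearizing the flow equations, which is beyond a snippet; the sketch below only illustrates the final step of evaluating the critical Reynolds number, namely locating the zero crossing of the leading growth rate. The (Re, growth-rate) pairs are hypothetical placeholders.

```python
import numpy as np

# Hypothetical leading growth rates sigma(Re) from a linear stability analysis.
Re    = np.array([30.0, 40.0, 50.0, 60.0, 70.0])
sigma = np.array([-0.12, -0.05, 0.01, 0.06, 0.10])

# The steady symmetric flow loses stability where sigma changes sign;
# interpolate the zero crossing to estimate the critical Reynolds number.
Re_c = np.interp(0.0, sigma, Re)   # valid here because sigma is increasing
print(f"estimated critical Reynolds number: Re_c ~ {Re_c:.1f}")
```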

Relevance:

60.00%

Publisher:

Abstract:

The so-called "internal modes" localized near the domain boundaries in quasi-two-dimensional antiferromagnets are investigated. The possible localized states are classified and their frequency dependences on the system discreteness parameter λ = J/β, which describes the ratio of the magnitudes of the exchange interplane interaction and the magnetic anisotropy, are found. A sudden change in the spectrum of the local internal modes is observed at a critical value of this parameter, λ = λ_b = 3/4, where the domain wall shifts from a collinear to a canted shape. When λ < λ_b there are one symmetric and two antisymmetric local modes, and when λ > λ_b there are two symmetric modes, one antisymmetric mode, and one shear mode. For discreteness parameters close to the critical value, the frequencies of some of the local modes lie deep inside the gap of the linear AFM magnon spectrum and can be observed experimentally. © 2010 American Institute of Physics.

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this study was to test Lotka's law of scientific publication productivity, using the methodology outlined by Pao (1985), in the field of Library and Information Studies (LIS). Lotka's law has been sporadically tested in the field over the past 30+ years, but the results of these studies are inconclusive due to the varying methods employed by the researchers. A data set of 1,856 citations found using the ISI Web of Knowledge databases was studied. The values of n and c were calculated to be 2.1 and 0.6418 (64.18%), respectively. The Kolmogorov-Smirnov (K-S) one-sample goodness-of-fit test was conducted at the 0.10 level of significance. The Dmax value is 0.022758 and the calculated critical value is 0.026562. It was determined that the null hypothesis, stating that there is no difference between the observed distribution of publications and the distribution obtained using Lotka's and Pao's procedure, could not be rejected. This study finds that literature in the field of Library and Information Studies does conform to Lotka's law with reliable results. As a result, Lotka's law can be used in LIS as a standardized means of measuring author publication productivity, which will lead to findings that are comparable on many levels (e.g., department, institution, national). Lotka's law can be employed as an empirically proven analytical tool to establish publication productivity benchmarks for faculty and faculty librarians. Recommendations for further study include (a) exploring the characteristics of the high and low producers; (b) finding a way to successfully account for collaborative contributions in the formula; and (c) a detailed study of institutional policies concerning publication productivity and its impact on the appointment, tenure, and promotion process of academic librarians.
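For readers unfamiliar with the Kolmogorov-Smirnov step of Pao's procedure, the sketch below shows the comparison in outline, using the reported parameters n = 2.1 and c = 0.6418. The author-productivity counts are hypothetical placeholders, and the 0.10-level critical value is taken here as the standard asymptotic 1.22/√N, which may differ slightly from the value prescribed by Pao's procedure.

```python
import numpy as np

# Hypothetical counts: authors[i] authors each wrote papers[i] papers.
papers  = np.arange(1, 9)
authors = np.array([200, 48, 20, 11, 7, 5, 4, 3])
N = authors.sum()

# Theoretical Lotka proportions f(x) = c / x**n with the reported parameters.
n_exp, c = 2.1, 0.6418
expected = c / papers.astype(float) ** n_exp

# K-S statistic: largest gap between observed and theoretical cumulative curves.
d_max = np.max(np.abs(np.cumsum(authors / N) - np.cumsum(expected)))

# Asymptotic 0.10-level critical value, assumed here to be 1.22 / sqrt(N).
critical = 1.22 / np.sqrt(N)
print(f"Dmax = {d_max:.4f}, critical value = {critical:.4f}")
print("Lotka's law retained" if d_max < critical else "Lotka's law rejected")
```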

Relevance:

60.00%

Publisher:

Abstract:

In 1949, P. W. Forsbergh Jr. reported spontaneous spatial ordering in the birefringence patterns seen in flux-grown BaTiO3 crystals [1], under the transmission polarized light microscope [2]. Stunningly regular square-net arrays were often only found within a finite temperature window and could be induced on both heating and cooling, suggesting genuine thermodynamic stability. At the time, Forsbergh rationalized the patterns to have resulted from the impingement of ferroelastic domains, creating a complex tessellation of variously shaped domain packets. However, evidence for the intricate microstructural arrangement proposed by Forsbergh has never been found. Moreover, no robust thermodynamic argument has been presented to explain the region of thermal stability, its occurrence just below the Curie Temperature and the apparent increase in entropy associated with the loss of the Forsbergh pattern on cooling. As a result, despite decades of research on ferroelectrics, this ordering phenomenon and its thermodynamic origin have remained a mystery. In this paper, we re-examine the microstructure of flux-grown BaTiO3 crystals, which show Forsbergh birefringence patterns. Given an absence of any obvious arrays of domain polyhedra, or even regular shapes of domain packets, we suggest an alternative origin for the Forsbergh pattern, in which sheets of orthogonally oriented ferroelastic stripe domains simply overlay one another. We show explicitly that the Forsbergh birefringence pattern occurs if the periodicity of the stripe domains is above a critical value. Moreover, by considering well-established semiempirical models, we show that the significant domain coarsening needed to generate the Forsbergh birefringence is fully expected in a finite window below the Curie Temperature. We hence present a much more straightforward rationalization of the Forsbergh pattern than that originally proposed, in which exotic thermodynamic arguments are unnecessary.

Relevance:

60.00%

Publisher:

Abstract:

Understanding the effect of electric fields on the physical and chemical properties of two-dimensional (2D) nanostructures is instrumental in the design of novel electronic and optoelectronic devices. Several of those properties are characterized in terms of the dielectric constant, which plays an important role in capacitance, conductivity, screening, dielectric losses, and refractive index. Here we review our recent theoretical studies, using density functional calculations including van der Waals interactions, on two types of layered materials of similar two-dimensional molecular geometry but remarkably different electronic structures, namely graphene and molybdenum disulphide (MoS2). We focus on such two-dimensional crystals because of their complementary physical and chemical properties and the appealing prospect of incorporating them in the next generation of electronic and optoelectronic devices. We predict that the effective dielectric constant (ε) of few-layer graphene and MoS2 is tunable by external electric fields (E_ext). We show that at low fields (E_ext < 0.01 V/Å) ε assumes a nearly constant value of ∼4 for both materials, but increases at higher fields to values that depend on the layer thickness. The thicker the structure, the stronger the modulation of ε with the electric field. Increasing the external field perpendicular to the layer surface above a critical value can drive the systems into an unstable state where the layers are weakly coupled and can easily be separated. The observed dependence of ε on the external field is due to charge polarization driven by the bias, which shows several similar characteristics regardless of the layer considered. All these results provide key information about the control and understanding of screening properties in two-dimensional crystals beyond graphene and MoS2.
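A common post-processing estimate of the out-of-plane effective dielectric constant of a slab, which may differ from the authors' exact procedure, is ε_eff ≈ E_ext/E_in, with E_in obtained from the slope of the planar-averaged electrostatic potential inside the slab under the applied field. The potential profile below is synthetic and purely illustrative.

```python
import numpy as np

def effective_dielectric(z, v_planar, z_in, z_out, e_ext):
    """Estimate the out-of-plane effective dielectric constant of a slab as
    eps_eff = E_ext / E_in, where E_in is the average internal field obtained
    from the slope of the planar-averaged potential v_planar(z) (in V)
    between z_in and z_out (in Angstrom)."""
    mask = (z >= z_in) & (z <= z_out)
    slope, _ = np.polyfit(z[mask], v_planar[mask], 1)  # dV/dz in V/Angstrom
    e_in = -slope                                      # E = -dV/dz
    return e_ext / e_in

# Illustrative profile: a slab (10 < z < 20) that screens a 0.10 V/Angstrom
# external field down to 0.025 V/Angstrom internally, giving eps_eff ~ 4.
z = np.linspace(0.0, 30.0, 301)
e_ext, e_in_true = 0.10, 0.025
v = np.where((z > 10) & (z < 20),
             -e_in_true * (z - 10) - e_ext * 10,
             np.where(z <= 10,
                      -e_ext * z,
                      -e_in_true * 10 - e_ext * 10 - e_ext * (z - 20)))
print(f"eps_eff ~ {effective_dielectric(z, v, 11.0, 19.0, e_ext):.2f}")
```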