952 results for Analytic number theory


Relevance:

30.00%

Publisher:

Abstract:

Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web underpinned by a fuzzy grassroots ontology can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches in that it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural-language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, through their understanding of natural language, Web agents can react to humans more intuitively and thus generate and process information.

Relevance:

30.00%

Publisher:

Abstract:

Historical information is always relevant for clinical trial design and, if incorporated in the analysis of a new trial, historical data allow the number of subjects to be reduced. This decreases costs and trial duration, facilitates recruitment, and may be more ethical. Yet, under prior-data conflict, an overly optimistic use of historical data may be inappropriate. We address this challenge by deriving a Bayesian meta-analytic-predictive prior from historical data, which is then combined with the new data. This prospective approach is equivalent to a meta-analytic-combined analysis of historical and new data if parameters are exchangeable across trials. The prospective Bayesian version requires a good approximation of the meta-analytic-predictive prior, which is not available analytically. We propose two- or three-component mixtures of standard priors, which allow for good approximations and, for the one-parameter exponential family, straightforward posterior calculations. Moreover, since one of the mixture components is usually vague, mixture priors will often be heavy-tailed and therefore robust. Further robustness and a more rapid reaction to prior-data conflicts can be achieved by adding an extra weakly informative mixture component. Use of historical prior information is particularly attractive for adaptive trials, as the randomization ratio can then be changed in case of prior-data conflict. Both frequentist operating characteristics and posterior summaries for various data scenarios show that these designs have desirable properties. We illustrate the methodology for a phase II proof-of-concept trial with historical controls from four studies. Robust meta-analytic-predictive priors alleviate prior-data conflicts; they should encourage better and more frequent use of historical data in clinical trials.
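The mixture-prior machinery described above can be illustrated in a minimal Beta-Binomial setting. The sketch below is illustrative only (the priors, weights, and data are invented, not the trial's): a two-component mixture of an informative "historical" Beta prior and a vague Beta prior stays conjugate under Binomial data, with only the mixture weights updated by each component's marginal likelihood.

```python
from math import exp, lgamma

def log_betafn(a, b):
    # log of the Beta function via log-gamma
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def posterior_mixture(components, weights, successes, failures):
    """Update a mixture of Beta priors with Binomial data.

    Each component (a, b) stays conjugate; only the mixture weights
    change, in proportion to each component's marginal likelihood."""
    log_marg, post = [], []
    for a, b in components:
        post.append((a + successes, b + failures))
        # Beta-Binomial marginal likelihood (common binomial coefficient omitted)
        log_marg.append(log_betafn(a + successes, b + failures) - log_betafn(a, b))
    top = max(log_marg)  # stabilize the exponentials
    unnorm = [w * exp(lm - top) for w, lm in zip(weights, log_marg)]
    z = sum(unnorm)
    return post, [u / z for u in unnorm]

# Informative component from (hypothetical) historical data plus a vague component
prior = [(15.0, 35.0), (1.0, 1.0)]   # Beta(15,35) "historical", Beta(1,1) vague
w0 = [0.9, 0.1]

# New data consistent with history: weight stays on the informative component
post, w = posterior_mixture(prior, w0, successes=9, failures=21)

# Conflicting data: the vague component absorbs the surprise
post_c, w_c = posterior_mixture(prior, w0, successes=25, failures=5)
```

Under prior-data conflict the vague component's posterior weight grows, which is exactly what makes the mixture prior heavy-tailed and robust.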

Relevance:

30.00%

Publisher:

Abstract:

The rates for lepton number washout in extensions of the Standard Model containing right-handed neutrinos are key ingredients in scenarios for baryogenesis through leptogenesis. We relate these rates to real-time correlation functions at finite temperature, without making use of any particle approximations. The relations are valid to quadratic order in neutrino Yukawa couplings and to all orders in Standard Model couplings. They take into account all spectator processes, and apply both in the symmetric and in the Higgs phase of the electroweak theory. We use the relations to compute washout rates at next-to-leading order in g, where g denotes a Standard Model gauge or Yukawa coupling, both in the non-relativistic and in the relativistic regime. Even in the non-relativistic regime the parametrically dominant radiative corrections are only suppressed by a single power of g. In the non-relativistic regime radiative corrections increase the washout rate by a few percent at high temperatures, but they are of order unity around the weak scale and in the relativistic regime.

Relevance:

30.00%

Publisher:

Abstract:

The phenomenon of Christian–Muslim dialogue has had a very chequered history. At varying times, three broad modes of engagement can be said to have operated: antipathy, affinity and inquiry, and these three modes can still be found in the world today. In some places, hostility and antipathy abound. In others, voices and actions express cordial friendship, détente and affinity. In this latter climate, the prospect of engagement in mutual inquiry and cooperative ventures is not only theoretically possible, but actively pursued, and in the first decade of the twenty-first century a number of notable initiatives in the arena of mutual inquiry have taken place. This article addresses aspects of the context and development of Christian–Muslim dialogue as a modern phenomenon, and then turns to a review of three twenty-first-century developments: the Building Bridges seminar series; the Stuttgart-based Christian–Muslim Theological Forum; and the "Common Word" letter. It also reflects on the models and theology of dialogue, including not only theology for dialogue, but also theology in and, importantly, after dialogue.

Relevance:

30.00%

Publisher:

Abstract:

We propose a way to incorporate non-tariff barriers (NTBs) for the four workhorse models of the modern trade literature into computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton-Kortum model can each be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is already known in the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the amount of factor input bundles. In the Melitz model generalized marginal costs are also a function of the price of the factor input bundles: lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
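The Armington building block the abstract starts from can be sketched with textbook CES expenditure shares; the function below is the generic Armington share formula with hypothetical prices, trade costs and elasticity, not the paper's calibrated model.

```python
def armington_shares(prices, tau, sigma):
    """CES Armington expenditure shares across origins:
    share_i is proportional to (p_i * tau_i)**(1 - sigma)."""
    x = [(p * t) ** (1.0 - sigma) for p, t in zip(prices, tau)]
    z = sum(x)
    return [v / z for v in x]

# Three symmetric origins with iceberg trade cost 1.2 and elasticity 5
base = armington_shares([1.0, 1.0, 1.0], [1.2, 1.2, 1.2], sigma=5.0)

# A 10% trade-cost cut by origin 0 raises its share (sigma > 1)
cut = armington_shares([1.0, 1.0, 1.0], [1.08, 1.2, 1.2], sigma=5.0)
```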

Relevance:

30.00%

Publisher:

Abstract:

It is important to check unidimensionality, the fundamental assumption of the most popular Item Response Theory models. However, it is rare for educational and psychological tests to be strictly unidimensional. The tests studied in this paper are from a standardized high-stakes testing program; they exhibit potential multidimensionality through their various item types and item sets. Confirmatory factor analyses with one-factor and bifactor models were conducted, based on both the linear structural equation modeling approach and the nonlinear IRT approach. The competing models were compared, and the implications of the bifactor model for checking essential unidimensionality were discussed.

Relevance:

30.00%

Publisher:

Abstract:

The primary interest was in predicting the distribution of runs in a sequence of Bernoulli trials. Difference-equation techniques were used to express the number of runs of a given length k in n trials under three assumptions: (1) no runs of length greater than k, (2) no runs of length less than k, (3) no other assumptions about the length of runs. Generating functions were utilized to obtain the distributions of the future number of runs, the future number of minimum run lengths and the future number of maximum run lengths, unconditional on the number of successes and failures in the Bernoulli sequence. When applied to Texas hydrology data, the model provided an adequate fit in eight of the ten regions. Suggested health applications of this approach to run theory are provided.
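As a small concrete check on the run-counting quantities described above, the exact distribution of the total number of runs in n Bernoulli(p) trials can be enumerated directly for small n. This is brute-force enumeration, not the thesis's difference-equation or generating-function machinery:

```python
from itertools import product

def runs_distribution(n, p):
    """Exact distribution of the total number of runs in n Bernoulli(p)
    trials, by enumerating all 2**n outcome sequences (small n only)."""
    dist = {}
    for seq in product([0, 1], repeat=n):
        prob = 1.0
        for x in seq:
            prob *= p if x == 1 else 1.0 - p
        # a run starts at position 0 and wherever the outcome changes
        runs = 1 + sum(seq[i] != seq[i - 1] for i in range(1, n))
        dist[runs] = dist.get(runs, 0.0) + prob
    return dist

dist = runs_distribution(5, 0.5)
# dist[1] is the probability that all five trials agree: 2 * 0.5**5
```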

Relevance:

30.00%

Publisher:

Abstract:

We experimentally and numerically investigated the generation of plumes from a local heat source (LHS) and studied the interaction of these plumes with cellular convective motion (CCM) in a rectangular cavity filled with silicone oil at a Prandtl number (Pr) of approximately two thousand. The LHS is generated using a 0.2-W green laser beam. A roll-type CCM is generated by vertically heating one side of the cavity. The CCM may lead to the formation of an unusual spiral convective plume that resembles a vertical Archimedes spiral. A similar plume is obtained in a direct numerical simulation. We discuss the physical mechanism behind the formation of a spiral plume and the application of the results to mantle convection problems. We also estimate the Reynolds (Re) and Rayleigh (Ra) numbers and apply self-similarity theory to convection in the Earth's mantle. Spiral plumes can be used to interpret mantle tomography results from the last decade.
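The Rayleigh- and Prandtl-number estimates mentioned above use the standard definitions Ra = g·α·ΔT·L³/(ν·κ) and Pr = ν/κ. The parameter values below are illustrative laboratory-scale assumptions, not the paper's measured values:

```python
def rayleigh(g, alpha, dT, L, nu, kappa):
    """Classical Rayleigh number: buoyancy forcing over diffusive damping."""
    return g * alpha * dT * L ** 3 / (nu * kappa)

# Illustrative values only: a silicone-oil cell a few centimetres tall,
# thermal expansivity ~9e-4 1/K, kinematic viscosity 1e-4 m^2/s,
# thermal diffusivity 1e-7 m^2/s.
nu, kappa = 1e-4, 1e-7
Ra = rayleigh(g=9.81, alpha=9e-4, dT=10.0, L=0.05, nu=nu, kappa=kappa)
Pr = nu / kappa   # of order a thousand for silicone oils
```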

Relevance:

30.00%

Publisher:

Abstract:

The ecological theory of adaptive radiation predicts that the evolution of phenotypic diversity within species is generated by divergent natural selection arising from different environments and competition between species. Genetic connectivity among populations is likely also to have an important role in both the origin and maintenance of adaptive genetic diversity. Our goal was to evaluate the potential roles of genetic connectivity and natural selection in the maintenance of adaptive phenotypic differences among morphs of Arctic charr, Salvelinus alpinus, in Iceland. At a large spatial scale, we tested the predictive power of geographic structure and phenotypic variation for patterns of neutral genetic variation among populations throughout Iceland. At a smaller scale, we evaluated the genetic differentiation between two morphs in Lake Thingvallavatn relative to historically explicit, coalescent-based null models of the evolutionary history of these lineages. At the large spatial scale, populations are highly differentiated, but weakly structured, both geographically and with respect to patterns of phenotypic variation. At the intralacustrine scale, we observe modest genetic differentiation between two morphs, but this level of differentiation is nonetheless consistent with strong reproductive isolation throughout the Holocene. Rather than a result of the homogenizing effect of gene flow in a system at migration-drift equilibrium, the modest level of genetic differentiation could equally be a result of slow neutral divergence by drift in large populations. We conclude that contemporary and recent patterns of restricted gene flow have been highly conducive to the evolution and maintenance of adaptive genetic variation in Icelandic Arctic charr.

Relevance:

30.00%

Publisher:

Abstract:

The classical Kramer sampling theorem provides a method for obtaining orthogonal sampling formulas. In particular, when the involved kernel is analytic in the sampling parameter it can be stated in an abstract setting of reproducing kernel Hilbert spaces of entire functions which includes as a particular case the classical Shannon sampling theory. This abstract setting allows us to obtain a sort of converse result and to characterize when the sampling formula associated with an analytic Kramer kernel can be expressed as a Lagrange-type interpolation series. On the other hand, the de Branges spaces of entire functions satisfy orthogonal sampling formulas which can be written as Lagrange-type interpolation series. In this work some links between all these ideas are established.
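The classical Shannon case referred to above can be sketched directly: the cardinal (sinc) series is the Lagrange-type interpolation series of the bandlimited setting. Below is a truncated version with an invented test signal; with sampling period T, a signal bandlimited to [-1/(2T), 1/(2T)] is recovered exactly from all its samples f(nT).

```python
import math

def sinc(x):
    # normalized sinc: sin(pi*x)/(pi*x), with sinc(0) = 1
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def shannon_reconstruct(samples, t, T=1.0):
    """Truncated Shannon cardinal series:
    f(t) ~ sum_n f(nT) * sinc((t - nT)/T)."""
    return sum(f_n * sinc((t - n * T) / T) for n, f_n in samples)

# sinc(t) itself is bandlimited and has a single nonzero sample, at n = 0,
# so its reconstruction from that one sample is exact.
exact = shannon_reconstruct([(0, 1.0)], t=0.4)

# A bandlimited sinusoid (0.15 Hz, below the Nyquist rate 0.5 Hz),
# reconstructed approximately from 401 samples.
f = lambda t: math.sin(0.3 * math.pi * t)
samples = [(n, f(n)) for n in range(-200, 201)]
approx = shannon_reconstruct(samples, t=0.4)
```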

Relevance:

30.00%

Publisher:

Abstract:

Within the framework of the Collaborative Project for a European Sodium Fast Reactor, the reactor physics group at UPM is working on the extension of its in-house multi-scale advanced deterministic code COBAYA3 to Sodium Fast Reactors (SFR). COBAYA3 is a 3D multigroup neutron kinetics diffusion code that can be used either as a pin-by-pin code or as a stand-alone nodal code through the analytic nodal diffusion solver ANDES. It is coupled with thermal-hydraulics codes such as COBRA-TF and FLICA, allowing transient analysis of LWRs at both fine-mesh and coarse-mesh scales. Different developments are in progress to enable 3D pin-by-pin and nodal coupled NK-TH simulations of SFRs as well. This paper presents the first steps towards the application of COBAYA3 to this type of reactor. The ANDES solver, already extended to triangular-Z geometry, has been applied to fast reactor steady-state calculations. The required cross-section libraries were generated with the ERANOS code for several configurations. The limitations encountered in applying the Analytic Coarse Mesh Finite Difference (ACMFD) method, implemented inside ANDES, to fast reactors are presented, and the sensitivity of the method when using a high number of energy groups is studied. ANDES performance is assessed by comparison with the results provided by ERANOS, using a mini-core model in 33 energy groups. Furthermore, a benchmark from the NEA for a small 3D FBR in hexagonal-Z geometry and 4 energy groups is also employed to verify the behavior of the code with few energy groups.
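A one-group, one-dimensional slab is the simplest toy version of the neutron diffusion eigenvalue problem that codes like COBAYA3/ANDES solve in 3D multigroup form. The sketch below (plain finite differences plus power iteration, with invented cross sections) only illustrates the kind of calculation, not the ACMFD method itself:

```python
import math

def slab_keff(D=1.0, sig_a=0.07, nu_sig_f=0.08, H=100.0, N=200, iters=500):
    """k-effective of a bare 1-D slab, one energy group, by power iteration:
    -D*phi'' + sig_a*phi = (1/k)*nu_sig_f*phi, with phi = 0 at both edges."""
    h = H / (N + 1)
    a = -D / h ** 2                  # off-diagonal of the FD operator
    b = 2.0 * D / h ** 2 + sig_a     # diagonal
    phi = [1.0] * N
    k = 1.0
    for _ in range(iters):
        src = [nu_sig_f * p / k for p in phi]
        # Thomas algorithm for the constant-coefficient tridiagonal solve
        cp, dp = [0.0] * N, [0.0] * N
        cp[0], dp[0] = a / b, src[0] / b
        for i in range(1, N):
            m = b - a * cp[i - 1]
            cp[i] = a / m
            dp[i] = (src[i] - a * dp[i - 1]) / m
        new = [0.0] * N
        new[-1] = dp[-1]
        for i in range(N - 2, -1, -1):
            new[i] = dp[i] - cp[i] * new[i + 1]
        # eigenvalue update: ratio of successive fission sources
        k *= sum(new) / sum(phi)
        phi = new
    return k

keff = slab_keff()
# one-group analytic benchmark: k = nu_sig_f / (sig_a + D*(pi/H)**2)
k_analytic = 0.08 / (0.07 + (math.pi / 100.0) ** 2)
```

The finite-difference result converges to the analytic buckling formula as the mesh is refined, which is the usual sanity check before moving to multigroup or multidimensional solvers.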

Relevance:

30.00%

Publisher:

Abstract:

The aim of inertial confinement fusion is the production of energy by the fusion of thermonuclear fuel (deuterium-tritium) enclosed in a spherical target that is made to implode. In the direct-drive approach, the energy needed to spark fusion reactions is delivered by irradiation with laser beams, which ablates the outer shell of the target (the so-called ablator). As a reaction to this ablation process, the target is accelerated inwards and, provided that this implosion is sufficiently strong and symmetric, the requirements of temperature and pressure at the center of the target are achieved, leading to ignition (fusion). One of the obstacles capable of preventing an adequate target implosion arises in the ablation region, where any perturbation can grow, due to the ablative Rayleigh-Taylor instability, even to the point of breaking the ablator shell. The ablative Rayleigh-Taylor instability has been extensively studied over the last 40 years for the case where the density/temperature profiles in the ablation region present a single front (the ablation front). Single ablation fronts appear when the ablator material has a low atomic number (deuterium/tritium ice, plastic). In this case, the main mechanism of energy transport from the laser energy absorption region (low-density plasma) to the ablation region is electron thermal conduction. However, the recent use of materials with a moderate atomic number (silica, doped plastic) as ablators, with the aim of reducing the target pre-heating caused by suprathermal electrons generated by the laser-plasma interaction, has revealed an ablation region composed of two ablation fronts. This arises from the increasing importance of radiative effects in the energy transport. The linear theory describing the Rayleigh-Taylor instability for single ablation fronts cannot be applied to the stability analysis of double ablation front structures.
Therefore, the aim of this thesis is to develop, for the first time, a linear stability theory for this type of hydrodynamic structures.

Relevance:

30.00%

Publisher:

Abstract:

An approximate analytic model of a shared-memory multiprocessor with a Cache Only Memory Architecture (COMA), the bus-based Data Diffusion Machine (DDM), is presented and validated. It describes the timing and interference in the system as a function of the hardware, the protocols, the topology and the workload. Model results have been compared to results from an independent simulator. The comparison shows good model accuracy, especially for non-saturated systems, where the errors in response times and device utilizations are independent of the number of processors and remain below 10% in 90% of the simulations. The model can therefore be used as an average performance prediction tool that avoids expensive simulations in the design of systems with many processors.
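A generic closed queueing sketch gives a flavor of the kind of analytic machinery such models use: exact Mean Value Analysis (MVA) for N processors contending for one shared bus. The parameters are invented and this is not the paper's DDM model, which also captures protocols and topology:

```python
def mva_bus(N, S, Z):
    """Exact Mean Value Analysis for a closed system: N processors that
    compute for an average of Z seconds, then queue for a shared bus with
    mean service time S. Returns (throughput, bus utilization, response time)."""
    Q = 0.0
    for k in range(1, N + 1):
        R = S * (1.0 + Q)   # bus response time seen by an arriving request
        X = k / (Z + R)     # system throughput (requests/second)
        Q = X * R           # mean number of requests at the bus
    return X, X * S, R

# Illustrative numbers (assumptions, not measured DDM parameters):
# 0.2 us bus transactions, 2 us of computation between them, 16 processors.
X, util, R = mva_bus(N=16, S=0.2e-6, Z=2e-6)
```

Past the saturation point N* = (Z + S)/S, bus utilization approaches one and response time grows roughly linearly in N, which is the regime where simple analytic models usually start to diverge from simulation.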

Relevance:

30.00%

Publisher:

Abstract:

A novel algorithm based on bimatrix game theory has been developed to improve the accuracy and reliability of a speaker diarization system. The algorithm fuses the output data of two open-source speaker diarization programs, LIUM and SHoUT, taking advantage of the best properties of each. The performance of this new system has been tested on audio streams from several movies. In preliminary results on fragments of five movies, improvements of 63% in false alarms and missed-speech errors have been achieved with respect to the LIUM and SHoUT systems working alone. Moreover, we also improve the number of recognized speakers by 20%, getting close to the real number of speakers in the audio stream.
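The bimatrix-game primitive underlying such a fusion scheme can be sketched as follows. The payoff matrices are invented and the routine only finds pure-strategy Nash equilibria by best-response checking; it is not the paper's actual fusion rule:

```python
def pure_nash(A, B):
    """Pure-strategy Nash equilibria of a bimatrix game.

    A[i][j] is the payoff to the row player and B[i][j] the payoff to the
    column player when row plays i and column plays j. An outcome (i, j)
    is an equilibrium when neither player can gain by deviating alone."""
    m, n = len(A), len(A[0])
    eqs = []
    for i in range(m):
        for j in range(n):
            row_best = all(A[i][j] >= A[k][j] for k in range(m))
            col_best = all(B[i][j] >= B[i][l] for l in range(n))
            if row_best and col_best:
                eqs.append((i, j))
    return eqs

# Toy "agreement" game: each player stands for one diarization system and
# both are rewarded for agreeing on a label; payoffs are hypothetical.
A = [[2, 0], [0, 1]]
B = [[2, 0], [0, 1]]
eqs = pure_nash(A, B)   # both coordinated outcomes are equilibria
```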

Relevance:

30.00%

Publisher:

Abstract:

Analytical expressions for the current to a cylindrical Langmuir probe at rest in an unmagnetized plasma are compared with results from both steady-state Vlasov and particle-in-cell simulations. Probe bias potentials much greater than the plasma temperature (assumed equal for ions and electrons), as of interest for bare conductive tethers, are considered. At very high bias, both the electric potential and the attracted-species density exhibit complex radial profiles; in particular, the density exhibits a minimum well within the plasma sheath and a maximum closer to the probe. Excellent agreement is found between analytical and numerical results, for values of the probe radius R close to the maximum radius Rmax for orbital-motion-limited (OML) collection at a particular bias, in the following profile features: the values and positions of the density minimum and maximum, the position of the sheath boundary, and the value of a radius characterizing the no-space-charge behavior of the potential near the high-bias probe. Good agreement between theory and simulations is also found for parametric laws jointly covering three characteristic R ranges: sheath radius versus probe radius and bias for Rmax; density minimum versus probe bias for Rmax; and the (weakly bias-dependent) current drop below the OML value versus probe radius for R > Rmax.
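The OML collection regime referred to above has a commonly quoted high-bias closed form for a cylindrical probe, I = (e·n_e·A_p/π)·sqrt(2·e·V/m_e) with lateral area A_p = 2πRL; this expression, and the ionospheric plasma parameters below, are assumptions for illustration, not the simulation's values:

```python
import math

E_CHARGE = 1.602176634e-19    # elementary charge [C]
M_ELECTRON = 9.1093837015e-31 # electron mass [kg]

def oml_current(n_e, R, L, V):
    """High-bias OML electron current to a cylindrical probe of radius R
    and length L at bias V >> kT/e (assumed expression, see lead-in):
    I = (e * n_e * A_p / pi) * sqrt(2 * e * V / m_e)."""
    A_p = 2.0 * math.pi * R * L
    return E_CHARGE * n_e * A_p / math.pi * math.sqrt(
        2.0 * E_CHARGE * V / M_ELECTRON)

# Illustrative ionospheric assumptions: density 1e11 m^-3, a 0.5 mm radius,
# 1 m long probe segment, biased at 100 V; the current comes out near 0.1 mA.
I = oml_current(n_e=1e11, R=5e-4, L=1.0, V=100.0)
```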