99 results for SQL query equivalence

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
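The zero-truncated Poisson setting above admits a compact numerical illustration. The sketch below (plain Python with a simple fixed-point iteration, not the paper's NPMLE machinery; function names are illustrative) fits a single zero-truncated Poisson to observed capture counts and forms the Horvitz-Thompson population-size estimate.

```python
import math

def fit_ztp(counts):
    """MLE of lambda for a zero-truncated Poisson.

    The truncated mean satisfies lambda / (1 - exp(-lambda)) = xbar,
    solved here by fixed-point iteration lambda <- xbar * (1 - exp(-lambda)).
    """
    xbar = sum(counts) / len(counts)
    lam = xbar  # starting value; iteration is a contraction for xbar > 1
    for _ in range(200):
        lam = xbar * (1.0 - math.exp(-lam))
    return lam

def horvitz_thompson_size(counts):
    """Population-size estimate N_hat = n / P(observed) = n / (1 - exp(-lambda))."""
    lam = fit_ztp(counts)
    n = len(counts)  # number of observed (i.e. captured at least once) units
    return n / (1.0 - math.exp(-lam))
```

For example, with observed counts of mean 2 the fitted lambda is about 1.59, and the estimated population size inflates the observed sample by the estimated probability of ever being captured.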

Relevance:

20.00%

Publisher:

Abstract:

The need for consistent assimilation of satellite measurements for numerical weather prediction led operational meteorological centers to assimilate satellite radiances directly using variational data assimilation systems. More recently there has been a renewed interest in assimilating satellite retrievals (e.g., to avoid the use of relatively complicated radiative transfer models as observation operators for data assimilation). The aim of this paper is to provide a rigorous and comprehensive discussion of the conditions for the equivalence between radiance and retrieval assimilation. It is shown that two requirements need to be satisfied for the equivalence: (i) the radiance observation operator needs to be approximately linear in a region of the state space centered at the retrieval and with a radius of the order of the retrieval error; and (ii) any prior information used to constrain the retrieval should not underrepresent the variability of the state, so as to retain the information content of the measurements. Both these requirements can be tested in practice. When these requirements are met, retrievals can be transformed so as to represent only the portion of the state that is well constrained by the original radiance measurements and can be assimilated in a consistent and optimal way, by means of an appropriate observation operator and a unit matrix as error covariance. Finally, specific cases when retrieval assimilation can be more advantageous (e.g., when the estimate sought by the operational assimilation system depends on the first guess) are discussed.
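Requirement (i), approximate linearity of the radiance observation operator near the retrieval, can be tested in practice. A minimal one-dimensional sketch of such a test follows; the operator H, the perturbation size, and the midpoint criterion are illustrative assumptions, not the paper's procedure.

```python
def linearity_defect(H, x_r, dx):
    """Relative departure of H from linearity over a step dx around x_r.

    For an exactly linear H, H(x_r + dx/2) lies midway between
    H(x_r) and H(x_r + dx); the defect measures the deviation from that.
    """
    h0 = H(x_r)
    h1 = H(x_r + dx)
    h_half = H(x_r + 0.5 * dx)
    mid = 0.5 * (h0 + h1)
    return abs(h_half - mid) / (abs(h1 - h0) + 1e-30)
```

In the assimilation context, dx would be scaled to the order of the retrieval error, so the test probes linearity over exactly the region the equivalence argument requires.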

Relevance:

20.00%

Publisher:

Abstract:

We prove the equivalence of three weak formulations of the steady water waves equations, namely: the velocity formulation, the stream function formulation and the Dubreil-Jacotin formulation, under weak Hölder regularity assumptions on their solutions.

Relevance:

20.00%

Publisher:

Abstract:

Two sources of bias arise in conventional loss predictions in the wake of natural disasters. One source of bias stems from the neglect of animal genetic resource loss in the accounting. A second stems from the failure to identify, in addition to the direct effects of such loss, the indirect effects arising from its impact on animal-human interactions. We argue that, in some contexts, the magnitude of bias introduced by neglecting animal genetic resource stocks is substantial. We show, in addition, and contrary to popular belief, that the biases attributable to losses in distinct genetic resource stocks are very likely to be the same. We derive the formal equivalence across the distinct resource stocks by deriving an envelope result in a model that forms the mainstay of enquiry in subsistence farming, and we validate the theory empirically in a World-Society-for-the-Protection-of-Animals application.

Relevance:

20.00%

Publisher:

Abstract:

In this brief note we prove orbifold equivalence between two potentials described by strangely dual exceptional unimodular singularities of type K14 and Q10 in two different ways. The matrix factorizations proving the orbifold equivalence give rise to equations whose solutions are permuted by Galois groups which differ for different expressions of the same singularity.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we consider the 2D Dirichlet boundary value problem for Laplace’s equation in a non-locally perturbed half-plane, with data in the space of bounded and continuous functions. We show uniqueness of solution, using standard Phragmén-Lindelöf arguments. The main result is to propose a boundary integral equation formulation, to prove equivalence with the boundary value problem, and to show that the integral equation is well posed by applying a recent partial generalisation of the Fredholm alternative in Arens et al. [J. Int. Equ. Appl. 15 (2003) pp. 1-35]. This then leads to an existence proof for the boundary value problem. Keywords: boundary integral equation method, water waves, Laplace’s

Relevance:

10.00%

Publisher:

Abstract:

Most parameterizations for precipitating convection in use today are bulk schemes, in which an ensemble of cumulus elements with different properties is modelled as a single, representative entraining-detraining plume. We review the underpinning mathematical model for such parameterizations, in particular by comparing it with spectral models in which elements are not combined into the representative plume. The chief merit of a bulk model is that the representative plume can be described by an equation set with the same structure as that which describes each element in a spectral model. The equivalence relies on an ansatz for detrained condensate introduced by Yanai et al. (1973) and on a simplified microphysics. There are also conceptual differences in the closure of bulk and spectral parameterizations. In particular, we show that the convective quasi-equilibrium closure of Arakawa and Schubert (1974) for spectral parameterizations cannot be carried over to a bulk parameterization in a straightforward way. Quasi-equilibrium of the cloud work function assumes a timescale separation between a slow forcing process and a rapid convective response. But, for the natural bulk analogue to the cloud work function (the dilute CAPE), the relevant forcing is characterised by a different timescale, and so its quasi-equilibrium entails a different physical constraint. Closures of bulk parameterization that use the non-entraining parcel value of CAPE do not suffer from this timescale issue. However, the Yanai et al. (1973) ansatz must be invoked as a necessary ingredient of those closures.
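The non-entraining parcel value of CAPE mentioned in the closure discussion can be sketched as a discrete buoyancy integral over height. The layer discretisation and the toy temperature profiles below are illustrative assumptions, not the parameterizations under review.

```python
G = 9.81  # gravitational acceleration, m s^-2

def cape(z, t_parcel, t_env):
    """CAPE approximated as sum over layers of g * (Tp - Te)/Te * dz,
    counting only layers where the parcel is positively buoyant.

    z        : layer heights in m (ascending)
    t_parcel : non-entraining parcel temperatures, K
    t_env    : environment temperatures, K
    """
    total = 0.0
    for k in range(len(z) - 1):
        dz = z[k + 1] - z[k]
        buoyancy = G * (t_parcel[k] - t_env[k]) / t_env[k]
        if buoyancy > 0.0:
            total += buoyancy * dz
    return total  # J kg^-1
```

A dilute-CAPE analogue would replace `t_parcel` with the temperature of an entraining parcel, which is where the different forcing timescale discussed above enters.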

Relevance:

10.00%

Publisher:

Abstract:

In this paper we present an architecture for network and applications management, which is based on the Active Networks paradigm and shows the advantages of network programmability. The stimulus to develop this architecture arises from an actual need to manage a cluster of active nodes, where it is often required to redeploy network assets and modify node connectivity. In our architecture, a remote front-end of the managing entity allows the operator to design new network topologies, to check the status of the nodes and to configure them. Moreover, the proposed framework allows the operator to explore an active network, to monitor the active applications, to query each node and to install programmable traps. In order to take advantage of Active Networks technology, we introduce active SNMP-like MIBs and agents, which are dynamic and programmable. The programmable management agents make tracing distributed applications a feasible task. We propose a general framework that can inter-operate with any active execution environment. In this framework, both the manager and the monitor front-ends communicate with an active node (the Active Network Access Point) through the XML language. A gateway service performs the translation of the queries from XML to an active packet language and injects the code into the network. We demonstrate the implementation of an active network gateway for PLAN (Packet Language for Active Networks) in a testbed of forty active nodes. Finally, we discuss an application of the active management architecture to detect the causes of network failures by tracing network events in time.
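The gateway's XML-to-active-packet translation step can be sketched roughly as follows. The XML schema (a `query` element with a `node` attribute and a `mib` child) and the PLAN-style call syntax are illustrative assumptions, not the framework's actual interface.

```python
import xml.etree.ElementTree as ET

def xml_to_plan(xml_query):
    """Translate a manager-side XML query into a PLAN-style invocation string.

    The resulting string stands in for the active packet the gateway
    would inject toward the target node.
    """
    root = ET.fromstring(xml_query)
    node = root.get("node")        # target active node
    var = root.findtext("mib")     # MIB variable to read
    # Illustrative PLAN-like remote evaluation of a MIB getter on the node.
    return f"OnRemote(|getMIB|, [{var!r}], getHostByName({node!r}), defaultRB)"
```

A query such as `<query node="node7"><mib>ifInOctets</mib></query>` would thus be rewritten into a remote-evaluation packet addressed to `node7`.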

Relevance:

10.00%

Publisher:

Abstract:

Mainframes, corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum', namely to convert raw data into information and then into high-grade knowledge. Its response to this, and to the underlying search problems, was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.

Relevance:

10.00%

Publisher:

Abstract:

In general, ranking entities (resources) on the Semantic Web (SW) is subject to importance, relevance, and query length. Few existing SW search systems cover all of these aspects. Moreover, many existing efforts simply reuse technologies from conventional Information Retrieval (IR), which are not designed for SW data. This paper proposes a ranking mechanism which covers all three categories of ranking and is tailored to SW data.
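A minimal sketch of a score combining the three aspects named above (importance, relevance, query length) might look like the following; the weights and the combination rule are illustrative assumptions, not the mechanism proposed in the paper.

```python
def rank_score(importance, relevance, query_terms_matched, query_len,
               w_imp=0.4, w_rel=0.6):
    """Combine a global importance score (e.g. link-based), a query
    relevance score, and a query-length normalisation into one rank.

    `coverage` scales the score by the fraction of query terms the
    entity matches, so longer queries demand broader matches.
    """
    coverage = query_terms_matched / query_len if query_len else 0.0
    return (w_imp * importance + w_rel * relevance) * coverage
```

For instance, an entity with importance 0.5 and relevance 0.8 matching two of four query terms scores 0.34 under these illustrative weights.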

Relevance:

10.00%

Publisher:

Abstract:

The double triangular test was introduced twenty years ago, and the purpose of this paper is to review applications that have been made since then. In fact, take-up of the method was rather slow until the late 1990s, but in recent years several clinical trial reports have been published describing its use in a wide range of therapeutic areas. The core of this paper is a detailed account of five trials that have been published since 2000 in which the method was applied to studies of pancreatic cancer, breast cancer, myocardial infarction, epilepsy and bedsores. Before those accounts are given, the method is described and the history behind its evolution is presented. The future potential of the method for sequential case-control and equivalence trials is also discussed. Copyright © 2004 John Wiley & Sons, Ltd.

Relevance:

10.00%

Publisher:

Abstract:

The emergence in 2009 of a swine-origin H1N1 influenza virus as the first pandemic of the 21st Century is a timely reminder of the international public health impact of influenza viruses, even those associated with mild disease. The widespread distribution of highly pathogenic H5N1 influenza virus in the avian population has spawned concern that it may give rise to a human influenza pandemic. The mortality rate associated with occasional human infection by H5N1 virus approximates 60%, suggesting that an H5N1 pandemic would be devastating to global health and economy. To date, the H5N1 virus has not acquired the propensity to transmit efficiently between humans. The reasons behind this are unclear, especially given the high mutation rate associated with influenza virus replication. Here we used a panel of recombinant H5 hemagglutinin (HA) variants to demonstrate the potential for H5 HA to bind human airway epithelium, the predominant target tissue for influenza virus infection and spread. While parental H5 HA exhibited limited binding to human tracheal epithelium, introduction of selected mutations converted the binding profile to that of a current human influenza strain HA. Strikingly, these amino-acid changes required multiple simultaneous mutations in the genomes of naturally occurring H5 isolates. Moreover, H5 HAs bearing intermediate sequences failed to bind airway tissues and likely represent mutations that are an evolutionary "dead end." We conclude that, although genetic changes that adapt H5 to human airways can be demonstrated, they may not readily arise during natural virus replication. This genetic barrier limits the likelihood that current H5 viruses will originate a human pandemic.

Relevance:

10.00%

Publisher:

Abstract:

The mechanism of action and properties of a solid-phase ligand library made of hexapeptides (combinatorial peptide ligand libraries, or CPLL) for capturing the "hidden proteome", i.e. the low- and very low-abundance proteins constituting the vast majority of species in any proteome, as applied to plant tissues, are reviewed here. Plant tissues are notoriously recalcitrant to protein extraction and to proteome analysis. Firstly, rigid plant cell walls need to be mechanically disrupted to release the cell content; in addition to their poor protein yield, plant tissues are rich in proteases and oxidative enzymes, and contain phenolic compounds, starches, oils, pigments and secondary metabolites that massively contaminate protein extracts. In addition, complex matrices of polysaccharides, including large amounts of anionic pectins, are present. All these species compete with the binding of proteins to the CPLL beads, impeding proper capture and identification/detection of low-abundance species. When properly pre-treated, plant tissue extracts are amenable to capture by the CPLL beads, thus revealing many new species, among them low-abundance proteins. Examples are given on the treatment of leaf proteins, of corn seed extracts and of exudate proteins (latex from Hevea brasiliensis). In all cases, the detection of unique gene products via CPLL capture is at least twice that of the untreated control sample.