952 results for Gegenbauer’s Polynomial
Abstract:
We have used the Two-Degree Field (2dF) instrument on the Anglo-Australian Telescope (AAT) to obtain redshifts of a sample of z < 3 and 18.0 < g < 21.85 quasars selected from Sloan Digital Sky Survey (SDSS) imaging. These data are part of a larger joint programme between the SDSS and 2dF communities to obtain spectra of faint quasars and luminous red galaxies, namely the 2dF-SDSS LRG and QSO (2SLAQ) Survey. We describe the quasar selection algorithm and present the resulting number counts and luminosity function of 5645 quasars in 105.7 deg^2. The bright-end number counts and luminosity functions agree well with determinations from the 2dF QSO Redshift Survey (2QZ) data to g ~ 20.2. However, at the faint end, the 2SLAQ number counts and luminosity functions are steeper (i.e. require more faint quasars) than the final 2QZ results from Croom et al., but are consistent with the preliminary 2QZ results from Boyle et al. Using the functional form adopted for the 2QZ analysis (a double power law with pure luminosity evolution characterized by a second-order polynomial in redshift), we find a faint-end slope of beta = -1.78 +/- 0.03 if we allow all of the parameters to vary, and beta = -1.45 +/- 0.03 if we allow only the faint-end slope and normalization to vary (holding all other parameters equal to the final 2QZ values). Over the magnitude range covered by the 2SLAQ survey, our maximum-likelihood fit to the data yields 32 per cent more quasars than the final 2QZ parametrization, but is not inconsistent with other g > 21 deep surveys for quasars. The 2SLAQ data exhibit no well-defined 'break' in the number counts or luminosity function, but do clearly flatten with increasing magnitude. Finally, we find that the shape of the quasar luminosity function derived from 2SLAQ is in good agreement with that derived from Type I quasars found in hard X-ray surveys.
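For reference, the 2QZ-style functional form referred to above is the standard double power law with pure luminosity evolution; written out in the conventional notation (the symbols Phi*, M_g*, k_1, k_2 follow common usage and are not quoted in this abstract), it is:

    \Phi(M_g, z) = \frac{\Phi^*}{10^{0.4(\alpha+1)[M_g - M_g^*(z)]} + 10^{0.4(\beta+1)[M_g - M_g^*(z)]}},
    \qquad M_g^*(z) = M_g^*(0) - 2.5\,(k_1 z + k_2 z^2).

Here beta is the faint-end slope quoted above, and the second-order polynomial k_1 z + k_2 z^2 carries the pure luminosity evolution.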
Abstract:
The Wet Tropics World Heritage Area in Far North Queensland, Australia consists predominantly of tropical rainforest and wet sclerophyll forest in areas of variable relief. Previous maps of vegetation communities in the area were produced by a labor-intensive combination of field survey and air-photo interpretation. Thus, the aim of this work was to develop a new vegetation mapping method, based on imaging radar and incorporating topographic corrections, that could be repeated frequently and would reduce the need for detailed field assessments and their associated costs. The method employed a topographic correction and mapping procedure developed to enable vegetation structural classes to be mapped from satellite imaging radar. Eight JERS-1 scenes covering the Wet Tropics area for 1996 were acquired from NASDA under the auspices of the Global Rainforest Mapping Project. The JERS scenes were geometrically corrected for topographic distortion using an 80 m DEM and a combination of polynomial warping and radar viewing-geometry modeling. An image mosaic was created to cover the Wet Tropics region, and a new image-smoothing technique was applied to the JERS texture bands and the DEM before a maximum-likelihood classification was applied to identify major land-cover and vegetation communities. Despite these efforts, the dominant vegetation community classes could only be classified to low accuracy (57.5 percent), which was partly explained by the significantly larger pixel size of the DEM (80 m) in comparison to the JERS image (12.5 m). In addition, the spatial and floristic detail contained in the classes of the original validation maps was much finer than the JERS classification product was able to distinguish. In comparison to field- and aerial-photo-based approaches for mapping the vegetation of the Wet Tropics, appropriately corrected SAR data provide a more regional-scale, all-weather mapping technique for broader vegetation classes. Further work is required to establish an appropriate combination of imaging radar, elevation data, and other environmental surrogates to accurately map vegetation communities across the entire Wet Tropics.
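A minimal sketch of the per-pixel Gaussian maximum-likelihood classification step the abstract describes, assuming training samples are available for each class; the function names, array shapes, and band contents are illustrative assumptions, not from the paper:

    import numpy as np

    def ml_classify(image, class_samples):
        """Per-pixel Gaussian maximum-likelihood classification.
        image: (rows, cols, bands) array, e.g. SAR texture bands plus DEM layers.
        class_samples: dict mapping class label -> (n_samples, bands) training array.
        Returns a (rows, cols) array of class labels."""
        rows, cols, bands = image.shape
        pixels = image.reshape(-1, bands)
        labels = sorted(class_samples)
        scores = np.empty((pixels.shape[0], len(labels)))
        for k, lab in enumerate(labels):
            X = class_samples[lab]
            mu = X.mean(axis=0)
            cov = np.cov(X, rowvar=False)      # needs n_samples > bands
            inv = np.linalg.inv(cov)
            _, logdet = np.linalg.slogdet(cov)
            d = pixels - mu
            # log-likelihood up to a constant: -0.5 * (log|Sigma| + d' Sigma^-1 d)
            maha = np.einsum('ij,jk,ik->i', d, inv, d)
            scores[:, k] = -0.5 * (logdet + maha)
        return np.array(labels)[np.argmax(scores, axis=1)].reshape(rows, cols)

Each pixel is assigned to the class whose fitted multivariate Gaussian gives it the highest likelihood, which is the classic remote-sensing maximum-likelihood rule.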
Abstract:
The Cunningham project seeks to factor numbers of the form b^n ± 1 with b = 2, 3, ... small. One of the most useful techniques is Aurifeuillian factorization, whereby such a number is partially factored by replacing b^n by a polynomial in such a way that polynomial factorization is possible. For example, by substituting y = 2^k into the polynomial factorization (2y^2)^2 + 1 = (2y^2 - 2y + 1)(2y^2 + 2y + 1), we can partially factor 2^(4k+2) + 1. In 1962 Schinzel gave a list of such identities that have proved useful in the Cunningham project; we believe that Schinzel identified all numbers that can be factored by such identities, and we prove this if one accepts our definition of what "such an identity" is. We then develop our theme to similarly factor f(b^n) for any given polynomial f, using deep results of Faltings from algebraic geometry and Fried from the classification of finite simple groups.
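A quick numerical check of the quoted identity, as a Python sketch (the range of k is arbitrary):

    # Substituting y = 2^k into (2y^2)^2 + 1 = (2y^2 - 2y + 1)(2y^2 + 2y + 1)
    # gives 2^(4k+2) + 1, since (2y^2)^2 = (2^(2k+1))^2 = 2^(4k+2).
    for k in range(1, 11):
        y = 2**k
        lhs = 2**(4*k + 2) + 1
        rhs = (2*y*y - 2*y + 1) * (2*y*y + 2*y + 1)
        assert lhs == rhs, (k, lhs, rhs)
    print("identity verified for k = 1..10")

Each of the two factors can then be attacked separately, which is the point of the Aurifeuillian technique.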
Abstract:
Standard factorial designs may sometimes be inadequate for experiments that aim to estimate a generalized linear model, for example one describing a binary response in terms of several variables. A method is proposed for finding exact designs for such experiments, using a criterion that allows for uncertainty in the link function, the linear predictor, or the model parameters, together with a design search. Designs are assessed and compared by simulating the distribution of efficiencies relative to locally optimal designs over a space of possible models. Exact designs are investigated for two applications, and their advantages over factorial and central composite designs are demonstrated.
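To make the flavor of such a criterion concrete, here is a minimal sketch (not the authors' method) of scoring an exact design for a logistic model by its average log D-criterion under draws from a parameter prior, which is one simple way to allow for parameter uncertainty; the design matrix, prior, and function names are illustrative assumptions:

    import numpy as np

    def logistic_info(X, beta):
        """Fisher information of an exact design for a logistic GLM.
        X: (n, p) model matrix, one row per run; beta: (p,) parameters."""
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        w = p * (1.0 - p)                    # GLM weights for the logit link
        return (X * w[:, None]).T @ X

    def d_criterion(X, beta):
        sign, logdet = np.linalg.slogdet(logistic_info(X, beta))
        return logdet if sign > 0 else -np.inf

    rng = np.random.default_rng(0)

    def robust_score(X, beta_mean, beta_sd, n_draws=200):
        """Average log-det information over prior draws on beta."""
        draws = rng.normal(beta_mean, beta_sd, size=(n_draws, len(beta_mean)))
        return np.mean([d_criterion(X, b) for b in draws])

    # Example: score a 2^2 factorial (intercept + two factors) under a vague prior.
    factorial = np.array([[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1.0]])
    print(robust_score(factorial, beta_mean=np.array([0.0, 1.0, 1.0]), beta_sd=0.5))

A design search would then compare candidate exact designs by such scores, or by the simulated distribution of their efficiencies relative to locally optimal designs.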
Abstract:
Allocations of research funds across programs are often made for efficiency reasons. Social science research is shown to have small, lagged but significant effects on U.S. agricultural efficiency when public agricultural R&D and extension are simultaneously taken into account. Farm management and marketing research variables are used to explain variations in estimates of allocative and technical efficiency using a Bayesian approach that incorporates stylized facts concerning lagged research impacts in a way that is less restrictive than popular polynomial distributed lags. Results are reported in terms of means and standard deviations of estimated probability distributions of parameters and long-run total multipliers. Extension is estimated to have a greater impact on both allocative and technical efficiency than either R&D or social science research.
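For context, the polynomial distributed lag (Almon) restriction that the abstract's Bayesian approach relaxes forces the lag weights onto a low-degree polynomial; a minimal sketch, with illustrative variable names:

    import numpy as np

    def almon_weights(theta, n_lags):
        """Lag weights w_l = sum_j theta_j * l**j for l = 0..n_lags,
        i.e. the polynomial-distributed-lag (Almon) restriction."""
        l = np.arange(n_lags + 1)
        return sum(t * l**j for j, t in enumerate(theta))

    def distributed_lag_effect(x, theta, n_lags):
        """y_t = sum_l w_l * x_{t-l} applied to a research-expenditure series x;
        'valid' mode drops the first n_lags observations."""
        w = almon_weights(theta, n_lags)
        return np.convolve(x, w, mode="valid")

A quadratic polynomial (three theta coefficients) thus pins down, say, 20 lag weights, which is exactly the kind of restriction the stylized-facts prior in the abstract is designed to loosen.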
Abstract:
Using generalized collocation techniques based on fitting functions that are trigonometric (rather than algebraic, as in classical integrators), we develop a new class of multistage, one-step, variable-stepsize, variable-coefficient implicit Runge-Kutta methods to solve oscillatory ODE problems. The coefficients of the methods are functions of the frequency and the stepsize. We refer to this class as trigonometric implicit Runge-Kutta (TIRK) methods. They integrate an equation exactly if its solution is a trigonometric polynomial with a known frequency. We characterize the order and A-stability of the methods and establish results similar to those of classical algebraic collocation RK methods.
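As a sketch of the collocation idea behind such methods (not the authors' exact TIRK construction), the RK coefficients can be chosen so that the stage quadratures are exact on a trigonometric basis, which makes them functions of z = omega*h as the abstract states; the three-stage setup and node values below are illustrative assumptions:

    import numpy as np

    def tirk_coefficients(c, z):
        """Collocation-style RK coefficients exact on the trigonometric basis
        {1, cos(z t), sin(z t)} over the unit step, where z = omega*h couples
        the fitted frequency omega and the stepsize h.
        c: three distinct nodes in [0, 1]; requires z != 0
        (the z -> 0 limit recovers the algebraic/polynomial case)."""
        c = np.asarray(c, dtype=float)
        # G[k, j] = g_k(c_j) for g_0 = 1, g_1 = cos(z t), g_2 = sin(z t)
        G = np.vstack([np.ones_like(c), np.cos(z * c), np.sin(z * c)])
        def I(x):  # exact integrals of each basis function from 0 to x
            return np.array([x, np.sin(z * x) / z, (1.0 - np.cos(z * x)) / z])
        M = np.column_stack([I(ci) for ci in c])
        A = np.linalg.solve(G, M).T      # row i: sum_j A[i, j] g(c_j) = I(c_i)
        b = np.linalg.solve(G, I(1.0))   # weights: sum_j b[j] g(c_j) = I(1)
        return A, b

    # Illustrative nodes; b sums to 1 because the constant g_0 = 1 is in the basis.
    A, b = tirk_coefficients(np.array([0.15, 0.5, 0.85]), z=1.2)
    print(A, b)

Because the linear systems involve cos(z c_j) and sin(z c_j), the resulting A and b vary with both the frequency and the stepsize, which is the defining feature of the class.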
Abstract:
Hannenhalli and Pevzner developed the first polynomial-time algorithm for the combinatorial problem of sorting signed genomic data. Their algorithm computes the minimum number of reversals required to rearrange one genome into another when no gene is duplicated. In this paper, we show how to extend the Hannenhalli-Pevzner approach to genomes with multigene families. We propose a new heuristic algorithm that computes the reversal distance between two genomes with multigene families via binary integer programming, without removing gene duplicates. Experimental results on simulated and real biological data demonstrate that the proposed algorithm finds the reversal distance accurately.
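To illustrate the underlying combinatorial problem (not the paper's integer-programming heuristic), here is a brute-force breadth-first search that computes the exact reversal distance between two small signed genomes without duplicated genes:

    from collections import deque

    def reversal_distance(source, target):
        """Minimum number of signed reversals turning `source` into `target`,
        found by BFS over permutations. Exponential; only for tiny genomes
        without duplicated genes -- it illustrates the problem that
        Hannenhalli-Pevzner solve in polynomial time.
        A reversal of segment [i, j] reverses its order and flips its signs."""
        source, target = tuple(source), tuple(target)
        seen = {source}
        queue = deque([(source, 0)])
        while queue:
            perm, d = queue.popleft()
            if perm == target:
                return d
            n = len(perm)
            for i in range(n):
                for j in range(i, n):
                    seg = tuple(-g for g in reversed(perm[i:j + 1]))
                    nxt = perm[:i] + seg + perm[j + 1:]
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, d + 1))

    print(reversal_distance((+3, -1, +2), (+1, +2, +3)))  # small illustrative instance

With duplicated genes the gene-to-gene correspondence is no longer fixed, which is what makes the multigene-family version hard and motivates the binary integer programming formulation.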