952 results for Gegenbauer’s Polynomial
Abstract:
Currently, several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT), in which both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S ˃ the MTT problem becomes NP-hard. As of now, no algorithm exists that can solve an NP-hard problem optimally within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end, an Elitist Genetic Algorithm is implemented to approximately solve the S ˃ MTT problem in an efficient manner. Its complexity is studied, and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and heightened interest in the problem of space debris, the number of tracked objects is expected to grow by an order of magnitude in the near future. This research aims to provide a method that treats the correlation and orbit determination problems simultaneously and can efficiently process large data sets with minimal manual intervention.
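The abstract does not give the algorithm itself; the following is a minimal, generic sketch of an elitist genetic algorithm over permutation chromosomes (a common encoding for assignment-type association problems). All names and parameter values here are illustrative assumptions, not the thesis implementation:

```python
import random

def ox_crossover(a, b, rng):
    """Order crossover: keep a random slice of parent a, fill the
    remaining positions with the missing genes in parent b's order."""
    n = len(a)
    i, j = sorted(rng.sample(range(n), 2))
    middle = a[i:j]
    rest = [g for g in b if g not in middle]
    return rest[:i] + middle + rest[i:]

def elitist_ga(cost, n, pop_size=30, gens=100, elite=2, seed=0):
    """Minimal elitist GA minimizing `cost` over permutations of range(n).
    Elitism: the `elite` best individuals survive unchanged each
    generation, so the best association found so far is never lost."""
    rng = random.Random(seed)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        nxt = pop[:elite]                          # elites pass through
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:pop_size // 2], 2)  # fitter half breeds
            child = ox_crossover(a, b, rng)
            if rng.random() < 0.2:                 # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
    return min(pop, key=cost)
```

For a toy two-fence association with cost matrix C, cost(p) = sum of C[k][p[k]]; the returned permutation encodes which observation in the second fence is associated with each observation in the first.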
Abstract:
We prove exponential rates of convergence of hp-version discontinuous Galerkin (dG) interior penalty finite element methods for second-order elliptic problems with mixed Dirichlet-Neumann boundary conditions in axiparallel polyhedra. The dG discretizations are based on axiparallel, σ-geometric anisotropic meshes of mapped hexahedra and anisotropic polynomial degree distributions of μ-bounded variation. We consider piecewise analytic solutions which belong to a larger analytic class than those for the pure Dirichlet problem considered in [11, 12]. For such solutions, we establish the exponential convergence of a nonconforming dG interpolant given by local L^2-projections on elements away from corners and edges, and by suitable local low-order quasi-interpolants on elements at corners and edges. Due to the appearance of non-homogeneous, weighted norms in the analytic regularity class, new arguments are introduced to bound the dG consistency errors in elements abutting on Neumann edges. The non-homogeneous norms also entail some crucial modifications of the stability and quasi-optimality proofs, as well as of the analysis for the anisotropic interpolation operators. The exponential convergence bounds for the dG interpolant constructed in this paper generalize the results of [11, 12] for the pure Dirichlet case.
Abstract:
Steiner’s tube formula states that the volume of an ϵ-neighborhood of a smooth regular domain in R^n is a polynomial of degree n in the variable ϵ whose coefficients are curvature integrals (also called quermassintegrals). We prove a similar result in the sub-Riemannian setting of the first Heisenberg group. In contrast to the Euclidean setting, we find that the volume of an ϵ-neighborhood with respect to the Heisenberg metric is an analytic function of ϵ that is generally not a polynomial. The coefficients of the series expansion can be explicitly written in terms of integrals of iteratively defined canonical polynomials of just five curvature terms.
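For reference, the Euclidean tube formula that the abstract contrasts with can be written in its standard form (conventional notation, assumed rather than quoted from the paper):

```latex
\mathrm{vol}\left(\Omega_\epsilon\right)
  = \sum_{k=0}^{n} \binom{n}{k} W_k(\Omega)\,\epsilon^k ,
```

where \(\Omega_\epsilon\) is the \(\epsilon\)-neighborhood of \(\Omega \subset \mathbb{R}^n\) and the \(W_k(\Omega)\) are the quermassintegrals, with \(W_0(\Omega)\) the volume of \(\Omega\). The Heisenberg-group result replaces this finite sum by a full power series in \(\epsilon\).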
Abstract:
A three-level satellite-to-ground monitoring scheme for conservation easement monitoring has been implemented in which high-resolution imagery serves as an intermediate step for inspecting high-priority sites. A digital vertical aerial camera system was developed to fulfill the need for an economical source of imagery for this intermediate step. A method for attaching the camera system to small aircraft was designed, and the camera system was calibrated and tested. To ensure that the images obtained were of suitable quality for use in Level 2 inspections, rectified imagery was required to provide positional accuracy of 5 meters or less, comparable to current commercially available high-resolution satellite imagery. Focal length calibration was performed to determine the infinity focal length at two lens settings (24mm and 35mm) with a precision of 0.1mm. A known focal length is required for the creation of navigation points representing locations to be photographed (waypoints). Photographing an object of known size at distances on a test range allowed estimates of focal lengths of 25.1mm and 35.4mm for the 24mm and 35mm lens settings, respectively. Constants required for distortion removal procedures were obtained using analytical plumb-line calibration procedures for both lens settings, with mild distortion at the 24mm setting and virtually no distortion found at the 35mm setting. The system was designed to operate in a series of stages: mission planning, mission execution, and post-mission processing. During mission planning, waypoints were created using custom tools in geographic information system (GIS) software. During mission execution, the camera is connected to a laptop computer with a global positioning system (GPS) receiver attached. Customized mobile GIS software accepts position information from the GPS receiver, provides information for navigation, and automatically triggers the camera upon reaching the desired location.
Post-mission processing (rectification) of imagery, comprising removal of lens distortion effects, correction for horizontal displacement due to terrain variations (relief displacement), and relating the images to ground coordinates, was performed with no more than a second-order polynomial warping function. Accuracy testing was performed to verify the positional accuracy capabilities of the system in an ideal-case scenario as well as a real-world case. Using many well-distributed and highly accurate control points on flat terrain, the rectified images yielded a median positional accuracy of 0.3 meters. Imagery captured over commercial forestland with varying terrain in eastern Maine, rectified to digital orthophoto quadrangles, yielded median positional accuracies of 2.3 meters, with accuracies of 3.1 meters or better in 75 percent of measurements made. These accuracies were well within performance requirements. The images from the digital camera system are of high quality, displaying significant detail at common flying heights. At common flying heights the ground resolution of the camera system ranges between 0.07 meters and 0.67 meters per pixel, satisfying the requirement that imagery be of comparable resolution to current high-resolution satellite imagery. Given the high resolution of the imagery, the positional accuracy attainable, and the convenience with which it is operated, the digital aerial camera system developed is a potentially cost-effective solution for use in the intermediate step of a satellite-to-ground conservation easement monitoring scheme.
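A second-order polynomial warp of the kind used in the rectification step can be fitted to ground control points by ordinary least squares. This numpy-based sketch (hypothetical function names, not the thesis software) also computes the positional RMSE used in the accuracy tests:

```python
import numpy as np

def poly2_terms(x, y):
    """Design matrix for a second-order polynomial warp in x, y."""
    return np.column_stack([np.ones_like(x), x, y, x*y, x**2, y**2])

def fit_warp(src, dst):
    """Least-squares fit of a 2nd-order polynomial mapping src -> dst.
    src, dst: (n, 2) arrays of control-point coordinates; needs n >= 6."""
    A = poly2_terms(src[:, 0], src[:, 1])
    cx, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return cx, cy

def apply_warp(cx, cy, pts):
    """Map image points through the fitted warp to ground coordinates."""
    A = poly2_terms(pts[:, 0], pts[:, 1])
    return np.column_stack([A @ cx, A @ cy])

def rmse(a, b):
    """Root-mean-square positional error between two point sets."""
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))
```

Because the second-order model contains the affine terms, any first-order displacement of the control points is reproduced exactly; the residual RMSE then measures only higher-order distortion and measurement error.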
Abstract:
An introduction to Legendre polynomials as a precursor to studying angular momentum in quantum chemistry.
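For concreteness, Legendre polynomials are conveniently evaluated with Bonnet's three-term recurrence; a minimal illustrative sketch (not from the text):

```python
def legendre(n, x):
    """Evaluate P_n(x) via Bonnet's recurrence:
    (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x),
    with P_0 = 1 and P_1 = x."""
    if n == 0:
        return 1.0
    p_prev, p = 1.0, x
    for k in range(1, n):
        p_prev, p = p, ((2*k + 1) * x * p - k * p_prev) / (k + 1)
    return p
```

The same recurrence structure carries over to the associated Legendre functions that appear in the angular momentum eigenfunctions.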
Abstract:
The Frobenius solution to the differential equation associated with the quantum-mechanical harmonic oscillator is carried out in detail.
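As a reminder of the standard computation (conventional notation, assumed rather than quoted from the text): in the dimensionless oscillator equation, substituting off the Gaussian asymptotic factor yields Hermite's equation, whose power-series (Frobenius) solution gives a two-term recurrence:

```latex
\frac{d^2\psi}{d\xi^2} + (\varepsilon - \xi^2)\,\psi = 0,
\qquad \psi(\xi) = h(\xi)\, e^{-\xi^2/2}
\;\Longrightarrow\;
h'' - 2\xi h' + (\varepsilon - 1)\, h = 0,
\qquad
a_{k+2} = \frac{2k + 1 - \varepsilon}{(k+1)(k+2)}\, a_k .
```

Termination of the series (\(a_{k+2} = 0\) at \(k = n\)) forces \(\varepsilon = 2n + 1\), i.e. the quantized energies \(E_n = \left(n + \tfrac{1}{2}\right)\hbar\omega\).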
Abstract:
A single-issue spatial election is a voter preference profile derived from an arrangement of candidates and voters on a line, with each voter preferring the nearer of each pair of candidates. We provide a polynomial-time algorithm that determines whether a given preference profile is a single-issue spatial election and, if so, constructs such an election. This result also has preference representation and mechanism design applications.
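The forward direction of this problem, deriving the preference profile induced by given positions on a line, is straightforward; a small illustrative sketch (hypothetical names; the paper's algorithm solves the much harder inverse direction):

```python
def spatial_profile(voters, candidates):
    """Preference profile induced by positions on a line: each voter
    ranks candidate indices by increasing distance, breaking exact
    ties in favor of the leftmost candidate."""
    return [sorted(range(len(candidates)),
                   key=lambda c: (abs(candidates[c] - v), candidates[c]))
            for v in voters]
```

The paper's polynomial-time algorithm decides whether a given profile equals spatial_profile(voters, candidates) for some placement of voters and candidates, and constructs such a placement when one exists.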
Abstract:
The radial part of the Schrödinger equation for the H-atom's electron involves Laguerre polynomials; hence this introduction.
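Laguerre polynomials, like Legendre polynomials, satisfy a three-term recurrence that is convenient for numerical evaluation; a minimal illustrative sketch (not from the text):

```python
def laguerre(n, x):
    """Evaluate L_n(x) via the three-term recurrence
    (k+1) L_{k+1}(x) = (2k+1-x) L_k(x) - k L_{k-1}(x),
    with L_0 = 1 and L_1 = 1 - x."""
    if n == 0:
        return 1.0
    l_prev, l = 1.0, 1.0 - x
    for k in range(1, n):
        l_prev, l = l, ((2*k + 1 - x) * l - k * l_prev) / (k + 1)
    return l
```

The associated Laguerre polynomials appearing in the hydrogen radial wavefunctions obey an analogous recurrence.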
Abstract:
A standard treatment of aspects of Legendre polynomials is given here, including the dipole moment expansion, generating functions, etc.
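The generating function referred to above, 1/sqrt(1 - 2xt + t^2) = sum over n of P_n(x) t^n for |t| < 1, can be checked numerically; an illustrative sketch (names chosen for illustration):

```python
import math

def gen_func_check(x, t, terms=40):
    """Compare the truncated series sum_n P_n(x) t^n, with P_n computed
    by Bonnet's recurrence, against the closed-form generating function
    1/sqrt(1 - 2xt + t^2); valid for |x| <= 1 and |t| < 1."""
    p_prev, p = 1.0, x          # P_0, P_1
    total = p_prev + p * t
    tn = t
    for k in range(1, terms):
        p_prev, p = p, ((2*k + 1) * x * p - k * p_prev) / (k + 1)
        tn *= t
        total += p * tn
    return total, 1.0 / math.sqrt(1.0 - 2.0*x*t + t*t)
```

Since |P_n(x)| <= 1 on [-1, 1], the truncation error is bounded by a geometric tail in |t|, so 40 terms agree with the closed form to machine precision for moderate t.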
Abstract:
A characterization of a property of binary relations is of finite type if it is stated in terms of ordered T-tuples of alternatives for some positive integer T. A characterization of finite type can be used to determine in polynomial time whether a binary relation over a finite set has the property characterized. Unfortunately, Pareto representability in R^2 has no characterization of finite type (Knoblauch, 2002). This result is generalized below to R^l, l larger than 2. The method of proof is applied to other properties of binary relations.
Abstract:
The joint modeling of longitudinal and survival data is a new approach in many applications such as HIV studies, cancer vaccine trials, and quality-of-life studies. There have been recent developments in the methodologies for each component of the joint model, as well as in the statistical processes that link them together. Among these, second-order polynomial random effect models and linear mixed effects models are the most commonly used for the longitudinal trajectory function. In this study, we first relax the parametric constraints of polynomial random effect models by using Dirichlet process priors, and consider three longitudinal markers rather than only one in a single joint model. Second, we use a linear mixed effect model for the longitudinal process in a joint model analyzing the three markers. These methods were applied to the Primary Biliary Cirrhosis sequential data, collected from a clinical trial of primary biliary cirrhosis (PBC) of the liver conducted between 1974 and 1984 at the Mayo Clinic. The effects of three longitudinal markers, (1) Total Serum Bilirubin, (2) Serum Albumin, and (3) Serum Glutamic-Oxaloacetic Transaminase (SGOT), on patients' survival were investigated. The proportion of treatment effect was also studied using the proposed joint modeling approaches. Based on the results, we conclude that the proposed modeling approaches yield a better fit to the data and give less biased parameter estimates for these trajectory functions than previous methods. Model fit is also improved after considering three longitudinal markers instead of only one. The analysis of the proportion of treatment effect from these joint models supports the same conclusion as the final model of Fleming and Harrington (1991): Bilirubin and Albumin together have a stronger impact in predicting patients' survival and as surrogate endpoints for treatment.
Abstract:
Objectives. This paper seeks to assess the effect of regression model misspecification on statistical power in a variety of situations. Methods and results. The effect of misspecification in regression can be approximated by evaluating the correlation between the correct specification and the misspecification of the outcome variable (Harris 2010). In this paper, three misspecified models (linear, categorical, and fractional polynomial) were considered. In the first section, the mathematical method of calculating the correlation between correct and misspecified models with simple mathematical forms is derived and demonstrated. In the second section, data from the National Health and Nutrition Examination Survey (NHANES 2007-2008) were used to examine such correlations. Our study shows that, compared to linear or categorical models, the fractional polynomial models, with their higher correlations, provided a better approximation of the true relationship, as illustrated by LOESS regression. In the third section, we present the results of simulation studies demonstrating that misspecification in regression can produce marked decreases in power with small sample sizes. However, the categorical model had the greatest power, ranging from 0.877 to 0.936 depending on sample size and the outcome variable used. The power of the fractional polynomial model was close to that of the linear model, which ranged from 0.69 to 0.83, and appeared to be affected by the increased degrees of freedom of this model. Conclusion. Correlations between alternative model specifications can be used to provide a good approximation of the effect of misspecification on statistical power when the sample size is large. When model specifications have known simple mathematical forms, such correlations can be calculated mathematically. Actual public health data from NHANES 2007-2008 were used as examples to demonstrate situations with an unknown or complex correct model specification. Simulation of power for misspecified models confirmed the results based on correlation methods, but also illustrated the effect of model degrees of freedom on power.
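The correlation between a correct and a misspecified functional form, the quantity on which the power approximation above rests, is easy to compute on a sample; an illustrative sketch (hypothetical names; a quadratic truth misspecified as linear):

```python
import numpy as np

def spec_correlation(f_true, f_mis, x):
    """Correlation between the correctly specified and the misspecified
    functional form of a predictor, evaluated on a sample x. High
    correlation suggests the misspecified model loses little power."""
    return float(np.corrcoef(f_true(x), f_mis(x))[0, 1])
```

For example, a quadratic truth fitted as linear on Uniform(0, 1) data retains a correlation of about 0.97, so relatively little power is lost; the correlation, and hence power, drops for more sharply nonlinear truths.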
Abstract:
The hierarchical linear growth model (HLGM), as a flexible and powerful analytic method, has played an increasingly important role in psychology, public health, and the medical sciences in recent decades. Mostly, researchers who conduct HLGM are interested in the treatment effect on individual trajectories, which can be indicated by cross-level interaction effects. However, the statistical hypothesis test for a cross-level interaction effect in HLGM only shows whether there is a significant group difference in the average rate of change, rate of acceleration, or a higher polynomial effect; it fails to convey information about the magnitude of the difference between the group trajectories at a specific time point. Thus, reporting and interpreting effect sizes have received increasing emphasis in HLGM in recent years, owing to the limitations of, and growing criticism of, statistical hypothesis testing. However, most researchers fail to report these model-implied effect sizes for group trajectory comparisons and their corresponding confidence intervals in HLGM analyses, because of the lack of appropriate, standard functions to estimate effect sizes associated with the model-implied difference between group trajectories in HLGM, and the lack of computing packages in popular statistical software to calculate them automatically. The present project is the first to establish appropriate computing functions to assess the standardized difference between group trajectories in HLGM. We proposed two functions to estimate effect sizes for the model-based difference between group trajectories at a specific time, and we also suggested robust effect sizes to reduce the bias of the estimated effect sizes. We then applied the proposed functions to estimate the population effect sizes (d) and robust effect sizes (du) for the cross-level interaction in HLGM using three simulated datasets, compared three methods of constructing confidence intervals around d and du, and recommended the best one for application. Finally, we constructed 95% confidence intervals, using the most suitable method, for the effect sizes obtained with the three simulated datasets. The effect sizes between group trajectories for the three simulated longitudinal datasets indicated that even when the statistical hypothesis test shows no significant difference between group trajectories, effect sizes between these trajectories can still be large at some time points. Therefore, effect sizes between group trajectories in HLGM analysis provide additional, meaningful information for assessing the group effect on individual trajectories. In addition, we compared three methods of constructing 95% confidence intervals around the corresponding effect sizes, which address the uncertainty of the effect sizes as estimates of the population parameter. We suggest the noncentral t-distribution based method when its assumptions hold, and the bootstrap bias-corrected and accelerated method when the assumptions are not met.
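The kind of model-implied effect size described, a standardized difference between two group trajectories at a chosen time, reduces to evaluating the cross-level interaction polynomial at that time and dividing by an outcome standard deviation. A minimal illustrative sketch (hypothetical names, not the authors' proposed functions):

```python
def trajectory_effect_size(diff_coefs, t, sd):
    """Standardized model-implied difference between two group
    trajectories at time t: the cross-level interaction polynomial
    (coefficients for 1, t, t^2, ...) evaluated at t, divided by a
    pooled outcome standard deviation."""
    diff = sum(b * t**k for k, b in enumerate(diff_coefs))
    return diff / sd
```

For example, with a pure slope difference of 2 per unit time and a pooled SD of 4, the groups are 1.5 SDs apart at t = 3, which can be substantial even when the slope test itself is underpowered.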
Abstract:
A new method was applied for the objective evaluation of color in table olives, based on the analysis of the reflection intensity of each of the primary colors that make up white light (red, green, and blue), according to the wavelengths of the RGB system. Computer programs for the analysis of 24-bit BMP color digital images were used. This work provides further information on the browning of natural olives in brine, which would be very useful for increasing the effectiveness of the process. The proposed method is fast and non-destructive, and promises to be very practical since it allows the same sample to be evaluated over time. Color changes were investigated in naturally processed olives of different degrees of ripeness (turning-color, red, and black) and at different pH values (3.6, 4.0, 4.5), exposed to air for increasing periods of time. The degree of darkening was quantified through Reflection Intensity Indices. The evolution of the reflection index as a function of time produced a 4th-degree polynomial curve that revealed the sigmoidal behavior of the enzymatic browning phenomenon, with maximum correlation at 8 hours of aeration. This function would make it possible to predict browning in black olives and represents an objective measurement of the relative degree of browning. The evolution of the red color (λ = 700.0 nm) exhibited the highest correlation with the browning process. Natural red olives at pH 4.5 showed optimal browning. The reflection spectrum for the blue color (λ = 435.8 nm) is suggested as a measure of the activity of the enzyme PPO (polyphenol oxidase).
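The 4th-degree polynomial description of the reflection index against aeration time amounts to a standard least-squares polynomial fit; an illustrative sketch (hypothetical names, with synthetic sigmoidal data standing in for the study's measurements):

```python
import numpy as np

def browning_poly(hours, index, degree=4):
    """Least-squares polynomial (degree 4, as in the study) describing
    the reflection index as a function of aeration time; the fitted
    curve tracks the sigmoidal browning kinetics."""
    return np.poly1d(np.polyfit(hours, index, degree))
```

The returned np.poly1d object can be evaluated at any time point to estimate the relative degree of browning between measurements.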
Abstract:
The Palestine Exploration Fund (PEF) Survey of Western Palestine (1871-1877) is highly praised for its accuracy and completeness; the first systematic analysis of its planimetric accuracy was published by Levin (2006). To study the potential of these 1:63,360 maps for a quantitative analysis of land cover changes over time, Levin compared them to 20th century topographic maps. The map registration error of the PEF maps was 74.4 m, using 123 control points at trigonometrical stations and a first-order polynomial. The median RMSE of all control and test points (n = 1104) was 153.6 m. Following the georeferencing of each of the 26 sheets of the PEF maps of the Survey of Western Palestine, a mosaicked file was created. Care should be taken when analysing historical maps, as it cannot be assumed that their accuracy is consistent across different parts or for different features depicted on them.
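A first-order polynomial registration of the kind used for the PEF sheets is an affine least-squares fit to control points, with the map registration error reported as an RMSE over those points; an illustrative numpy sketch (hypothetical names, not Levin's workflow):

```python
import numpy as np

def affine_fit(src, dst):
    """First-order polynomial (affine) georeferencing: least-squares
    map from map coordinates src to ground coordinates dst, both given
    as (n, 2) arrays with n >= 3 control points."""
    A = np.column_stack([np.ones(len(src)), src])
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coef                       # shape (3, 2): offset and 2x2 linear part

def registration_rmse(coef, src, dst):
    """Root-mean-square registration error of the fitted transform
    over a set of control or independent test points."""
    A = np.column_stack([np.ones(len(src)), src])
    err = A @ coef - dst
    return float(np.sqrt(np.mean(np.sum(err**2, axis=1))))
```

Evaluating the RMSE on independent test points rather than the fitting control points, as in the n = 1104 figure above, gives the more honest accuracy estimate.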