205 results for SPLINES


Relevance: 10.00%
Publisher:
Abstract:

In many designed experiments with animals, liveweight is recorded several times during the trial. Such data are commonly referred to as repeated measures data. An aim of such experiments is generally to compare the growth patterns under the applied treatments. This paper discusses some of the methods of analysing repeated measures data and illustrates the use of cubic smoothing splines to describe irregular cattle growth data. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies (AAAP), the 23rd Biennial Conference of the Australian Society of Animal Production (ASAP) and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation (DRF), 2-7 July 2000, Sydney, Australia.
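As a concrete illustration of the smoothing-spline approach, the sketch below fits a cubic smoothing spline to hypothetical liveweight data with SciPy; the days, weights and smoothing factor `s` are assumed values, not data from the trial.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical repeated-measures liveweight data (kg) for one animal;
# the values and the smoothing factor s are illustrative assumptions.
days = np.array([0, 14, 28, 42, 63, 84, 105, 126], dtype=float)
weight = np.array([210, 225, 236, 251, 270, 284, 301, 318], dtype=float)

# Cubic (k=3) smoothing spline; s trades fidelity for smoothness and
# would normally be tuned, e.g. by cross-validation.
spline = UnivariateSpline(days, weight, k=3, s=25.0)

grid = np.linspace(0, 126, 50)
growth_curve = spline(grid)               # smoothed growth curve
growth_rate = spline.derivative()(grid)   # instantaneous growth rate (kg/day)
```

The derivative of the fitted spline gives a smooth estimate of growth rate, which is often the quantity compared across treatments.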

Spot measurements of methane emission rate (n = 18 700) by 24 Angus steers fed mixed rations from GrowSafe feeders were made over 3- to 6-min periods by a GreenFeed emission monitoring (GEM) unit. The data were analysed to estimate daily methane production (DMP; g/day) and derived methane yield (MY; g/kg dry matter intake (DMI)). A one-compartment dose model of spot emission rate v. time since the preceding meal was compared with the models of Wood (1967) and Dijkstra et al. (1997) and with the average of spot measures. Fitted values for DMP were calculated from the area under the curves. Two methods of relating methane and feed intakes were then studied: the classical calculation of MY as DMP/DMI (kg/day), and a novel method of estimating DMP from the time and size of preceding meals using either the data for only the two meals preceding a spot measurement, or all meals over the prior 3 days. Two approaches were also used to estimate DMP from spot measurements: fitting splines on a per-animal per-day basis, and an alternative approach of modelling DMP after each feed event by least squares (using Solver), summing (for each animal) the contributions from each feed event by best-fitting a one-compartment model. Time since the preceding meal was of limited value in estimating DMP. Even when the meal sizes and time intervals between a spot measurement and all feeding events in the previous 72 h were assessed, only 16.9% of the variance in spot emission rate measured by GEM was explained by this feeding information. While using the preceding meal alone gave a biased (under-)estimate of DMP, allowing for a longer feed history removed this bias. A power analysis taking into account the sources of variation in DMP indicated that obtaining an estimate of DMP with a 95% confidence interval within 5% of the observed 64-day mean of spot measures would require 40 animals measured over 45 days (two spot measurements per day) or 30 animals measured over 55 days. These numbers suggest that spot measurements could be made in association with feed efficiency tests made over 70 days. Spot measurements of enteric emissions can be used to estimate DMP, but the numbers of animals and samples required are larger than when day-long measures are made.
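A minimal sketch of least-squares fitting of a one-compartment model of spot emission rate versus time since the preceding meal, using SciPy's `curve_fit` in place of Solver; the functional form, parameter values and noise level are illustrative assumptions, not the study's fitted model.

```python
import numpy as np
from scipy.optimize import curve_fit

# One-compartment response: emission rate rises after a meal, then decays.
# All numbers below are illustrative assumptions.
def one_compartment(t, a, k1, k2):
    return a * (np.exp(-k2 * t) - np.exp(-k1 * t))

rng = np.random.default_rng(0)
t = np.linspace(0.1, 12.0, 40)                 # hours since the last meal
rate = one_compartment(t, 30.0, 2.0, 0.3) + rng.normal(0.0, 1.0, t.size)

params, _ = curve_fit(one_compartment, t, rate, p0=(20.0, 1.0, 0.2))

# DMP estimated as the area under the fitted curve over 24 h.
grid = np.linspace(0.0, 24.0, 1000)
dmp = np.trapz(one_compartment(grid, *params), grid)
```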

This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models for compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average prediction error) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
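Of the three learning techniques evaluated, plain linear regression is the easiest to sketch. The toy model below learns program performance as a function of binary optimization-flag settings, including one explicit flag interaction; the flags, coefficients and data are synthetic assumptions, not SPEC CPU2000 measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n_flags, n_runs = 6, 40

# Sampled compiler configurations (each flag on/off).
X = rng.integers(0, 2, size=(n_runs, n_flags)).astype(float)

# Hypothetical "measured" performance: main effects plus one interaction.
y = 100 - 5 * X[:, 0] - 3 * X[:, 1] + 4 * X[:, 0] * X[:, 2] \
    + rng.normal(0, 0.5, n_runs)

# Linear regression with an explicit pairwise-interaction column.
design = np.column_stack([np.ones(n_runs), X, X[:, 0] * X[:, 2]])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

# Predict performance at an unmeasured configuration.
x_new = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
pred = coef[0] + coef[1:1 + n_flags] @ x_new + coef[-1] * x_new[0] * x_new[2]
```

The same fitted coefficients also quantify the interaction term directly, which is the paper's second use of the models.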

We propose a family of 3D versions of a smooth finite element method (Sunilkumar and Roy 2010), wherein the globally smooth shape functions are derivable through the condition of polynomial reproduction, with tetrahedral B-splines (DMS-splines) or tensor-product forms of triangular B-splines and 1D NURBS bases acting as the kernel functions. While the domain decomposition is accomplished through tetrahedral or triangular prism elements, an additional requirement here is an appropriate generation of knotclouds around the element vertices or corners. The possibility of sensitive dependence of numerical solutions on the placement of knotclouds is largely arrested by enforcing the condition of polynomial reproduction whilst deriving the shape functions. Nevertheless, given the higher complexity of forming the knotclouds for tetrahedral elements, especially when higher demands are placed on the order of continuity of the shape functions across inter-element boundaries, we presently emphasize an exploration of the triangular-prism-based formulation in the context of several benchmark problems of interest in linear solid mechanics. Pending a more rigorous convergence analysis, the numerical exercise reported herein helps establish the method as one of remarkable accuracy and robust performance against numerical ill-conditioning (such as locking of different kinds) vis-à-vis the conventional FEM.

We address the problem of computing the level-crossings of an analog signal from samples measured on a uniform grid. This problem is important, for example, in multilevel analog-to-digital (A/D) converters. The first operation in such sampling modalities is a comparator, which gives rise to a bilevel waveform. Since bilevel signals are not bandlimited, measuring the level-crossing times exactly becomes impractical within the conventional framework of Shannon sampling. In this paper, we propose a novel sub-Nyquist sampling technique for making measurements on a uniform grid and thereby exactly computing the level-crossing times from those samples. The computational complexity of the technique is low, comprising simple arithmetic operations. We also present a finite-rate-of-innovation sampling perspective of the proposed approach and show how exponential splines fit naturally into the proposed sampling framework. Finally, we discuss some concrete practical applications of the sampling technique.
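For contrast with the exact sub-Nyquist technique, a naive baseline estimates level-crossing times by detecting sign changes of the sampled signal about the level and refining by linear interpolation. This is not the paper's method; it only illustrates the problem being solved.

```python
import numpy as np

def level_crossings(t, x, level):
    """Estimate level-crossing times by sign change + linear interpolation."""
    d = x - level
    idx = np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0]
    frac = d[idx] / (d[idx] - d[idx + 1])
    return t[idx] + frac * (t[idx + 1] - t[idx])

t = np.linspace(0, 1, 200)
x = np.sin(2 * np.pi * 3 * t)        # 3 Hz sine: interior zeros at k/6
crossings = level_crossings(t, x, 0.0)
```

The interpolation error of this baseline depends on the sampling density, whereas the paper's point is that exact crossing times are recoverable from far fewer measurements.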

This paper introduces a scheme for classification of online handwritten characters based on polynomial regression of the sampled points of the sub-strokes in a character. The segmentation is based on the velocity profile of the written character, which requires smoothing of the velocity profile. We propose a novel scheme for smoothing the velocity profile curve and identifying the critical points at which to segment the character. We also propose another segmentation method based on human visual perception. We then extract two sets of features for recognition of handwritten characters. Each sub-stroke is a simple curve, a part of the character, and is represented by the distance of each point from the first point; this forms the first feature vector for each character. The second feature vector consists of the coefficients of B-splines fitted to the control knots obtained from the segmentation algorithm. The feature vectors are fed to an SVM classifier, which yields an accuracy of 68% using the polynomial regression technique and 74% using the spline fitting method.
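Both feature sets can be sketched with SciPy's parametric spline fit; the stroke below is a synthetic arc, and using the `splprep` coefficients directly as features is an assumption about the paper's pipeline.

```python
import numpy as np
from scipy.interpolate import splprep

# Synthetic sub-stroke: sampled (x, y) points along an arc.
theta = np.linspace(0, np.pi, 25)
x, y = np.cos(theta), np.sin(theta)

# First feature set: distance of each point from the first point.
dist = np.hypot(x - x[0], y - y[0])

# Second feature set: coefficients of a parametric cubic B-spline fit;
# s bounds the approximation error.
(tck, u) = splprep([x, y], k=3, s=1e-4)
knots, coeffs, degree = tck
features = np.concatenate([coeffs[0], coeffs[1]])
```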

We present a new class of continuously defined parametric snakes using a special kind of exponential splines as basis functions. We have enforced our bases to have the shortest possible support subject to some design constraints to maximize efficiency. While the resulting snakes are versatile enough to provide a good approximation of any closed curve in the plane, their most important feature is the fact that they admit ellipses within their span. Thus, they can perfectly generate circular and elliptical shapes. These features are appropriate to delineate cross sections of cylindrical-like conduits and to outline blob-like objects. We address the implementation details and illustrate the capabilities of our snake with synthetic and real data.
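The key claim is that exponential-spline bases reproduce ellipses exactly, which polynomial splines cannot. The sketch below quantifies what an ordinary periodic cubic spline achieves with 8 control points on an ellipse; the curve and point count are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

M = 8
theta = np.linspace(0, 2 * np.pi, M + 1)
ctrl = np.column_stack([3 * np.cos(theta), 2 * np.sin(theta)])  # ellipse a=3, b=2
ctrl[-1] = ctrl[0]                       # close the curve exactly

# Periodic cubic spline through the control points (a polynomial "snake").
snake = CubicSpline(theta, ctrl, bc_type="periodic")
t = np.linspace(0, 2 * np.pi, 400)
curve = snake(t)

# Maximum deviation from the true ellipse: small but nonzero, whereas an
# exponential-spline basis would reproduce the ellipse exactly.
err = np.max(np.abs(np.hypot(curve[:, 0] / 3, curve[:, 1] / 2) - 1.0))
```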

In this paper, the size-dependent linear free flexural vibration behavior of functionally graded (FG) nanoplates is investigated using the isogeometric finite element method. The field variables are approximated by non-uniform rational B-splines. The nonlocal constitutive relation is based on Eringen's differential form of nonlocal elasticity theory. The material properties are assumed to vary only in the thickness direction, and the effective properties of the FG plate are computed using the Mori-Tanaka homogenization scheme. The accuracy of the present formulation is demonstrated on problems for which solutions are available. A detailed numerical study is carried out to examine the effects of the material gradient index, the characteristic internal length, the plate thickness, the plate aspect ratio and the boundary conditions on the global response of the FG nanoplate. This study shows that the fundamental frequency decreases with increasing gradient index and characteristic internal length. (c) 2012 Elsevier B.V. All rights reserved.

An iterative image reconstruction technique employing a B-spline potential function in a Bayesian framework is proposed for fluorescence microscopy images. B-splines are piecewise polynomials with smooth transitions and compact support, and are the shortest polynomial splines. Incorporating the B-spline potential function into the maximum-a-posteriori reconstruction technique resulted in improved contrast, enhanced resolution and substantial background reduction. The proposed technique is validated on simulated data as well as on images acquired from fluorescence microscopes (widefield, confocal laser scanning fluorescence and super-resolution 4Pi microscopy). A comparative study of the proposed technique with the state-of-the-art maximum likelihood (ML) and maximum-a-posteriori (MAP) techniques with a quadratic potential function shows its superiority over the others. The B-spline MAP technique can find applications in several imaging modalities of fluorescence microscopy, such as selective plane illumination microscopy, localization microscopy and STED. (C) 2015 Author(s).

In gross motion of flexible one-dimensional (1D) objects such as cables, ropes, chains, ribbons and hair, the assumption of constant length is realistic and reasonable. The motion of the object also appears more natural if the motion or disturbance given at one end attenuates along the length of the object. In an earlier work, variational calculus was used to derive natural and length-preserving transformations of planar and spatial curves, implemented for flexible 1D objects discretized with a large number of straight segments. This paper proposes a novel idea to reduce computational effort and enable real-time and realistic simulation of the motion of flexible 1D objects. The key idea is to represent the flexible 1D object as a spline and to move the underlying control polygon, which has a much smaller number of segments. To preserve the length of the curve to within a prescribed tolerance as the control polygon is moved, the control polygon is adaptively modified by subdivision and merging. New theoretical results relating the length of the curve to the angle between adjacent segments of the control polygon are derived for quadratic and cubic splines. Depending on the prescribed tolerance on length error, the theoretical results are used to obtain threshold angles for subdivision and merging. Simulation results for arbitrarily chosen planar and spatial curves whose one end is subjected to generic input motions are provided to illustrate the approach. (C) 2016 Elsevier Ltd. All rights reserved.
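The relationship between curve length and control-polygon angle can be checked numerically for a quadratic spline (a single Bézier segment here); the 20° angle and unit segment lengths are illustrative, and the paper's analytical thresholds are not reproduced.

```python
import numpy as np
from scipy.interpolate import BSpline

def polygon_and_curve_length(angle_deg):
    """Two unit control-polygon segments meeting at the given turn angle."""
    a = np.deg2rad(angle_deg)
    ctrl = np.array([[0.0, 0.0], [1.0, 0.0], [1.0 + np.cos(a), np.sin(a)]])
    # Quadratic B-spline with Bezier knots over [0, 1].
    spl = BSpline(np.array([0.0, 0, 0, 1, 1, 1]), ctrl, 2)
    pts = spl(np.linspace(0.0, 1.0, 2000))
    curve_len = np.sum(np.hypot(*np.diff(pts, axis=0).T))
    return 2.0, curve_len

poly_len, curve_len = polygon_and_curve_length(20.0)
length_error = poly_len - curve_len      # shrinks as the angle flattens
```

The curve is always shorter than its control polygon (variation diminishing), and the gap shrinks as the angle between segments flattens, which is what the subdivision/merging thresholds exploit.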

The main objective of this final-year project is to design a trajectory interpolator and program it in Labview. To this end, the kinematics of the mechanism to be used, a 5R parallel-kinematics robot, is first analysed and its workspace computed. Several velocity profiles (trapezoidal velocity, trapezoidal acceleration and sinusoidal) are then derived and programmed for straight-line moves, along with motion along curves by means of splines. The characteristics of the available motors are also determined experimentally, and the maximum velocities the mechanism can reach are established. The result is software for generating trajectories for the 5R robot. The project budget and the risks that may be incurred are also presented, among other items. The document closes with annexes containing CAD drawings, results and program code.
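A minimal sketch of the trapezoidal velocity profile used for straight-line moves, assuming illustrative limits rather than the 5R robot's measured motor characteristics; it falls back to a triangular profile when the distance is too short to reach cruise velocity.

```python
import numpy as np

def trapezoidal_profile(distance, v_max, a_max, n=500):
    """Accelerate at a_max, cruise at v_max, decelerate; returns (t, v)."""
    t_acc = v_max / a_max
    d_acc = 0.5 * a_max * t_acc**2
    if 2 * d_acc > distance:             # triangular case: never reaches v_max
        t_acc = np.sqrt(distance / a_max)
        v_max = a_max * t_acc
        d_acc = distance / 2
    t_cruise = (distance - 2 * d_acc) / v_max
    t_total = 2 * t_acc + t_cruise
    t = np.linspace(0.0, t_total, n)
    v = np.where(t < t_acc, a_max * t,
        np.where(t < t_acc + t_cruise, v_max, a_max * (t_total - t)))
    return t, v

t, v = trapezoidal_profile(distance=0.2, v_max=0.5, a_max=2.0)  # m, m/s, m/s^2
```

Integrating the returned velocity over time recovers the commanded distance, which is the invariant any interpolator of this kind must preserve.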

This dissertation is concerned with the development of a new discrete element method (DEM) based on Non-Uniform Rational Basis Splines (NURBS). With NURBS, the new DEM is able to capture sphericity and angularity, the two particle morphological measures used in characterizing real grain geometries. By taking advantage of the parametric nature of NURBS, the Lipschitzian dividing rectangle (DIRECT) global optimization procedure is employed as a solution procedure to the closest-point projection problem, which enables the contact treatment of non-convex particles. A contact dynamics (CD) approach to the NURBS-based discrete method is also formulated. By combining particle shape flexibility, properties of implicit time-integration, and non-penetrating constraints, we target applications in which the classical DEM either performs poorly or simply fails, i.e., in granular systems composed of rigid or highly stiff angular particles and subjected to quasistatic or dynamic flow conditions. The CD implementation is made simple by adopting a variational framework, which enables the resulting discrete problem to be readily solved using off-the-shelf mathematical programming solvers. The capabilities of the NURBS-based DEM are demonstrated through 2D numerical examples that highlight the effects of particle morphology on the macroscopic response of granular assemblies under quasistatic and dynamic flow conditions, and a 3D characterization of material response in the shear band of a real triaxial specimen.

This dissertation presents an improvement to the Hybrid Three-Dimensional Imaging System (SITH), which is used to obtain a three-dimensional surface of the relief of a given region from two consecutive aerial photographs of it. Photogrammetry is the science and technology used to obtain reliable information from images acquired by sensors. The improvement to SITH consists of automating the extraction of points through the Scale Invariant Feature Transform (SIFT) technique from the stereoscopic image pairs obtained by metric aerial cameras, and of using cubic spline interpolation techniques to smooth the three-dimensional surfaces it produces, providing a clearer visualization of the details of the studied area and helping to prevent landslides in at-risk locations through adequate urban planning. The computational results show that incorporating these methods into the SITH program produced good results.
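The cubic-spline smoothing step can be sketched with a bicubic smoothing spline over a synthetic elevation grid; the terrain, noise level and smoothing factor are assumptions, not SITH data.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 30)
y = np.linspace(0, 10, 30)
X, Y = np.meshgrid(x, y, indexing="ij")

# Synthetic noisy elevation grid standing in for photogrammetric heights.
z = np.sin(X / 2) * np.cos(Y / 3) + rng.normal(0, 0.05, X.shape)

# Bicubic smoothing spline; s is set near (number of points) * noise variance.
spline = RectBivariateSpline(x, y, z, kx=3, ky=3, s=30 * 30 * 0.05**2)

# Evaluate on a finer grid to get a smooth surface for visualization.
xf = np.linspace(0, 10, 120)
yf = np.linspace(0, 10, 120)
smooth_surface = spline(xf, yf)
```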