920 results for kernel estimators


Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND Kernel brown centres in macadamia are a defect causing internal discolouration of kernels. This study investigates the effect of maintaining high moisture content in macadamia nuts-in-shell, stored at 30°C, 35°C, 40°C and 45°C, on the incidence of brown centres in raw kernels. RESULTS The incidence of brown centres in raw kernels increased with nuts-in-shell storage time and temperature when high moisture content was maintained by sealing in polyethylene bags. Almost all kernels developed the defect when kept at high moisture content for 5 days at 45°C, and 44% developed brown centres after only 2 days of storage at high moisture content at 45°C, compared with only 0.76% when stored for 2 days at 45°C but allowed to dry in open-mesh bags. At storage temperatures below 45°C there were fewer brown centres, but there were still significant differences between kernels stored at high moisture content and those allowed to dry (P < 0.05). CONCLUSION Maintaining high moisture content during macadamia nuts-in-shell storage increases the incidence of brown centres in raw kernels, and the defect increases with time and temperature. On-farm nuts-in-shell drying and storage practices should remove moisture rapidly to reduce losses. Ideally, nuts-in-shell should not be stored at high moisture content on-farm at temperatures over 30°C. © 2013 Society of Chemical Industry


Terrain traversability estimation is a fundamental requirement to ensure the safety of autonomous planetary rovers and their ability to conduct long-term missions. This paper addresses two fundamental challenges for terrain traversability estimation techniques. First, representations of terrain data, which are typically built by the rover’s onboard exteroceptive sensors, are often incomplete due to occlusions and sensor limitations. Second, during terrain traversal, the rover-terrain interaction can cause terrain deformation, which may significantly alter the difficulty of traversal. We propose a novel approach built on Gaussian process (GP) regression to learn, and consequently to predict, the rover’s attitude and chassis configuration on unstructured terrain using terrain geometry information only. First, given incomplete terrain data, we make an initial prediction under the assumption that the terrain is rigid, using a learnt kernel function. Then, we refine this initial estimate to account for the effects of potential terrain deformation, using a near-to-far learning approach based on multitask GP regression. We present an extensive experimental validation of the proposed approach on terrain that is mostly rocky and whose geometry changes as a result of loads from rover traversals. This demonstrates the ability of the proposed approach to accurately predict the rover’s attitude and configuration in partially occluded and deformable terrain.
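The GP machinery this abstract builds on can be sketched in a few lines. The following is a generic GP regressor on synthetic data, not the paper's learnt kernel or its near-to-far multitask refinement; the features, targets, and hyperparameters are illustrative assumptions only.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.5, variance=1.0):
    """Squared-exponential kernel; the paper instead learns its kernel
    from rover traversal data."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X, y, X_new, noise=1e-2):
    """Standard GP regression: posterior mean and standard deviation."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf_kernel(X_new, X)
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = rbf_kernel(X_new, X_new).diagonal() - (v**2).sum(0)
    return mean, np.sqrt(np.maximum(var, 0.0))

# Toy stand-in: predict a rover attitude angle (e.g. pitch) from two
# terrain-geometry features. The predictive std growing far from the data
# is what flags occluded, poorly observed terrain cells.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1] + 0.05 * rng.standard_normal(200)
mean, std = gp_predict(X, y, rng.uniform(-1, 1, size=(5, 2)))
print(mean.shape, std.shape)
```

The Cholesky-based solve is the standard numerically stable way to invert the kernel matrix; the paper's contribution lies in what kernel is used and how deformation is handled, not in this basic machinery.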


Age-related macular degeneration (AMD) is the leading cause of blindness in the developed world. Increasing dietary intake of lutein- and zeaxanthin-rich foods is a potential means of preventing, or at least slowing, the progression of AMD. Zeaxanthin levels in tropical super-sweetcorn were increased from 1.1 to 11.9 µg/g FW through conventional breeding and selection, associated with both an increase in the proportion of zeaxanthin relative to other carotenoids and a general increase in carotenoid synthesis. Increasing zeaxanthin was associated with a colour shift from traditional 'canary-yellow' kernels to a golden-orange colour. Kernel colour was most closely correlated (r² = 69%) with an increase in beta-arm carotenoid concentration. Consumer analysis revealed that, prior to any knowledge of the zeaxanthin-related health benefit, consumers would readily purchase both yellow and gold cobs; once the health benefit was explained, this extended to deep-gold cobs. The colour difference between regular yellow sweetcorn and high-zeaxanthin sweetcorn could potentially be used as a visual means of differentiating high-zeaxanthin sweetcorn in the marketplace.


The effects of plant growth conditions on concentrations of proteins, including allergens, in peanut (Arachis hypogaea L.) kernels are largely unknown. Peanuts (cv. Walter) were grown at five sites (Taabinga, Redvale, Childers, Bundaberg, and Kairi) covering three commercial growing regions in Queensland, Australia. Differences in temperature, rainfall, and solar radiation during the growing season were evaluated. Kernel yield varied from 2.3 t/ha (Kairi) to 3.9 t/ha (Childers), probably due to differences in solar radiation. Crude protein appeared to vary only between Kairi and Childers, whereas Ara h 1 and 2 concentrations were similar in all locations. 2D-DIGE revealed significant differences in spot volumes for only two minor protein spots from peanuts grown in the five locations. Western blotting using peanut-allergic serum revealed no qualitative differences in recognition of antigens. It was concluded that peanuts grown in different growing regions in Queensland, Australia, had similar protein compositions and therefore were unlikely to show differences in allergenicity.


Zeaxanthin and its isomer lutein are the major carotenoids contributing to the characteristic colour of yellow sweet-corn. From a human health perspective, these two carotenoids are also specifically accumulated in the human macula, and are thought to protect the photoreceptor cells of the eye from blue-light oxidative damage and to improve visual acuity. As humans cannot synthesise these compounds, they must be obtained from dietary sources containing zeaxanthin and lutein. In comparison with most dietary sources, yellow sweet-corn (Zea mays var. rugosa) is a particularly good source of zeaxanthin, although its zeaxanthin concentration is still fairly low in comparison with what is considered a supplementary dose for improving macular pigment concentration (2 mg/person/day). In the present project, we have increased the zeaxanthin concentration in sweet-corn kernels from 0.2–0.3 mg/100 g FW to greater than 2.0 mg/100 g FW at sweet-corn eating stage, substantially reducing the amount of corn required to provide the same dosage of zeaxanthin. This was achieved by altering the carotenoid synthesis pathway to more than double total carotenoid synthesis and to redirect synthesis towards the beta-arm of the pathway, where zeaxanthin is synthesised. This resulted in a proportional increase of zeaxanthin from 22% to 70% of the total carotenoid present. As kernels increase in physiological maturity, carotenoid concentration also significantly increases, mainly due to increased synthesis but also due to a decline in the moisture content of the kernels. When fully mature, dried kernels can reach zeaxanthin and carotene concentrations of 8.7 mg/100 g and 2.6 mg/100 g, respectively. Although kernels continue to increase in zeaxanthin when harvested past their normal harvest maturity stage, the texture of these 'over-mature' kernels is tough, making them less appealing for fresh consumption.
The increase in zeaxanthin concentration and in other orange carotenoids such as β-carotene also results in a decline in the kernel hue angle of fresh sweet-corn from approximately 90° (yellow) to as low as 75° (orange-yellow). This enables high-zeaxanthin sweet-corn to be visually distinguished from standard yellow sweet-corn, which is predominantly pigmented by lutein.


The monograph dissertation deals with kernel integral operators and their mapping properties on Euclidean domains. The associated kernels are weakly singular; examples are given by Green functions of certain elliptic partial differential equations. It is well known that mapping properties of the corresponding Green operators can be used to deduce a priori estimates for the solutions of these equations. In the dissertation, natural size and cancellation conditions are quantified for kernels defined in domains. These kernels induce integral operators, which are then composed with any partial differential operator of prescribed order, depending on the size of the kernel. The main object of study being the boundedness properties of such compositions, the main result is the characterization of their Lp-boundedness on suitably regular domains. When the aforementioned kernels are defined on the whole Euclidean space, their partial derivatives of prescribed order turn out to be so-called standard kernels, which arise in connection with singular integral operators. The Lp-boundedness of singular integrals is characterized by the T1 theorem, originally due to David and Journé and published in 1984 (Ann. of Math. 120). The main result of the dissertation can be interpreted as a T1 theorem for weakly singular integral operators. The dissertation also deals with special convolution-type weakly singular integral operators defined on Euclidean spaces.


The phosphine distribution in a cylindrical silo containing grain is predicted. A three-dimensional mathematical model, which accounts for multicomponent gas phase transport and the sorption of phosphine into the grain kernel, is developed. In addition, a simple model is presented to describe the death of insects within the grain as a function of their exposure to phosphine gas. The proposed model is solved using the commercially available computational fluid dynamics (CFD) software, FLUENT, together with our own C code to customize the solver in order to incorporate the models for sorption and insect extinction. Two types of fumigation delivery are studied, namely, fan-forced from the base of the silo and tablet from the top of the silo. An analysis of the predicted phosphine distribution shows that during fan-forced fumigation, the position of the leaky area is very important to the development of the gas flow field and the phosphine distribution in the silo. If the leak is in the lower section of the silo, insects that exist near the top of the silo may not be eradicated. However, the position of a leak does not affect phosphine distribution during tablet fumigation. For such fumigation in a typical silo configuration, phosphine concentrations remain low near the base of the silo. Furthermore, we find that half-life pressure test readings are not an indicator of phosphine distribution during tablet fumigation.
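The two sub-models can be caricatured in a well-mixed, zero-dimensional setting. This sketch is not the paper's 3-D CFD model: it assumes first-order sorption of phosphine into the grain and an insect death rate proportional to instantaneous concentration, and every rate constant below is an illustrative placeholder, not a fitted value.

```python
import numpy as np

# Illustrative rate constants (assumptions, not values from the paper).
k_sorb = 0.05        # 1/h: first-order sorption of phosphine into grain
k_kill = 0.1         # L/(mg*h): assumed insect mortality coefficient
C, S, Ct = 1.5, 1.0, 0.0   # gas conc. (mg/L), insect survival, Ct product
dt, hours = 0.1, 120.0
for _ in range(int(hours / dt)):
    Ct += C * dt                 # accumulated exposure (concentration-time)
    C -= k_sorb * C * dt         # sorption depletes the fumigant
    S -= k_kill * C * S * dt     # death rate proportional to concentration
print(round(C, 4), round(S, 4), round(Ct, 2))
```

Even this caricature reproduces the qualitative trade-off in the abstract: sorption steadily removes fumigant, so insect kill depends on accumulating enough exposure before the concentration decays away.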


Cyclostationary analysis has proven effective in identifying signal components for diagnostic purposes. A key descriptor in this framework is the cyclic power spectrum, traditionally estimated by the averaged cyclic periodogram and the smoothed cyclic periodogram. A lengthy debate about the best estimator was finally settled in a cornerstone work by Antoni, who proposed a unified form for the two families, thus allowing a detailed statistical study of their properties. Since then, the focus of cyclostationary research has shifted towards algorithms, in terms of computational efficiency and simplicity of implementation. Traditional algorithms have proven computationally inefficient, and the sophisticated "cyclostationary" formulation of these estimators slowed their spread in industry. The only attempt to increase the computational efficiency of cyclostationary estimators is represented by the cyclic modulation spectrum. This indicator exploits the relationship between cyclostationarity and envelope analysis. The link with envelope analysis allows a leap in computational efficiency and provides a "way in" for understanding by industrial engineers. However, the new estimator lies outside the unified form described above, and an unbiased version of the indicator has not been proposed. This paper will therefore extend the analysis of envelope-based estimators of the cyclic spectrum, proposing a new approach to include them in the unified form of cyclostationary estimators. This will enable the definition of a new envelope-based algorithm and a detailed analysis of the properties of the cyclic modulation spectrum. The computational efficiency of envelope-based algorithms will also be discussed quantitatively, for the first time, in comparison with the averaged cyclic periodogram. Finally, the algorithms will be validated with numerical and experimental examples.
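The envelope route to cyclic content can be sketched as: take an STFT, square its magnitude to get band-wise envelopes, then take a DFT along time. This shows only the basic cyclic-modulation-spectrum idea on a synthetic amplitude-modulated signal, not the unified or bias-corrected estimators developed in the paper; the window, hop, and signal parameters are arbitrary choices.

```python
import numpy as np

def cyclic_modulation_spectrum(x, fs, nperseg=256):
    """Envelope-based sketch of cyclic content: STFT -> squared magnitude
    (band-wise envelope) -> DFT along the time axis."""
    hop = nperseg // 2
    win = np.hanning(nperseg)
    frames = np.array([x[i:i + nperseg] * win
                       for i in range(0, len(x) - nperseg + 1, hop)])
    stft = np.fft.rfft(frames, axis=1)            # (time, carrier freq)
    env = np.abs(stft) ** 2                       # band-wise envelopes
    cms = np.fft.rfft(env - env.mean(0), axis=0)  # DFT over time -> cyclic freq
    alpha = np.fft.rfftfreq(env.shape[0], d=hop / fs)  # cyclic frequencies (Hz)
    return alpha, np.abs(cms)

# Amplitude-modulated noise: broadband carrier modulated at 10 Hz.
fs = 4096
t = np.arange(8 * fs) / fs
x = (1 + 0.8 * np.sin(2 * np.pi * 10 * t)) \
    * np.random.default_rng(1).standard_normal(len(t))
alpha, cms = cyclic_modulation_spectrum(x, fs)
peak = alpha[1:][np.argmax(cms[1:].sum(axis=1))]  # skip alpha = 0
print(round(float(peak), 2))
```

Summing the cyclic content across carrier bands recovers the 10 Hz modulation frequency, which is exactly the computational shortcut (one FFT per band instead of full cyclic periodograms) that the abstract credits to the envelope link.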


Emerging embedded applications are based on evolving standards (e.g., MPEG2/4, H.264/265, IEEE802.11a/b/g/n). Since most of these applications run on handheld devices, there is an increasing need for a single-chip solution that can dynamically interoperate between different standards and their derivatives. In order to achieve high resource utilization and low power dissipation, we propose REDEFINE, a polymorphic ASIC in which specialized hardware units are replaced with basic hardware units that can create the same functionality by runtime re-composition. It is a "future-proof" custom hardware solution for multiple applications and their derivatives in a domain. In this article, we describe a compiler framework and supporting hardware comprising compute, storage, and communication resources. Applications described in a high-level language (e.g., C) are compiled into application substructures. For each application substructure, a set of compute elements (CEs) on the hardware are interconnected during runtime to form a pattern that closely matches the communication pattern of that particular application. The advantage is that the composed CEs are neither processor cores nor logic elements as in FPGAs. Hence, REDEFINE offers the power and performance advantage of an ASIC together with the hardware reconfigurability and programmability of an FPGA or instruction-set processor. In addition, the hardware supports custom instruction pipelining. Existing instruction-set extensible processors determine a sequence of instructions that repeatedly occurs within the application to create custom instructions at design time to speed up the execution of this sequence. We extend this scheme further: a kernel is compiled into custom instructions that bear a strong producer-consumer relationship (and are not limited to frequently occurring sequences of instructions).
Custom instructions, realized as hardware compositions effected at runtime, allow several instances of the same composition to be active in parallel. A key distinguishing factor in the majority of emerging embedded applications is stream processing. To reduce the overheads of data transfer between custom instructions, direct communication paths are employed among them. In this article, we present an overview of the hardware-aware compiler framework, which determines the NoC-aware schedule of transports of the data exchanged between the custom instructions on the interconnect. The results for the FFT kernel indicate a 25% reduction in the number of loads/stores, and throughput improves by a factor of log(n) for an n-point FFT when compared to a sequential implementation. Overall, REDEFINE offers flexibility and runtime reconfigurability at the expense of 1.16x in power and 8x in area when compared to an ASIC. The REDEFINE implementation consumes 0.1x the power of an FPGA implementation. In addition, the configuration overhead of the FPGA implementation is 1,000x more than that of REDEFINE.


The problem of unsupervised anomaly detection arises in a wide variety of practical applications. While one-class support vector machines have demonstrated their effectiveness as an anomaly detection technique, their ability to model large datasets is limited by the memory and time complexity of training. To address this issue for supervised learning of kernel machines, there has been growing interest in random projection methods as an alternative to the computationally expensive problems of kernel matrix construction and support vector optimisation. In this paper we leverage the theory of nonlinear random projections and propose the Randomised One-class SVM (R1SVM), an efficient and scalable anomaly detection technique that can be trained on large-scale datasets. Our empirical analysis on several real-life and synthetic datasets shows that our randomised 1SVM algorithm achieves accuracy comparable to or better than deep autoencoders and traditional kernelised approaches for anomaly detection, while being approximately 100 times faster in training and testing.
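The randomised-projection idea can be sketched with random Fourier features standing in for the kernel map. This is not the paper's R1SVM (which trains a linear one-class SVM on the projected features); for a dependency-free illustration, anomalies are scored here by distance to the training centroid in the randomised feature space, and all data and hyperparameters are synthetic assumptions.

```python
import numpy as np

def rff(X, D=200, gamma=0.1, seed=0):
    """Random Fourier features approximating the RBF kernel
    exp(-gamma * ||x - y||^2) (Rahimi & Recht)."""
    r = np.random.default_rng(seed)
    W = r.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))
    b = r.uniform(0, 2 * np.pi, D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X_train = rng.standard_normal((1000, 5))                  # nominal data
X_test = np.vstack([rng.standard_normal((50, 5)),         # nominal
                    rng.standard_normal((50, 5)) + 6.0])  # anomalies
Z = rff(X_train)
mu = Z.mean(0)                                  # centroid in feature space
thresh = np.quantile(np.linalg.norm(Z - mu, axis=1), 0.95)
scores = np.linalg.norm(rff(X_test) - mu, axis=1)
is_anomaly = scores > thresh
print(is_anomaly[:50].mean(), is_anomaly[50:].mean())
```

The point being illustrated is the complexity argument from the abstract: everything after the projection is linear in the number of samples, with no kernel matrix ever formed.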


Many conventional statistical machine learning algorithms generalise poorly if distribution bias exists in the datasets. For example, distribution bias arises in the context of domain generalisation, where knowledge acquired from multiple source domains needs to be used in previously unseen target domains. We propose Elliptical Summary Randomisation (ESRand), an efficient domain generalisation approach that comprises a randomised kernel and elliptical data summarisation. ESRand learns a domain-interdependent projection to a latent subspace that minimises the existing bias in the data while maintaining the functional relationship between domains. In the latent subspace, ellipsoidal summaries replace the samples to enhance generalisation by further removing bias and noise from the data. Moreover, the summarisation enables large-scale data processing by significantly reducing the size of the data. Through comprehensive analysis, we show that our subspace-based approach outperforms state-of-the-art results on several activity recognition benchmark datasets, while keeping the computational complexity significantly low.


This paper compares the closed-loop performance of seeker-based and radar-based estimators for surface-to-air interception through 6-degree-of-freedom simulation using proportional navigation guidance. Ground radar measurements are evader range, azimuth, and elevation angles contaminated by Gaussian noise. Onboard seeker measurements are pursuer-evader relative range and range rate, also contaminated by Gaussian noise; the gimbal angles and line-of-sight rates in the gimbal frame, contaminated by time-correlated non-Gaussian noise with realistic numerical values, are also available as measurements. In both applications, extended Kalman filters with a Gaussian noise assumption are used for state estimation. For a typical engagement it is found, based on Monte Carlo studies, that the seeker estimator outperforms the radar estimator in terms of autopilot demand and reduces the miss distance. Thus, a seeker estimator with a white Gaussian assumption is found to be adequate to handle the measurements even in the presence of non-Gaussian correlated noise. This paper uses realistic numerical values for all noise parameters.
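The estimator structure common to both configurations is an extended Kalman filter: linear state propagation with a nonlinear measurement model linearised at the current estimate. The sketch below is a generic 2-D range-bearing EKF on synthetic data, far simpler than the paper's 6-DOF setting with gimbal dynamics and correlated seeker noise; all models and noise values are illustrative assumptions.

```python
import numpy as np

dt = 0.1
F = np.eye(4); F[0, 2] = F[1, 3] = dt            # constant-velocity model
Q = 1e-3 * np.eye(4)                             # small process noise
R = np.diag([0.5**2, np.deg2rad(1.0)**2])        # range, bearing noise

def h(x):                                        # nonlinear measurement model
    return np.array([np.hypot(x[0], x[1]), np.arctan2(x[1], x[0])])

def H_jac(x):                                    # Jacobian of h
    px, py = x[0], x[1]; r2 = px**2 + py**2; r = np.sqrt(r2)
    return np.array([[px / r, py / r, 0, 0],
                     [-py / r2, px / r2, 0, 0]])

rng = np.random.default_rng(2)
x_true = np.array([50.0, 30.0, -2.0, 1.0])       # [px, py, vx, vy]
x_est = np.array([45.0, 35.0, 0.0, 0.0])
P = 10.0 * np.eye(4)
for _ in range(200):
    x_true = F @ x_true
    z = h(x_true) + rng.multivariate_normal(np.zeros(2), R)
    x_est = F @ x_est; P = F @ P @ F.T + Q       # predict
    Hj = H_jac(x_est)
    y = z - h(x_est)
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing residual
    S = Hj @ P @ Hj.T + R
    K = P @ Hj.T @ np.linalg.inv(S)              # Kalman gain
    x_est = x_est + K @ y
    P = (np.eye(4) - K @ Hj) @ P                 # update covariance
print(np.round(np.abs(x_est[:2] - x_true[:2]), 2))
```

The paper's comparison is essentially about which measurement set (radar angles vs. seeker range/rate/gimbal data) feeds a filter of this shape, and how robust the Gaussian assumption in R is to correlated non-Gaussian seeker noise.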


We derive a very general expression for the survival probability and the first passage time distribution of a particle executing Brownian motion in full phase space with an absorbing boundary condition at a point in position space, valid irrespective of the statistical nature of the dynamics. The expression, together with Jensen's inequality, naturally leads to a lower bound on the actual survival probability and an approximate first passage time distribution. These are expressed in terms of the position-position, velocity-velocity, and position-velocity variances. Knowledge of these variances enables one to compute a lower bound on the survival probability and consequently the first passage time distribution. As examples, we compute these for a Gaussian Markovian process and, in the non-Markovian case, for an exponentially decaying friction kernel and for a power-law friction kernel. Our analysis shows that the survival probability decays exponentially at long times, irrespective of the nature of the dynamics, with an exponent equal to the transition state rate constant.
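The setting can be illustrated by direct Monte Carlo: an underdamped Langevin particle (Brownian motion in full phase space) in a harmonic well, absorbed on first crossing of a point x_b. This is only a numerical caricature with arbitrary parameters, not the paper's analytic bound; it merely exhibits a decaying survival probability of the kind the bound controls.

```python
import numpy as np

# Illustrative parameters (assumptions): harmonic well, unit friction and
# temperature, absorbing point at x_b = 2.
gamma, omega2, kT = 1.0, 1.0, 1.0
x_b, dt, n_steps, n_part = 2.0, 0.01, 3000, 2000
rng = np.random.default_rng(3)
x = np.zeros(n_part)
v = np.zeros(n_part)
alive = np.ones(n_part, dtype=bool)
survival = []
for _ in range(n_steps):
    noise = rng.standard_normal(n_part) * np.sqrt(2 * gamma * kT * dt)
    v += (-gamma * v - omega2 * x) * dt + noise  # full phase-space dynamics
    x += v * dt
    alive &= x < x_b                             # absorb on first crossing
    survival.append(float(alive.mean()))
print(survival[-1])                              # surviving fraction at t = 30
```

Plotting log(survival) against time for such a run shows the late-time behaviour approaching a straight line, consistent with the exponential decay claimed in the abstract.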


Dynamic systems involving convolution integrals with decaying kernels, of which fractionally damped systems form a special case, are non-local in time and hence infinite dimensional. Straightforward numerical solution of such systems up to time t needs O(t²) computations owing to the repeated evaluation of integrals over intervals that grow like t. Finite-dimensional and local approximations are thus desirable. We present here an approximation method which first rewrites the evolution equation as a coupled infinite-dimensional system with no convolution, and then uses Galerkin approximation with finite elements to obtain linear, finite-dimensional, constant-coefficient approximations for the convolution. This paper is a broad generalization, based on a new insight, of our prior work with fractional-order derivatives (Singh & Chatterjee 2006 Nonlinear Dyn. 45, 183-206). In particular, the decaying kernels we can address are now generalized to the Laplace transforms of known functions; of these, the power-law kernel of fractional-order differentiation is a special case. The approximation can be refined easily. The local nature of the approximation allows numerical solution up to time t with O(t) computations. Examples with several different kernels show excellent performance. A key feature of our approach is that the dynamic system in which the convolution integral appears is itself approximated by another system, as distinct from numerically approximating just the solution for given initial values; this allows non-standard uses of the approximation, e.g. in stability analyses.
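For the simplest decaying kernel, k(t) = exp(-λt), the contrast between non-local and local evaluation is easy to demonstrate. The sketch below compares direct O(N²) quadrature of the convolution with the equivalent local ODE y' = -λy + x; the paper's contribution is, roughly, a Galerkin construction of such finite local systems for far more general kernels (Laplace transforms of known functions). The discretisation and forcing signal here are arbitrary choices.

```python
import numpy as np

lam, dt, N = 2.0, 0.001, 2000
t = np.arange(N) * dt
x = np.sin(3 * t)                       # arbitrary forcing signal

# Direct quadrature of y(t) = int_0^t exp(-lam*(t - s)) x(s) ds: each step
# re-integrates over the whole growing history, so total cost is O(N^2).
k = np.exp(-lam * t)
y_direct = np.array([dt * np.sum(k[:n + 1][::-1] * x[:n + 1])
                     for n in range(N)])

# Equivalent local form y' = -lam*y + x (explicit Euler): O(N) total,
# because the exponential kernel needs only one auxiliary state variable.
y_local = np.zeros(N)
for n in range(N - 1):
    y_local[n + 1] = y_local[n] + dt * (-lam * y_local[n] + x[n])

print(float(np.max(np.abs(y_direct - y_local))))  # both are O(dt) schemes
```

A general decaying kernel is not a single exponential, but approximating it by a finite sum of exponentials (each contributing one auxiliary state) recovers the same O(t) cost; that is the spirit of the finite-dimensional Galerkin approximation described above.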


In this paper we propose a novel family of kernels for multivariate time-series classification problems. Each time-series is approximated by a linear combination of piecewise polynomial functions in a Reproducing Kernel Hilbert Space via a novel kernel interpolation technique. Using the associated kernel function, a large-margin classification formulation is proposed which can discriminate between two classes. The formulation leads to kernels between two multivariate time-series which can be efficiently computed. The kernels have been successfully applied to writer-independent handwritten character recognition.