910 results for wot, iot, iot-system, digital-twin, framework, least-squares


Relevance: 100.00%

Publisher:

Abstract:

This paper proposes and implements a new methodology for forecasting time series, based on bicorrelations and cross-bicorrelations. It is shown that the forecasting technique arises as a natural extension of, and as a complement to, existing univariate and multivariate non-linearity tests. The formulations are essentially modified autoregressive or vector autoregressive models, respectively, which can be estimated using ordinary least squares. The techniques are applied to a set of high-frequency exchange rate returns, and their out-of-sample forecasting performance is compared to that of other time series models.
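The estimation step described here — an autoregression fitted by ordinary least squares — can be sketched in a few lines. The bicorrelation-based regressors the paper adds are not specified in the abstract, so this is a plain AR(p) baseline with illustrative names:

```python
import numpy as np

def fit_ar_ols(x, p):
    """Fit an AR(p) model x_t = c + a_1 x_{t-1} + ... + a_p x_{t-p} by OLS."""
    X = np.column_stack([np.ones(len(x) - p)] +
                        [x[p - i - 1:len(x) - i - 1] for i in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [c, a_1, ..., a_p]

def forecast_one_step(x, coef):
    """One-step-ahead forecast from the fitted coefficients."""
    p = len(coef) - 1
    return coef[0] + coef[1:] @ x[-1:-p - 1:-1]
```

The same design-matrix construction extends to the vector case by stacking lagged columns of every series.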

Relevance: 100.00%

Publisher:

Abstract:

Radar refractivity retrievals have the potential to accurately capture near-surface humidity fields from the phase change of ground clutter returns. In practice, phase changes are very noisy and the required smoothing will diminish large radial phase change gradients, leading to severe underestimates of large refractivity changes (ΔN). To mitigate this, the mean refractivity change over the field (ΔNfield) must be subtracted prior to smoothing. However, both observations and simulations indicate that highly correlated returns (e.g., when single targets straddle neighboring gates) result in underestimates of ΔNfield when pulse-pair processing is used. This may contribute to reported differences of up to 30 N units between surface observations and retrievals. This effect can be avoided if ΔNfield is estimated using a linear least squares fit to azimuthally averaged phase changes. Nevertheless, subsequent smoothing of the phase changes will still tend to diminish the all-important spatial perturbations in retrieved refractivity relative to ΔNfield; an iterative estimation approach may be required. The uncertainty in the target location within the range gate leads to additional phase noise proportional to ΔN, pulse length, and radar frequency. The use of short pulse lengths is recommended, not only to reduce this noise but to increase both the maximum detectable refractivity change and the number of suitable targets. Retrievals of refractivity fields must allow for large ΔN relative to an earlier reference field. This should be achievable for short pulses at S band, but phase noise due to target motion may prevent this at C band, while at X band even the retrieval of ΔN over shorter periods may at times be impossible.
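The recommended linear least squares estimate of the field-mean refractivity change can be sketched as follows. The two-way phase–refractivity relation Δφ = (4πf/c)·10⁻⁶·ΔN·r and the S-band frequency below are textbook assumptions, not values taken from this abstract:

```python
import numpy as np

C = 2.998e8    # speed of light (m/s)
FREQ = 3.0e9   # assumed S-band radar frequency (Hz)

def delta_n_field(ranges_m, mean_phase_change_rad):
    """Estimate the field-mean refractivity change (N units) from the slope
    of the azimuthally averaged phase change vs. range, via least squares."""
    slope, _ = np.polyfit(ranges_m, mean_phase_change_rad, 1)
    return slope * C / (4 * np.pi * FREQ) * 1e6
```

Because the slope is taken over azimuthally averaged phase changes, single-gate noise and correlated-return biases largely cancel before the fit.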

Relevance: 100.00%

Publisher:

Abstract:

We propose a new class of neurofuzzy construction algorithms with the aim of maximizing generalization capability, specifically for imbalanced data classification problems, based on leave-one-out (LOO) cross validation. The algorithms have two stages: first, an initial rule base is constructed by estimating a Gaussian mixture model with analysis of variance decomposition from the input data; the second stage carries out joint weighted least squares parameter estimation and rule selection using the orthogonal forward subspace selection (OFSS) procedure. We show how different LOO-based rule selection criteria can be incorporated with OFSS, and advocate either maximizing the leave-one-out area under the receiver operating characteristic curve, or maximizing the leave-one-out F-measure if the data sets exhibit an imbalanced class distribution. Extensive comparative simulations illustrate the effectiveness of the proposed algorithms.
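The LOO criteria referred to above are cheap for linear-in-parameters models because leave-one-out residuals have a closed form; this numpy sketch (the helper name is illustrative, not from the paper) computes them without refitting the model n times:

```python
import numpy as np

def loo_errors(X, y):
    """Closed-form leave-one-out residuals for a linear least-squares model:
    e_loo_i = e_i / (1 - h_ii), with H = X (X^T X)^{-1} X^T the hat matrix."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    H = X @ np.linalg.solve(X.T @ X, X.T)
    return resid / (1.0 - np.diag(H))
```

Criteria such as LOO-AUC or LOO F-measure can then be evaluated from these residuals at the cost of a single fit per candidate rule set.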

Relevance: 100.00%

Publisher:

Abstract:

It is widely acknowledged that innovation is one of the pillars of multinational enterprises (MNEs) and that technological knowledge from different host locations is a key factor in the development of MNEs’ competitive advantages. Building on these assumptions, in this paper we aim to understand how the social and relational contexts affect the conventional and reverse transfer of innovation from MNEs’ subsidiaries hosted in emerging markets. We analyzed the social context through the institutional profile (CIP) level and the relational context through trust and integration levels, using a survey sent to 172 foreign subsidiaries located in Brazil, as well as secondary data. Through an ordinary least squares (OLS) regression analysis we found that the relational context affects the conventional and reverse transfer of innovation in subsidiaries hosted in emerging markets. However, we did not find support for an effect of the social context.

Relevance: 100.00%

Publisher:

Abstract:

Purpose – This paper aims to address the gaps in service recovery strategy assessment. An effective service recovery strategy that prevents customer defection after a service failure is a powerful managerial instrument. The literature to date does not present a comprehensive assessment of service recovery strategy. It also lacks a clear picture of the service recovery actions at managers’ disposal in case of failure and the effectiveness of individual strategies on customer outcomes. Design/methodology/approach – Based on service recovery theory, this paper proposes a formative index of service recovery strategy and empirically validates this measure using partial least-squares path modelling with survey data from 437 complainants in the telecommunications industry in Egypt. Findings – The CURE scale (CUstomer REcovery scale) presents evidence of reliability as well as convergent, discriminant and nomological validity. Findings also reveal that problem-solving, speed of response, effort, facilitation and apology are the actions that have an impact on the customer’s satisfaction with service recovery. Practical implications – This new formative index is of potential value in investigating links between strategy and customer evaluations of service by helping managers identify which actions contribute most to changes in the overall service recovery strategy as well as satisfaction with service recovery. Ultimately, the CURE scale facilitates the long-term planning of effective complaint management. Originality/value – This is the first study in the service marketing literature to propose a comprehensive assessment of service recovery strategy and clearly identify the service recovery actions that contribute most to changes in the overall service recovery strategy.

Relevance: 100.00%

Publisher:

Abstract:

The aim of this study was to investigate the effects of numerous milk compositional factors on milk coagulation properties using Partial Least Squares (PLS). Milk from herds of Jersey and Holstein-Friesian cattle was collected across the year and blended (n=55) to maximise variation in composition and coagulation. The milk was analysed for casein, protein, fat, titratable acidity, lactose, Ca2+, urea content, casein micelle size (CMS), fat globule size, somatic cell count and pH. Milk coagulation properties were defined as coagulation time, curd firmness and curd firmness rate, measured by a controlled strain rheometer. The models derived from PLS had higher predictive power than previous models, demonstrating the value of measuring more milk components. In addition to the well-established relationships with casein and protein levels, CMS and fat globule size were found to have a strong impact on all three models. The study also found a positive impact of fat on milk coagulation properties, as well as a strong relationship between lactose and curd firmness, and between urea and curd firmness rate, all of which warrant further investigation given the current lack of knowledge of the underlying mechanisms. These findings demonstrate the importance of using a wider range of milk compositional variables for the prediction of milk coagulation properties, and hence as indicators of milk suitability for cheese making.
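A minimal PLS sketch, assuming the classic single-response NIPALS variant (the abstract does not say which algorithm was used); with as many components as predictors it collapses to ordinary least squares:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal PLS1 (single response) via the NIPALS deflation loop.
    Returns regression coefficients for centred X and y."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    Xk, W, P, q = X.copy(), [], [], []
    for _ in range(n_comp):
        w = Xk.T @ y
        w /= np.linalg.norm(w)          # weight vector
        t = Xk @ w                      # score
        p = Xk.T @ t / (t @ t)          # loading
        q.append(y @ t / (t @ t))       # y-loading
        Xk = Xk - np.outer(t, p)        # deflate X
        W.append(w); P.append(p)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)
```

In practice the number of components is chosen by cross-validation, trading variance captured in the predictors against overfitting.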

Relevance: 100.00%

Publisher:

Abstract:

4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis, as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis: the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised `inner-loop' objective function which, upon convergence, updates the solution of the non-linear `outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter while iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data. The condition number can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that both formulations' sensitivities are related to the error variance balance, the assimilation window length and the correlation length-scales. We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
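The role of the Hessian condition number can be illustrated on a toy quadratic variational objective. The covariances, problem size and B-square-root (control variable transform) preconditioner below are assumptions for the sketch, not the thesis's formulations:

```python
import numpy as np

n = 20
# Assumed toy covariances: correlated background errors B, uncorrelated obs errors R.
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B = 0.9 ** dist                  # exponential background error correlations
Rinv = np.eye(n) / 10.0          # noisy, uncorrelated observations
H = np.eye(n)                    # observe every state component

hess = np.linalg.inv(B) + H.T @ Rinv @ H
# Preconditioning with a square root of B (control variable transform):
L = np.linalg.cholesky(B)
hess_pre = np.eye(n) + L.T @ H.T @ Rinv @ H @ L

cond, cond_pre = np.linalg.cond(hess), np.linalg.cond(hess_pre)
```

Changing the correlation length-scale, the error variance balance or the number of observations changes these condition numbers, which is the kind of sensitivity the bounds in the thesis quantify.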

Relevance: 100.00%

Publisher:

Abstract:

We use sunspot group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups RB above a variable cut-off threshold of observed total whole-spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of RB are then re-scaled to the full observed RGO group number RA using a variety of regression techniques. It is found that a very high correlation between RA and RB (rAB > 0.98) does not prevent large errors in the intercalibration (for example, sunspot maximum values can be over 30% too large even for such levels of rAB). In generating the backbone sunspot number (RBB), Svalgaard and Schatten (2015, this issue) force regression fits to pass through the scatter plot origin, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of quantile-quantile (“Q-Q”) plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method is shown to be different when matching peak and average sunspot group numbers). However, other fits are only reliable if non-linear regression is used. From these results it is entirely possible that the inflation of solar cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
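The hazard of forcing fits through the origin is easy to demonstrate: when the true relation has a non-zero intercept, the forced-origin slope is biased. A minimal synthetic example (the numbers are illustrative, not RGO data):

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(10.0, 100.0, 40)
y = 5.0 + 2.0 * x + rng.normal(0.0, 4.0, x.size)   # true intercept 5, slope 2

slope_free, intercept = np.polyfit(x, y, 1)        # ordinary fit with intercept
slope_origin = (x @ y) / (x @ x)                   # fit forced through the origin
```

Here the forced-origin slope exceeds the free slope because the positive intercept is absorbed into it, which is the same mechanism that exaggerates cycle amplitudes in the intercalibrated series.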

Relevance: 100.00%

Publisher:

Abstract:

We present models for the upper-mantle velocity structure beneath SE and Central Brazil using independent tomographic inversions of P- and S-wave relative arrival-time residuals (including core phases) from teleseismic earthquakes. The events were recorded by a total of 92 stations deployed through different projects, institutions and time periods during the years 1992-2004. Our results show correlations with the main tectonic structures and reveal new anomalies not observed in previous works. All interpretations are based on robust anomalies, which appear in the different inversions for P- and S-waves. The resolution is variable through our study volume and has been analyzed through different theoretical test inversions. High-velocity anomalies are observed in the western portion of the Sao Francisco Craton, supporting the hypothesis that this craton was part of a major Neoproterozoic plate (San Franciscan Plate). Low-velocity anomalies beneath the Tocantins Province (mainly fold belts between the Amazon and Sao Francisco Cratons) are interpreted as due to lithospheric thinning, which is consistent with the good correlation between intraplate seismicity and low-velocity anomalies in this region. Our results show that the basement of the Parana Basin is formed by several blocks, separated by suture zones, in agreement with the model of Milani & Ramos. The slab of the Nazca Plate can be observed as a high-velocity anomaly beneath the Parana Basin, between the depths of 700 and 1200 km. Further, we confirm the low-velocity anomaly in the NE area of the Parana Basin, which has been interpreted by VanDecar et al. as a fossil conduit of the Tristan da Cunha Plume related to the Parana flood basalt eruptions during the opening of the South Atlantic.
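The abstract does not spell out the inversion scheme; a generic damped least-squares solve of the linearised relative travel-time system, of the kind standard in teleseismic tomography, can be sketched as follows (function name and damping value are illustrative):

```python
import numpy as np

def damped_lsq(G, d, lam):
    """Damped (Tikhonov-regularised) least squares for a tomographic system
    G m = d: minimises ||G m - d||^2 + lam ||m||^2 over slowness perturbations m."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ d)
```

The damping parameter trades data fit against model norm; resolution tests of the kind mentioned above probe how well such an inversion recovers known synthetic anomalies.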

Relevance: 100.00%

Publisher:

Abstract:

Sea surface gradients derived from the Geosat and ERS-1 satellite altimetry geodetic missions were integrated with marine gravity data from the National Geophysical Data Center and Brazilian national surveys. Using the least squares collocation method, models of free-air gravity anomaly and geoid height were calculated for the coast of Brazil with a resolution of 2′ × 2′. The integration of satellite and shipborne data showed better statistical results in regions near the coast than using satellite data only, suggesting an improvement when compared to the state-of-the-art global gravity models. Furthermore, these results were obtained with considerably less input information than was used by those reference models. The least squares collocation presented a very low content of high-frequency noise in the predicted gravity anomalies. This may be considered essential to improve the high-resolution representation of the gravity field in regions of ocean-continent transition. (C) 2010 Elsevier Ltd. All rights reserved.
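A minimal least squares collocation predictor, assuming a Gaussian (analytical) covariance model with a nugget term for observation noise — the actual covariance function used for the Brazilian coast models is not given in the abstract:

```python
import numpy as np

def lsc_predict(obs_pts, obs_vals, new_pts, c0=1.0, a=0.2, noise=1e-6):
    """Least-squares collocation: predict the field at new_pts from noisy
    observations, with assumed signal covariance C(r) = c0 exp(-(r/a)^2)."""
    def cov(p, q):
        r2 = np.sum((p[:, None, :] - q[None, :, :]) ** 2, axis=-1)
        return c0 * np.exp(-r2 / a ** 2)
    Cpp = cov(obs_pts, obs_pts) + noise * np.eye(len(obs_pts))  # signal + noise
    Csp = cov(new_pts, obs_pts)                                  # cross-covariance
    return Csp @ np.linalg.solve(Cpp, obs_vals)
```

The same machinery predicts either gravity anomalies or geoid heights by swapping in the appropriate cross-covariance, and the covariance smoothing is what suppresses high-frequency noise in the output.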

Relevance: 100.00%

Publisher:

Abstract:

Roads and topography can determine patterns of land use and distribution of forest cover, particularly in tropical regions. We evaluated how road density, land use, and topography affected forest fragmentation, deforestation and forest regrowth in a Brazilian Atlantic Forest region near the city of Sao Paulo. We mapped roads and land use/land cover for three years (1962, 1981 and 2000) from historical aerial photographs, and summarized the distribution of roads, land use/land cover and topography within a grid of 94 non-overlapping 100 ha squares. We used generalized least squares regression models for data analysis. Our models showed that forest fragmentation and deforestation depended on topography, land use and road density, whereas forest regrowth depended primarily on land use. However, the relationships between these variables and forest dynamics changed in the two studied periods; land use and slope were the strongest predictors from 1962 to 1981, and past (1962) road density and land use were the strongest predictors for the following period (1981-2000). Roads had the strongest relationship with deforestation and forest fragmentation when the expansions of agriculture and buildings were limited to already deforested areas, and when there was a rapid expansion of development, under the influence of Sao Paulo city. Furthermore, the past (1962) road network was more important than the recent road network (1981) when explaining forest dynamics between 1981 and 2000, suggesting a long-term effect of roads. Roads are permanent scars on the landscape and facilitate deforestation and forest fragmentation due to increased accessibility and land valorization, which control land-use and land-cover dynamics. Topography directly affected deforestation, agriculture and road expansion, mainly between 1962 and 1981. Forests are thus in peril where there are more roads, and long-term conservation strategies should consider ways to mitigate roads as permanent landscape features and as drivers and facilitators of deforestation and forest fragmentation. (C) 2009 Elsevier B.V. All rights reserved.
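Generalized least squares differs from OLS by weighting with the residual covariance, which is how correlation between neighbouring landscape squares can be accounted for. A minimal estimator sketch (Sigma here is an assumed covariance, not the paper's fitted structure):

```python
import numpy as np

def gls(X, y, Sigma):
    """Generalized least squares: beta = (X' S^-1 X)^-1 X' S^-1 y,
    where Sigma models the covariance of the residuals."""
    Si_X = np.linalg.solve(Sigma, X)           # Sigma^{-1} X
    return np.linalg.solve(X.T @ Si_X, Si_X.T @ y)
```

With Sigma set to a spatial correlation matrix built from inter-square distances, coefficient standard errors are no longer understated by spatial autocorrelation.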

Relevance: 100.00%

Publisher:

Abstract:

In this paper we deal with robust inference in heteroscedastic measurement error models. Rather than the normal distribution, we postulate a Student t distribution for the observed variables. Maximum likelihood estimates are computed numerically. Consistent estimation of the asymptotic covariance matrices of the maximum likelihood and generalized least squares estimators is also discussed. Three test statistics are proposed for testing hypotheses of interest with the asymptotic chi-square distribution, which guarantees correct asymptotic significance levels. Results of simulations and an application to a real data set are also reported. (C) 2009 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
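Under Student-t errors, maximum likelihood can be computed by EM, which reduces to iteratively reweighted least squares. This homoscedastic regression sketch conveys the idea (the paper's heteroscedastic measurement error model is more elaborate):

```python
import numpy as np

def t_regression(X, y, nu=4.0, iters=50):
    """EM for regression with Student-t errors: alternate weighted least
    squares with weights w_i = (nu + 1) / (nu + r_i^2 / s2)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # start from OLS
    s2 = np.mean((y - X @ beta) ** 2)
    for _ in range(iters):
        r = y - X @ beta
        w = (nu + 1.0) / (nu + r ** 2 / s2)        # heavy tails downweight outliers
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        s2 = np.sum(w * r ** 2) / len(y)
    return beta
```

The heavy t tails make the fit robust: gross outliers receive weights near zero, so the estimate stays close to the trend of the bulk of the data.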

Relevance: 100.00%

Publisher:

Abstract:

Moving-least-squares (MLS) surfaces undergoing large deformations need periodic regeneration of the point set (point-set resampling) so as to keep the point-set density quasi-uniform. Previous work by the authors dealt with algebraic MLS surfaces, and proposed a resampling strategy based on defining the new points at the intersections of the MLS surface with a suitable set of rays. That strategy has very low memory requirements and is easy to parallelize. In this article new resampling strategies with reduced CPU-time cost are explored. The basic idea is to choose as the set of rays the lines of a regular Cartesian grid, and to fully exploit this grid: as data structure for search queries, as spatial structure for traversing the surface in a continuation-like algorithm, and also as approximation grid for an interpolated version of the MLS surface. It is shown that in this way a very simple and compact resampling technique is obtained, which cuts the resampling cost by half with affordable memory requirements.
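For readers unfamiliar with MLS, the pointwise definition is a locally weighted least-squares polynomial fit. A 1-D sketch (the weight width and polynomial degree are illustrative choices):

```python
import numpy as np

def mls_eval(x_eval, pts, vals, h=0.3, degree=2):
    """1-D moving-least-squares evaluation: at each query point, fit a local
    polynomial by Gaussian-weighted least squares and take its value there."""
    out = []
    for x0 in np.atleast_1d(x_eval):
        sw = np.exp(-0.5 * ((pts - x0) / h) ** 2)   # sqrt of Gaussian weights
        V = np.vander(pts - x0, degree + 1)         # shifted monomial basis
        coef, *_ = np.linalg.lstsq(V * sw[:, None], sw * vals, rcond=None)
        out.append(coef[-1])                        # constant term = value at x0
    return np.array(out)
```

Because each evaluation only needs nearby points, a Cartesian grid over the data is a natural search structure — which is precisely what the strategy above exploits.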

Relevance: 100.00%

Publisher:

Abstract:

Partition of Unity Implicits (PUI) has recently been introduced for surface reconstruction from point clouds. In this work, we propose a PUI method that employs a set of well-observed solutions in order to produce geometrically pleasant results without requiring time-consuming or mathematically overloaded computations. One feature of our technique is the use of multivariate orthogonal polynomials in the least-squares approximation, which allows the recursive refinement of the local fittings in terms of the degree of the polynomial. However, since the use of high-order approximations based only on the number of available points is not reliable, we introduce the concept of coverage domain. In addition, the method relies on an algebraically defined triangulation to handle two important tasks in PUI: the spatial decomposition and an adaptive polygonization. As the spatial subdivision is based on tetrahedra, the generated mesh may present poorly shaped triangles, which are improved in this work by means of a specific vertex displacement technique. Furthermore, we also address sharp features and raw data treatment. A further contribution is based on the PUI locality property, which leads to an intuitive scheme for improving or repairing the surface by means of editing local functions.
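The recursive-refinement property mentioned above is what an orthogonal basis buys: raising the polynomial degree adds a coefficient without refitting the earlier ones. A 1-D sketch with polynomials orthogonalised over the sample points (names are illustrative, and the paper works with multivariate bases):

```python
import numpy as np

def orth_poly_coeffs(x, y, max_deg):
    """Least-squares fit in a basis of polynomials orthonormalised over the
    sample points x; each new degree adds one coefficient to the fit."""
    Q, coeffs = [], []
    for d in range(max_deg + 1):
        v = x.astype(float) ** d
        for q in Q:                       # Gram-Schmidt against lower degrees
            v = v - (v @ q) * q
        v = v / np.linalg.norm(v)
        Q.append(v)
        coeffs.append(v @ y)              # projection; earlier coeffs unchanged
    return np.array(coeffs), np.array(Q)
```

Refining the local fit is then a matter of appending coefficients until an error criterion (or the coverage-domain test) says to stop.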

Relevance: 100.00%

Publisher:

Abstract:

Mebendazole (MBZ) is a common benzimidazole anthelmintic that exists in three different polymorphic forms, A, B, and C. Polymorph C is the pharmaceutically preferred form due to its adequate aqueous solubility. No single-crystal structure determination depicting the nature of the crystal packing and the molecular conformation and geometry had previously been performed on this compound. The crystal structure of mebendazole form C is resolved here for the first time. Mebendazole form C crystallizes in a triclinic centrosymmetric space group, and the molecule is practically planar: the atoms of the methyl benzimidazolylcarbamate fragment lie close to their least-squares plane. However, the benzoyl group is twisted by 31(1) degrees from the benzimidazole ring, and the torsion angle between the benzene and carbonyl moieties is 27(1) degrees. These bends and other interesting intramolecular geometry features are viewed as consequences of the intermolecular contacts occurring within the mebendazole C structure. Among these features, a decrease in conjugation through the imine nitrogen atom of the benzimidazole core and a further resonance path crossing the carbamate one are described. Finally, the X-ray powder diffractogram of a form C-rich mebendazole mixture was overlaid on the one calculated from the mebendazole crystal structure. (C) 2008 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 98:2336-2344, 2009
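Least-squares planes of the kind used to discuss the molecule's planarity are conveniently computed with an SVD: the plane normal is the right singular vector with the smallest singular value. A sketch with synthetic coordinates (not mebendazole's atomic positions):

```python
import numpy as np

def lsq_plane(coords):
    """Best-fit (least-squares) plane through a set of 3-D points: returns the
    centroid and the unit normal (smallest-singular-value direction)."""
    centroid = coords.mean(axis=0)
    _, _, vt = np.linalg.svd(coords - centroid)
    return centroid, vt[-1]

def plane_angle_deg(n1, n2):
    """Dihedral-style angle between two planes given their normals."""
    c = abs(n1 @ n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```

Twist angles such as the 31(1)-degree benzoyl/benzimidazole angle quoted above are exactly angles between two such fitted planes.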