930 results for Irreducible polynomial


Relevance:

10.00%

Publisher:

Abstract:

The relationship between temperature and mortality is non-linear, and the effect estimates depend on the threshold temperatures selected. However, little is known about whether threshold temperatures differ with age or cause of death in the Southern Hemisphere. We fitted polynomial distributed lag non-linear models to assess the threshold temperatures for mortality at all ages (Dall), for those aged 15–64 (D15-64), 65–84 (D65-84) and ≥85 years (D85+), and for deaths from respiratory (RD) and cardiovascular diseases (CVD) in Brisbane, Australia, 1996–2004. We examined both hot and cold thresholds, with lags of up to 15 days for cold effects and 3 days for hot effects. Results show that for the current day, the cold threshold was 20°C and the hot threshold was 28°C for the Dall, D15-64 and D85+ groups. The cold threshold was higher (23°C) for the D65-84 group and lower (21°C) for the CVD group. The hot threshold was higher (29°C) for the D65-84 group and lower (27°C) for the RD group. Compared to the current day, for cold effects at lags of up to 15 days, the threshold was lower for the D15-64 group and higher for the D65-84, D85+, RD and CVD groups; for hot effects at 3-day lags, the threshold was higher for the D15-64 group and lower for the D65-84 and RD groups. Temperature thresholds thus appeared to differ with age and death category. The elderly, and deaths from RD and CVD, were more sensitive to temperature stress than the adult group. These findings may have implications for the assessment of temperature-related mortality and the development of weather/health warning systems.
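
To make the threshold idea concrete, here is a minimal piecewise-linear sketch in Python (not the authors' distributed lag non-linear model; data and slopes are hypothetical), using the reported current-day thresholds of 20°C and 28°C:

    import numpy as np

    # Hypothetical daily mean temperatures and death counts (illustrative only).
    rng = np.random.default_rng(0)
    temp = rng.uniform(10, 35, size=3000)

    COLD_THR, HOT_THR = 20.0, 28.0  # current-day thresholds reported for Dall

    # Piecewise-linear basis: flat between the thresholds, rising linearly
    # below the cold threshold and above the hot threshold.
    X = np.column_stack([
        np.ones_like(temp),
        np.maximum(COLD_THR - temp, 0.0),  # cold excess
        np.maximum(temp - HOT_THR, 0.0),   # heat excess
    ])
    deaths = 50 + 1.2 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 3, temp.size)

    coef, *_ = np.linalg.lstsq(X, deaths, rcond=None)
    print("baseline, cold slope, hot slope:", coef.round(2))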

Relevance:

10.00%

Publisher:

Abstract:

Information and communications technologies are a significant component of the healthcare domain, and electronic health records play a major role in it. It is therefore important that they are accepted en masse by healthcare professionals. How healthcare professionals perceive the usefulness of electronic health records, and their attitudes towards them, have been shown to have significant effects on overall acceptance in many healthcare systems around the world. This paper investigates the role of perceived usefulness and attitude in the intention to use electronic health records by future healthcare professionals, using polynomial regression with response surface analysis. Results show that the relationships between these variables are more complex than predicted in prior research. The paper concludes that the properties of the above determinants must be further investigated to clearly understand (i) their role in predicting the intention to use electronic health records, and (ii) how to design systems that are better adopted by healthcare professionals of the future.
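
As a hedged illustration of polynomial regression with response surface analysis (variable names and data are hypothetical, not the paper's instrument), a second-order model of intention on perceived usefulness and attitude can be fitted and its congruence-line features read off as follows:

    import numpy as np

    # Hypothetical Likert-scale scores: PU = perceived usefulness,
    # ATT = attitude, INT = intention to use EHRs (illustrative data only).
    rng = np.random.default_rng(1)
    pu = rng.uniform(1, 7, 200)
    att = rng.uniform(1, 7, 200)
    intent = (1 + 0.5 * pu + 0.4 * att + 0.1 * pu * att - 0.05 * pu ** 2
              + rng.normal(0, 0.3, 200))

    # Second-order polynomial model underlying response surface analysis:
    # INT = b0 + b1*PU + b2*ATT + b3*PU^2 + b4*PU*ATT + b5*ATT^2
    X = np.column_stack([np.ones_like(pu), pu, att, pu ** 2, pu * att, att ** 2])
    b, *_ = np.linalg.lstsq(X, intent, rcond=None)

    # Surface features along the PU = ATT (congruence) line:
    print("slope:", round(b[1] + b[2], 3))             # b1 + b2
    print("curvature:", round(b[3] + b[4] + b[5], 3))  # b3 + b4 + b5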

Relevance:

10.00%

Publisher:

Abstract:

We first classify the state-of-the-art stream authentication schemes for the multicast environment, grouping them into signing and MAC approaches. A new approach for authenticating digital streams using threshold techniques is then introduced. The main advantages of the new approach are its tolerance of packet loss, up to a threshold number, and its minimal space overhead. It is most suitable for multicast applications running over lossy, unreliable communication channels while, at the same time, meeting the security requirements. We use linear equations based on Lagrange polynomial interpolation, together with combinatorial design methods.
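
The threshold mechanism rests on standard Lagrange interpolation over a finite field: any k of n polynomial shares reconstruct the protected value, so the loss of up to n - k packets is tolerated. A minimal sketch (field size and packet layout assumed, not taken from the paper):

    import random

    P = 2 ** 31 - 1   # Mersenne prime; all polynomial arithmetic is mod P

    def make_shares(secret, k, n):
        """Hide `secret` in a random degree-(k-1) polynomial f; share i = (i, f(i)).
        Any k of the n shares reconstruct the secret; fewer reveal nothing."""
        coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
        return [(x, sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def recover(shares):
        """Lagrange interpolation at x = 0 from any k surviving shares."""
        secret = 0
        for xi, yi in shares:
            num = den = 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, P - 2, P)) % P
        return secret

    # Split an authentication value over n = 6 packets so that the loss of
    # any 3 packets (up to the threshold) can be tolerated.
    shares = make_shares(123456789, k=3, n=6)
    print(recover(random.sample(shares, 3)))   # -> 123456789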

Relevance:

10.00%

Publisher:

Abstract:

Several recently proposed ciphers, for example Rijndael and Serpent, are built with layers of small S-boxes interconnected by linear key-dependent layers. Their security relies on the fact that the classical methods of cryptanalysis (e.g. linear or differential attacks) are based on probabilistic characteristics, which makes their security grow exponentially with the number of rounds N_r. In this paper we study the security of such ciphers under an additional hypothesis: the S-box can be described by an overdefined system of algebraic equations (true with probability 1). We show that this is true for both Serpent (due to the small size of its S-boxes) and Rijndael (due to unexpected algebraic properties). We study general methods known for solving overdefined systems of equations, such as XL from Eurocrypt '00, and show their inefficiency. We then introduce a new method called XSL that uses the sparsity of the equations and their specific structure. The XSL attack uses only relations true with probability 1, so the security does not have to grow exponentially in the number of rounds. XSL has a parameter P, and from our estimations it seems that P should be a constant, or grow very slowly with the number of rounds. The XSL attack would then be polynomial (or subexponential) in N_r, with a huge constant that is double-exponential in the size of the S-box. The exact complexity of such attacks is not known due to the redundant equations. Though the presented version of the XSL attack always costs more than exhaustive search for Rijndael, it seems to (marginally) break 256-bit Serpent. We suggest a new criterion for the design of S-boxes in block ciphers: they should not be describable by a system of polynomial equations that is too small or too overdefined.
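
The central hypothesis — that a small S-box satisfies an overdefined system of quadratic equations holding with probability 1 — can be checked by linear algebra over GF(2). A toy sketch with a hypothetical 3-bit S-box (Serpent uses 4-bit and Rijndael 8-bit S-boxes):

    import numpy as np
    from itertools import combinations

    SBOX = [3, 6, 1, 7, 0, 5, 2, 4]   # hypothetical 3-bit S-box (a permutation)
    n = 3

    def bits(v):
        return [(v >> i) & 1 for i in range(n)]

    # Evaluation matrix: one row per input, one column per monomial of
    # degree <= 2 in the input bits x and output bits y of the S-box.
    rows = []
    for x in range(2 ** n):
        v = bits(x) + bits(SBOX[x])               # [x0, x1, x2, y0, y1, y2]
        rows.append([1] + v + [a & b for a, b in combinations(v, 2)])
    M = np.array(rows, dtype=np.uint8)            # 8 rows x 22 monomials

    def gf2_nullspace(M):
        """Kernel basis of M over GF(2): each vector is a quadratic equation
        that the S-box satisfies with probability 1."""
        A = M.copy()
        nrows, ncols = A.shape
        pivots, r = [], 0
        for c in range(ncols):
            hit = next((i for i in range(r, nrows) if A[i, c]), None)
            if hit is None:
                continue
            A[[r, hit]] = A[[hit, r]]
            for i in range(nrows):
                if i != r and A[i, c]:
                    A[i] ^= A[r]
            pivots.append(c)
            r += 1
        basis = []
        for f in (c for c in range(ncols) if c not in pivots):
            vec = np.zeros(ncols, dtype=np.uint8)
            vec[f] = 1
            for i, p in enumerate(pivots):
                vec[p] = A[i, f]
            basis.append(vec)
        return basis

    eqs = gf2_nullspace(M)
    print(len(eqs), "quadratic equations hold with probability 1")  # >= 14

With 22 monomials but only 8 input/output pairs, the kernel is large: the S-box is described by far more always-true quadratic equations than it has inputs, which is exactly the overdefined-system property the attack exploits.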

Relevance:

10.00%

Publisher:

Abstract:

We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In this context, bandwidth limitation and fast authentication are the core concerns; any authentication scheme should therefore reduce as much as possible both the packet overhead and the time spent at the receiver checking the authenticity of collected elements. Recently, Tartary and Wang developed a provably secure protocol with small packet overhead and a reduced number of signature verifications to be performed at the receiver. In this paper, we propose a hybrid scheme based on Tartary and Wang's approach and Merkle hash trees. Our construction exhibits a smaller overhead and much faster processing at the receiver, making it even more suitable for multicast than the earlier approach. Like Tartary and Wang's protocol, our construction is provably secure and allows the total recovery of the data stream despite erasures and injections occurring during transmission.
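
A minimal sketch of the Merkle hash tree component (packet contents and tree size are illustrative; the paper's full protocol adds signatures and erasure/injection handling): only the root needs to be signed, and each packet carries a logarithmic-size authentication path:

    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def merkle_root(leaves):
        """Root of a Merkle tree over packet hashes (power-of-two count assumed)."""
        level = [h(leaf) for leaf in leaves]
        while len(level) > 1:
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    def merkle_path(leaves, index):
        """Sibling hashes needed to verify leaf `index` against the root."""
        level = [h(leaf) for leaf in leaves]
        path = []
        while len(level) > 1:
            path.append(level[index ^ 1])
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            index //= 2
        return path

    def verify(leaf, index, path, root):
        node = h(leaf)
        for sib in path:
            node = h(node + sib) if index % 2 == 0 else h(sib + node)
            index //= 2
        return node == root

    packets = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
    root = merkle_root(packets)             # only the root needs to be signed
    proof = merkle_path(packets, 2)
    print(verify(b"pkt2", 2, proof, root))  # True: per-packet authentication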

Relevance:

10.00%

Publisher:

Abstract:

An anonymous membership broadcast scheme is a method in which a sender broadcasts the secret identity of one out of a set of n receivers, in such a way that only the right receiver knows that he is the intended receiver, while the others cannot determine any information about this identity (except that they know that they are not the intended ones). In a w-anonymous membership broadcast scheme, no coalition of up to w receivers, not containing the selected receiver, is able to determine any information about the identity of the selected receiver. We present two new constructions of w-anonymous membership broadcast schemes. The first construction is based on error-correcting codes, and we show that there exist schemes that allow a flexible choice of w while keeping the complexities for broadcast communication, user storage and required randomness polynomial in log n. The second construction is based on the concept of collision-free arrays, which is introduced in this paper. This construction results in more flexible schemes, allowing trade-offs between the different complexities.

Relevance:

10.00%

Publisher:

Abstract:

Firm-customer digital connectedness for effective sensing and responding is a strategic imperative for contemporary competitive firms. This research-in-progress paper conceptualizes and operationalizes the firm-customer mobile digital connectedness of a smart-mobile customer. The empirical investigation focuses on mobile app users and the impact of mobile apps on customer expectations. Based on pilot data collected from 127 customers, we tested hypotheses pertaining to firm-customer mobile digital connectedness and customer expectations. Our analysis, using both linear and non-linear postulations, reveals that customers raise their expectations as they increase their digital interactions with a firm.

Relevance:

10.00%

Publisher:

Abstract:

Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM J. Numer. Anal., 41(1):208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of the solution of these systems using Krylov subspace methods. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared to block preconditioners.
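
As a hedged sketch of the preconditioning setup (a toy saddle point system, not the actual thin plate spline matrices), a block diagonal preconditioner can be supplied to a Krylov solver via SciPy as follows:

    import numpy as np
    from scipy.sparse import bmat, diags, identity, random as sprand
    from scipy.sparse.linalg import LinearOperator, factorized, gmres

    rng = np.random.default_rng(2)
    n, m = 80, 20

    # Toy saddle point system K = [[A, B^T], [B, 0]] (sizes/entries illustrative).
    A = sprand(n, n, density=0.05, random_state=2) + 10 * identity(n)
    A = ((A + A.T) / 2).tocsc()                 # symmetric, diagonally dominant
    B = sprand(m, n, density=0.1, random_state=3).tocsc()
    K = bmat([[A, B.T], [B, None]]).tocsc()
    rhs = rng.standard_normal(n + m)

    # Block diagonal preconditioner diag(A, S), with the Schur complement
    # approximated as S ~ B diag(A)^-1 B^T (plus a tiny shift for safety).
    S = (B @ diags(1.0 / A.diagonal()) @ B.T + 1e-10 * identity(m)).tocsc()
    solve_A, solve_S = factorized(A), factorized(S)

    def apply_prec(r):
        return np.concatenate([solve_A(r[:n]), solve_S(r[n:])])

    M = LinearOperator((n + m, n + m), matvec=apply_prec)
    x, info = gmres(K, rhs, M=M)
    print("converged:", info == 0, " residual:", np.linalg.norm(K @ x - rhs))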

Relevance:

10.00%

Publisher:

Abstract:

The foliage of a plant performs vital functions. As such, leaf models need to be developed for modelling the plant architecture from a set of scattered data captured using a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a fluid movement model or a biological process. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface with a technique already used for reconstructing leaf surfaces. The techniques considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions, SIAM J. Numer. Anal., 41 (2003), pp. 208--234] and the radial basis function Clough-Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough-Tocher method for surface fitting with application to leaf data, Appl. Math. Modelling, 33 (2009), pp. 2582--2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces which better represent the original physical leaf.
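
For comparison, a radial-basis-function surface fit of the kind discussed can be sketched with SciPy's thin-plate-spline kernel (synthetic data; this is not the cited hybrid Clough-Tocher implementation, nor the D2-spline smoother):

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Hypothetical scanned (x, y, z) points: a synthetic smooth surface
    # with measurement noise stands in for real leaf scan data.
    rng = np.random.default_rng(3)
    xy = rng.uniform(-1, 1, (500, 2))
    z = np.sin(2 * xy[:, 0]) * np.cos(xy[:, 1]) + rng.normal(0, 0.01, 500)

    # Thin-plate-spline RBF with a small smoothing parameter: fits a
    # continuously differentiable surface through scattered, noisy data.
    surf = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=1e-3)

    grid = np.mgrid[-1:1:50j, -1:1:50j].reshape(2, -1).T
    z_fit = surf(grid)   # evaluate the reconstructed surface on a regular grid
    print(z_fit.shape)   # (2500,)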

Relevance:

10.00%

Publisher:

Abstract:

The Environmental Kuznets Curve (EKC) hypothesises an inverse U-shaped relationship between a measure of environmental pollution and per capita income. In this study, we apply non-parametric local polynomial regression (local quadratic fitting) to allow more flexibility in local estimation. The study uses a large and globally representative sample covering many local and global pollutants and natural resources, including Biological Oxygen Demand (BOD) emissions, CO2 emissions, CO2 damage, energy use, energy depletion, mineral depletion, improved water source, PM10, particulate emission damage, forest area and net forest depletion.
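
A minimal sketch of local polynomial regression with local quadratic fitting (synthetic income/emissions data; the bandwidth and kernel are assumptions, not the study's choices):

    import numpy as np

    def local_quadratic(x0, x, y, bandwidth):
        """Local polynomial regression (degree 2) with a Gaussian kernel:
        weighted least squares fit of a quadratic around each query point x0."""
        w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
        X = np.column_stack([np.ones_like(x), x - x0, (x - x0) ** 2])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return beta[0]   # fitted value at x0 (intercept of the local fit)

    # Hypothetical EKC-style data: emissions rise, then fall, with income.
    rng = np.random.default_rng(4)
    income = rng.uniform(0.5, 10, 400)                    # per capita income
    emis = 3 * income - 0.3 * income ** 2 + rng.normal(0, 1, 400)

    grid = np.linspace(1, 9, 9)
    fit = [local_quadratic(g, income, emis, bandwidth=1.0) for g in grid]
    print(np.round(fit, 2))   # inverse-U shape emerges without a global form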

Relevance:

10.00%

Publisher:

Abstract:

Interpolation techniques for spatial data have been applied frequently in various fields of geosciences. Although most conventional interpolation methods assume that it is sufficient to use first- and second-order statistics to characterize random fields, researchers have now realized that these methods cannot always provide reliable interpolation results, since geological and environmental phenomena tend to be very complex, presenting non-Gaussian distributions and/or non-linear inter-variable relationships. This paper proposes a new approach to the interpolation of spatial data that can be applied with great flexibility. Suitable cross-variable higher-order spatial statistics are developed to measure the spatial relationship between the random variable at an unsampled location and those in its neighbourhood. Given the computed cross-variable higher-order spatial statistics, the conditional probability density function (CPDF) is approximated via polynomial expansions, which is then utilized to determine the interpolated value at the unsampled location as an expectation. In addition, the uncertainty associated with the interpolation is quantified by constructing prediction intervals of interpolated values. The proposed method is applied to a mineral deposit dataset, and the results demonstrate that it outperforms kriging methods in uncertainty quantification. The introduction of the cross-variable higher-order spatial statistics noticeably improves the quality of the interpolation, since it enriches the information that can be extracted from the observed data; this benefit is substantial when working with data that are sparse or have non-trivial dependence structures.
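
As a very reduced illustration of a higher-order spatial statistic (a plain third-order moment on a 1-D transect; the paper's cross-variable statistics and CPDF polynomial expansion go considerably further):

    import numpy as np

    def third_order_moment(z, h1, h2):
        """m3(h1, h2) = E[Z(u) * Z(u + h1) * Z(u + h2)], estimated by
        averaging over all locations u along the transect."""
        n = len(z) - max(h1, h2)
        u = np.arange(n)
        return np.mean(z[u] * z[u + h1] * z[u + h2])

    rng = np.random.default_rng(6)
    z = rng.standard_normal(5000).cumsum()   # hypothetical correlated transect
    z = (z - z.mean()) / z.std()             # standardize
    print(third_order_moment(z, h1=1, h2=3))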

Relevance:

10.00%

Publisher:

Abstract:

This paper offers an uncertainty quantification (UQ) study applied to the performance analysis of the ERCOFTAC conical diffuser. A deterministic CFD solver is coupled with a non-statistical generalised Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. Such an approach has the advantage of not requiring any modification of the CFD code for the propagation of random disturbances in the aerodynamic field. The stochastic results highlight the importance of the inlet velocity uncertainties for the pressure recovery, both alone and when coupled with a second uncertain variable. From a theoretical point of view, we investigate the possibility of building our gPC representation on arbitrary grids, thus increasing the flexibility of the stochastic framework.
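
A minimal sketch of non-intrusive pseudo-spectral gPC projection (a toy function stands in for the CFD solver; the uncertainty range, Legendre basis and order are assumptions): the unmodified solver is sampled at quadrature nodes and projected onto orthogonal polynomials:

    import numpy as np
    from numpy.polynomial.legendre import leggauss, legval

    # Stand-in for the deterministic solver: pressure recovery as a
    # function of the uncertain inlet velocity (purely illustrative).
    def solver(u_inlet):
        return 0.7 - 0.02 * u_inlet + 0.005 * u_inlet ** 2

    U_MEAN, U_HALF = 10.0, 1.0   # hypothetical uniform uncertainty: U in [9, 11]
    ORDER = 4                    # gPC expansion order

    # Pseudo-spectral (non-intrusive) projection: evaluate the unmodified
    # solver at Gauss-Legendre nodes in the standard variable xi in [-1, 1].
    xi, w = leggauss(ORDER + 1)
    samples = np.array([solver(U_MEAN + U_HALF * x) for x in xi])

    # Project onto Legendre polynomials P_k (orthogonal for uniform inputs):
    # c_k = (2k+1)/2 * integral of f * P_k over [-1, 1].
    coeffs = np.array([(2 * k + 1) / 2.0
                       * np.sum(w * samples * legval(xi, [0] * k + [1]))
                       for k in range(ORDER + 1)])

    mean = coeffs[0]
    var = sum(coeffs[k] ** 2 / (2 * k + 1) for k in range(1, ORDER + 1))
    print(f"pressure recovery: mean = {mean:.4f}, std = {np.sqrt(var):.4f}")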

Relevance:

10.00%

Publisher:

Abstract:

The degradation efficiencies and behaviors of caffeic acid (CaA), p-coumaric acid (pCoA) and ferulic acid (FeA) in aqueous sucrose solutions containing mixtures of these hydroxycinnamic acids (HCAs) were studied using the Fenton oxidation process. Central composite design and multi-response surface methodology were used to evaluate and optimize the interactive effects of the process parameters. Four quadratic polynomial models were developed: one for the degradation of each individual acid in the mixture and one for the total HCAs degraded. Sucrose was the most influential parameter, significantly affecting the total amount of HCA degraded. Under the conditions studied there was < 0.01% loss of sucrose in all reactions. The optimal degradation of a 200 mg/L HCA mixture was 77% in water (pH 4.73, 25.15 °C) and 57% in sucrose solution (13 mass%, pH 5.39, 35.98 °C). Regression analysis showed goodness of fit between the experimental results and the predicted values. The degradation behavior of CaA differed from those of pCoA and FeA, with further CaA degradation observed at increasing sucrose concentration and decreasing solution pH. The differences (established using UV/Vis and ATR-FTIR spectroscopy) arose because, unlike the other acids, CaA formed a complex with Fe(III), or with Fe(III) hydrogen-bonded to sucrose, and coprecipitated with lepidocrocite, an iron oxyhydroxide.
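
Once a quadratic response surface model is fitted, the optimal process settings are found by maximizing it over the coded design region. A hedged sketch with illustrative coefficients (not the study's fitted values):

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical fitted quadratic model for total HCA degradation (%) in
    # coded pH and temperature variables:
    # y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    b0, b1, b2, b11, b22, b12 = 55.0, -4.0, 3.0, -2.5, -1.5, 0.8

    def degradation(v):
        x1, x2 = v
        return (b0 + b1 * x1 + b2 * x2
                + b11 * x1 ** 2 + b22 * x2 ** 2 + b12 * x1 * x2)

    # Maximize the response over the coded central-composite design region.
    res = minimize(lambda v: -degradation(v), x0=[0.0, 0.0],
                   bounds=[(-2, 2), (-2, 2)])
    print("optimum (coded pH, T):", res.x.round(2), "->", round(-res.fun, 1), "%")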

Relevance:

10.00%

Publisher:

Abstract:

Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model which may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single receiver's data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionosphere and ambiguity biases: the time-differencing method and the polynomial prediction method, respectively. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations, and is thus applicable to both differenced and un-differenced data processing modes. However, the methods may be limited to normal ionosphere conditions and to GNSS receivers with low noise autocorrelation. Experimental results also indicate that the proposed method can yield more realistic parameter precision.
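
A minimal sketch of the time-differencing step (synthetic single-receiver data; the bias model is an assumption): differencing consecutive epochs removes the slowly varying ionosphere/ambiguity term, and the white-noise variance follows from Var(diff) = 2*sigma^2:

    import numpy as np

    # Hypothetical single-receiver observable: a slowly varying bias
    # (ionosphere + ambiguity) plus white observation noise (metres).
    rng = np.random.default_rng(5)
    t = np.arange(3000)                           # epochs at 1 Hz
    bias = 0.5 * np.sin(2 * np.pi * t / 20000)    # slow bias term
    sigma = 0.003                                 # true noise: 3 mm
    obs = bias + rng.normal(0, sigma, t.size)

    # Time differencing between consecutive epochs removes the slowly
    # varying bias almost entirely and doubles the white-noise variance.
    d = np.diff(obs)
    sigma_est = np.sqrt(np.var(d) / 2.0)
    print(f"estimated noise: {sigma_est * 1000:.2f} mm (true: {sigma * 1000:.0f} mm)")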

Relevance:

10.00%

Publisher:

Abstract:

This article aims to fill the gap in second-order accurate, unconditionally stable schemes for the time-fractional subdiffusion equation. Two fully discrete schemes are first proposed for the time-fractional subdiffusion equation, with space discretized by finite elements and time discretized by fractional linear multistep methods. These two methods are unconditionally stable, with a maximum global convergence order of $O(\tau+h^{r+1})$ in the $L^2$ norm, where $\tau$ and $h$ are the step sizes in time and space, respectively, and $r$ is the degree of the piecewise polynomial space. The average convergence rates of the two methods in time are also investigated and shown to be $O(\tau^{1.5}+h^{r+1})$. Furthermore, two improved algorithms are constructed; they are also unconditionally stable and convergent of order $O(\tau^2+h^{r+1})$. Numerical examples are provided to verify the theoretical analysis. Comparisons between the present algorithms and existing ones are included, and show that our numerical algorithms exhibit better performance than the known ones.
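
For a flavour of fractional linear multistep methods, the weights of the first-order Grunwald-Letnikov generator $(1-z)^\alpha$ follow from a simple recurrence; the second-order schemes of the kind discussed modify this generating function (this sketch is illustrative, not the authors' algorithm):

    import numpy as np
    from math import gamma

    def gl_weights(alpha, N):
        """Grunwald-Letnikov weights, i.e. the coefficients of (1 - z)**alpha,
        the generator of the first-order fractional linear multistep method."""
        w = np.empty(N + 1)
        w[0] = 1.0
        for k in range(1, N + 1):
            w[k] = (1.0 - (alpha + 1.0) / k) * w[k - 1]
        return w

    # Approximate the Riemann-Liouville derivative of order alpha of
    # u(t) = t^2 at t = 1; the exact value is Gamma(3)/Gamma(3 - alpha).
    alpha, N = 0.5, 1000
    tau = 1.0 / N
    u = np.linspace(0.0, 1.0, N + 1) ** 2
    w = gl_weights(alpha, N)
    approx = tau ** (-alpha) * np.sum(w * u[::-1])   # sum_k w_k * u(1 - k*tau)
    exact = gamma(3) / gamma(3 - alpha)
    print(f"approx = {approx:.6f}, exact = {exact:.6f}")  # first order in tau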