853 results for Tutorial on Computing
Abstract:
In this paper we compute some bounds on the Balaban index and then, by means of group actions, compute the Balaban index of vertex-transitive graphs. ACM Computing Classification System (1998): G.2.2, F.2.2.
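As context for the quantity studied above, the following is a minimal sketch (not the paper's group-action or bounding technique) of the standard Balaban index definition, J(G) = m/(μ + 1) · Σ over edges {u,v} of 1/√(σ(u)σ(v)), computed here with networkx; the function name and the example graph are my own choices.

```python
# Hedged sketch: Balaban index of a small connected graph via its definition.
# sigma(u) = sum of shortest-path distances from u; mu = m - n + 1.
import math
import networkx as nx

def balaban_index(G: nx.Graph) -> float:
    n, m = G.number_of_nodes(), G.number_of_edges()
    mu = m - n + 1                                   # cyclomatic number
    dist = dict(nx.all_pairs_shortest_path_length(G))
    sigma = {u: sum(dist[u].values()) for u in G}    # distance sums
    s = sum(1.0 / math.sqrt(sigma[u] * sigma[v]) for u, v in G.edges())
    return m / (mu + 1) * s

# Example: the cycle C5 is vertex transitive, so all distance sums coincide.
print(balaban_index(nx.cycle_graph(5)))
```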
Abstract:
This Proceedings volume contains articles presented at the CIRP-sponsored International Conference on Digital Enterprise Technology (DET2009), which took place 14-16 December 2009 in Hong Kong. This is the 6th DET conference in the series and the first to be held in Asia. Professor Paul Maropoulos initiated, hosted and chaired the 1st International DET Conference, held in 2002 at the University of Durham. Since this inaugural conference, the DET conference series has been successfully held in 2004 in Seattle, Washington, USA; in 2006 in Setubal, Portugal; in 2007 in Bath, England; and in 2008 in Nantes, France. The DET2009 conference continues to bring together international expertise from academia and industry, pushing forward the boundaries of research knowledge and best practice in digital enterprise technology for design and manufacturing, and for logistics and supply chain management. Over 120 papers from over 10 countries have been accepted for presentation at DET2009 and for inclusion in this Proceedings volume after a stringent refereeing process.
Abstract:
We describe an approach for recovering the plaintext in block ciphers that have a design structure similar to the Data Encryption Standard but improperly constructed S-boxes. Experiments with a backtracking search algorithm performing this kind of attack against modified DES/Triple-DES in ECB mode show that the unknown plaintext can be recovered with a small amount of uncertainty, and that the algorithm is highly efficient in both time and memory for plaintext sources with relatively low entropy. Our investigations demonstrate once again that modifications resulting in S-boxes which still satisfy some design criteria may lead to very weak ciphers. ACM Computing Classification System (1998): E.3, I.2.7, I.2.8.
Abstract:
In 1900 E. B. Van Vleck proposed a very efficient method to compute the Sturm sequence of a polynomial p(x) ∈ Z[x] by triangularizing one of Sylvester’s matrices of p(x) and its derivative p′(x). That method works fine only for the case of complete sequences, provided no pivots take place. In 1917, A. J. Pell and R. L. Gordon pointed out this “weakness” in Van Vleck’s theorem and rectified it, but did not extend his method so that it also works in the cases of: (a) complete Sturm sequences with pivot, and (b) incomplete Sturm sequences. Despite its importance, the Pell-Gordon Theorem for polynomials in Q[x] has been totally forgotten and, to our knowledge, is referenced by us for the first time in the literature. In this paper we go over Van Vleck’s theorem and method, modify slightly the formula of the Pell-Gordon Theorem, and present a general triangularization method, called the Van Vleck-Pell-Gordon method, that correctly computes in Z[x] polynomial Sturm sequences, both complete and incomplete.
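To make the object under discussion concrete, here is a hedged illustration of a Sturm sequence computed with SymPy's built-in Euclidean-type routine; this is not the Van Vleck-Pell-Gordon triangularization method, and the example polynomial is arbitrary.

```python
# Hedged illustration: the Sturm sequence of an integer polynomial via SymPy.
from sympy import symbols, sturm

x = symbols('x')
f = x**5 - 3*x - 1                 # arbitrary example polynomial in Z[x]
for i, p in enumerate(sturm(f, x)):
    print(i, p)                    # p0 = f, p1 = f', then negated remainders
```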
Abstract:
ACM Computing Classification System (1998): G.1.1, G.1.2.
Abstract:
This research evaluates pattern recognition techniques on a subclass of big data where the dimensionality of the input space (p) is much larger than the number of observations (n). Specifically, we evaluate massive gene expression microarray cancer data where the ratio κ is less than one. We explore the statistical and computational challenges inherent in these high dimensional low sample size (HDLSS) problems and present statistical machine learning methods used to tackle and circumvent these difficulties. Regularization and kernel algorithms were explored in this research using seven datasets where κ < 1. These techniques require careful tuning, which necessitated investigating several extensions of cross-validation to support better predictive performance. While no single algorithm was universally the best predictor, the regularization technique produced lower test errors in five of the seven datasets studied.
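The following is a minimal sketch, not the study's actual pipeline or data: a regularized classifier with cross-validated tuning on synthetic data where the feature count far exceeds the sample count, in the spirit of the HDLSS setting described above. The dataset generator, parameter values and model choice are illustrative assumptions.

```python
# Hedged sketch: L2-regularized logistic regression with CV tuning, p >> n.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.model_selection import train_test_split

# 100 observations, 2000 features: far fewer samples than dimensions.
X, y = make_classification(n_samples=100, n_features=2000,
                           n_informative=50, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The regularization strength is selected by 5-fold cross-validation.
clf = LogisticRegressionCV(Cs=10, cv=5, penalty='l2', max_iter=5000)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```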
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2015
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2015
Abstract:
Analysis of risk measures associated with price series data movements and their predictions is of strategic importance in the financial markets, as well as to policy makers, in particular for short- and long-term planning and for setting economic growth targets. For example, oil price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distribution of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as the errors following normal or approximately normal distributions, being free of large outliers, and satisfying the Gauss-Markov assumptions. However, in practical situations the LSE-based linear regression models often fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow fat-tailed distributions while still possessing a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform the best. We discuss results from the L1, L2 and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
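The sketch below is a hedged companion, not the paper's models or data: it computes a historical VaR from a synthetic fat-tailed return series and fits a simple linear model under L1, L2 and L∞ norms so the contrast between the criteria is visible. All data, names and parameter values are illustrative assumptions.

```python
# Hedged sketch: historical VaR and Lp-norm linear fits on synthetic data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=1000) * 0.02   # fat-tailed toy returns

# Historical VaR at the 95% level: the loss not exceeded with 95% probability.
print("95% VaR:", -np.percentile(returns, 5))

# Lp-norm regression y ~ a + b*x for p = 1, 2 and infinity.
x = np.linspace(0, 1, 200)
y = 1.5 * x + 0.3 + rng.standard_t(df=3, size=x.size) * 0.1

def fit(norm):
    def loss(beta):
        r = y - (beta[0] + beta[1] * x)
        return {"L1": np.abs(r).sum(),
                "L2": (r ** 2).sum(),
                "Linf": np.abs(r).max()}[norm]
    return minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead").x

for norm in ("L1", "L2", "Linf"):
    print(norm, fit(norm))      # fitted intercept and slope per norm
```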
Abstract:
This article shows the social importance of the subsistence minimum in Georgia. The methodology of its calculation is also presented. We propose ways of improving the calculation of the subsistence minimum in Georgia and of extending it to other developing countries. The weights of food and non-food expenditures in the subsistence minimum baskets are essential in these calculations. The daily consumption value of the minimum food basket has also been calculated. The average consumer expenditures on food, and their share relative to other expenditures, are examined in dynamics. Our methodology of subsistence minimum calculation is applied to the case of Georgia. However, it can be used for similar purposes based on data from other developing countries, where social stability is achieved and social inequalities are to be addressed. ACM Computing Classification System (1998): H.5.3, J.1, J.4, G.3.
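As a purely illustrative sketch of how a food-basket cost and a food/non-food weight can combine into a total (it does not reproduce the article's figures or its exact methodology), the following uses hypothetical numbers only.

```python
# Hedged arithmetic sketch with hypothetical values, not the article's data.
daily_food_basket = 5.0    # hypothetical daily cost of the minimum food basket
food_weight = 0.70         # hypothetical weight of food in total expenditures

monthly_food_cost = daily_food_basket * 30
subsistence_minimum = monthly_food_cost / food_weight   # food + non-food total
non_food_part = subsistence_minimum - monthly_food_cost

print(monthly_food_cost, non_food_part, subsistence_minimum)
```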
Abstract:
The paper deals with a single-server finite queuing system where the customers who failed to get service are temporarily blocked in the orbit of inactive customers. This model and its variants have many applications, especially for optimization of the corresponding models with retrials. We analyze the system in the non-stationary regime and, using the discrete transformations method, study the busy period length and the number of successful calls made during it. ACM Computing Classification System (1998): G.3, J.7.
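As a rough numerical companion rather than the paper's discrete-transformations analysis, here is a hedged event-driven simulation of a finite-source single-server system in which blocked customers join an orbit and later retry; it estimates the mean busy-period length and the mean number of calls served per busy period. The model details (finite source, retrial rate, busy period measured from an arrival at an empty system until the system empties again) and all rates are my assumptions.

```python
# Hedged simulation sketch of a finite-source retrial-type queue's busy period.
import random

def busy_period_stats(N=5, lam=1.0, theta=0.5, mu=2.0, runs=20000, seed=1):
    """Estimate mean busy-period length and mean calls served per busy period."""
    random.seed(seed)
    total_time, total_served = 0.0, 0
    for _ in range(runs):
        t, orbit, served, busy = 0.0, 0, 1, True   # one customer just entered service
        while busy or orbit > 0:
            free = N - orbit - (1 if busy else 0)   # idle sources
            rates = {"arrive": free * lam, "retry": orbit * theta}
            if busy:
                rates["done"] = mu
            total = sum(rates.values())
            t += random.expovariate(total)
            r, acc, event = random.random() * total, 0.0, None
            for event, rate in rates.items():       # pick the next event
                acc += rate
                if r < acc:
                    break
            if event == "done":
                busy = False                        # service completion
            elif not busy:                          # arrival/retry seizes the idle server
                busy, served = True, served + 1
                if event == "retry":
                    orbit -= 1
            elif event == "arrive":                 # primary call finds the server busy
                orbit += 1
            # a retrial that finds the server busy stays in the orbit
        total_time += t
        total_served += served
    return total_time / runs, total_served / runs

print(busy_period_stats())
```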
Abstract:
This paper is dedicated to Prof. Nikolay Kyurkchiev on the occasion of his 70th anniversary. The paper gives sufficient conditions for kth approximations of the zeros of a polynomial f(x) under which Kyurkchiev’s method fails on the next step. The research is linked with an attack on the global convergence hypothesis of this method, which is commonly used in practice (as a correlate of the hypothesis for Weierstrass–Dochev’s method). Graphical examples are presented.
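For context on the simultaneous root-finding setting mentioned above, here is a hedged sketch of the related Weierstrass–Dochev (Durand–Kerner) iteration; it is not Kyurkchiev's method, and the example polynomial and starting values are arbitrary choices of mine.

```python
# Hedged sketch: Weierstrass-Dochev (Durand-Kerner) simultaneous iteration.
import numpy as np

def weierstrass_dochev(coeffs, z0, steps=50):
    """coeffs: monic polynomial coefficients, highest degree first."""
    z = np.array(z0, dtype=complex)
    for _ in range(steps):
        p = np.polyval(coeffs, z)                                   # p(z_i)
        w = np.array([np.prod(z[i] - np.delete(z, i)) for i in range(len(z))])
        z = z - p / w                  # z_i <- z_i - p(z_i) / prod_{j != i}(z_i - z_j)
    return z

# Example: the three zeros of x^3 - 1 from standard complex starting points.
z0 = [(0.4 + 0.9j) ** k for k in range(3)]
print(weierstrass_dochev([1, 0, 0, -1], z0))
```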
Abstract:
We introduce a modification of the familiar cut function by replacing the linear part in its definition by a polynomial of degree p + 1, thus obtaining a sigmoid function called the generalized cut function of degree p + 1 (GCFP). We then study the uniform approximation of the GCFP by smooth sigmoid functions such as the logistic and the shifted logistic functions. The limiting case of the interval-valued Heaviside step function, which imposes the use of the Hausdorff metric, is also discussed. Numerical examples are presented using the CAS Mathematica.
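Since the abstract's numerical work is in Mathematica, the following is only a hedged Python sketch of the underlying comparison: the ordinary (piecewise-linear) cut function versus the logistic sigmoid, with their uniform distance estimated on a grid. The specific parameterization of both functions is an assumption, not the paper's.

```python
# Hedged sketch: uniform distance between a cut function and the logistic sigmoid.
import numpy as np

def cut(x, h=1.0):
    """Cut function: 0 below -h, 1 above h, linear in between."""
    return np.clip((x + h) / (2 * h), 0.0, 1.0)

def logistic(x, k=2.0):
    return 1.0 / (1.0 + np.exp(-k * x))

xs = np.linspace(-5, 5, 100001)
print("sup |cut - logistic| ~", np.max(np.abs(cut(xs) - logistic(xs))))
```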
Abstract:
In 1917 Pell (1) and Gordon used sylvester2, Sylvester’s little known and hardly ever used matrix of 1853, to compute (2) the coefficients of a Sturmian remainder — obtained in applying, in Q[x], Sturm’s algorithm on two polynomials f, g ∈ Z[x] of degree n — in terms of the determinants (3) of the corresponding submatrices of sylvester2. Thus, they solved a problem that had eluded both J. J. Sylvester, in 1853, and E. B. Van Vleck, in 1900. (4) In this paper we extend the work by Pell and Gordon and show how to compute (2) the coefficients of a Euclidean remainder — obtained in finding, in Q[x], the greatest common divisor of f, g ∈ Z[x] of degree n — in terms of the determinants (5) of the corresponding submatrices of sylvester1, Sylvester’s widely known and used matrix of 1840.
(1) See http://en.wikipedia.org/wiki/Anna_Johnson_Pell_Wheeler for her biography.
(2) Both for complete and incomplete sequences, as defined in the sequel.
(3) Also known as modified subresultants.
(4) Using determinants, Sylvester and Van Vleck were able to compute the coefficients of Sturmian remainders only for the case of complete sequences.
(5) Also known as (proper) subresultants.
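To make the subresultant objects mentioned above concrete, here is a hedged illustration using SymPy's built-in subresultant polynomial remainder sequence; it is not the determinant-based construction from sylvester1 or sylvester2 that the paper develops, and the example polynomials are arbitrary.

```python
# Hedged illustration: subresultant PRS of two integer polynomials via SymPy.
from sympy import symbols, subresultants

x = symbols('x')
f = x**4 + x**2 - 3            # arbitrary example polynomial in Z[x]
g = f.diff(x)                  # its derivative, also in Z[x]
for i, r in enumerate(subresultants(f, g, x)):
    print(i, r)
```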
Abstract:
We investigate a recently introduced width measure of planar shapes called sweepwidth and prove a lower bound theorem on the sweepwidth.