17 results for Singular value decomposition

in Deakin Research Online - Australia


Relevance: 100.00%

Abstract:

The paper outlines a numerical algorithm, based on a singular value decomposition approach, to implement the concept of functional observability introduced in [6]. The key feature of this algorithm is that it outputs a minimum number of additional linear functions of the state vector when the system is functionally observable; these additional functions are required to design the smallest possible order functional observer, as stated in [6].
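
As an illustration of the SVD building block such an algorithm relies on, the following minimal sketch computes the numerical rank and row space of a stacked observability-type matrix and tests whether a functional L x can be recovered from the outputs; the matrices A, C and L are hypothetical and this is not the paper's implementation.

```python
import numpy as np

def numerical_rank_and_rowspace(M, tol=None):
    """Numerical rank and an orthonormal row-space basis of M via SVD."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    if tol is None:
        tol = max(M.shape) * np.finfo(float).eps * (s[0] if s.size else 0.0)
    r = int(np.sum(s > tol))
    return r, Vt[:r]          # rows of Vt[:r] span the row space of M

# Hypothetical system: x' = A x, y = C x, and a functional z = L x to estimate.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [-1.0, -2.0, -3.0]])
C = np.array([[1.0, 0.0, 0.0]])
L = np.array([[0.0, 1.0, 1.0]])

# Stack the observability matrix [C; CA; CA^2] and compare its row space
# with and without the functional L appended.
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(A.shape[0])])
r_O, _ = numerical_rank_and_rowspace(O)
r_OL, _ = numerical_rank_and_rowspace(np.vstack([O, L]))
print("rank O =", r_O, " rank [O; L] =", r_OL,
      "-> L lies in the row space of O" if r_OL == r_O else "-> extra functionals needed")
```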

Relevance: 100.00%

Abstract:

Personalized recommendation uses a user's interest characteristics and purchasing behavior to recommend information and goods in which the user may be interested. With the rapid development of Internet technology, we have entered an era of information explosion in which huge amounts of information are presented at the same time. On the one hand, it is difficult for users to discover the information they are most interested in; on the other hand, general users have difficulty obtaining information that very few people browse. In order to extract the information a user is interested in from a massive amount of data, we propose in this paper a personalized recommendation algorithm based on approximating the singular value decomposition (SVD). SVD is a powerful technique for dimensionality reduction. However, due to its expensive computational requirements and weak performance on large sparse matrices, it has been considered inappropriate for practical applications involving massive data. Finally, we present an empirical study comparing the prediction accuracy of our proposed algorithm with that of Drineas's LINEARTIMESVD algorithm and the standard SVD algorithm on the MovieLens dataset, and show that our method has the best prediction quality. © 2012 IEEE.
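
For context, the sketch below shows the standard truncated-SVD approach to rating prediction on a toy user-item matrix; it is a generic illustration, not the approximating-SVD algorithm proposed in the paper, and the ratings are made up.

```python
import numpy as np

# Toy user-item rating matrix (0 = unrated); the values are illustrative only.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)

k = 2                                        # number of latent factors to keep
U, s, Vt = np.linalg.svd(R, full_matrices=False)
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]   # rank-k approximation of R

user, item = 1, 2                            # predict the unrated entry R[1, 2]
print(f"predicted rating for user {user}, item {item}: {R_hat[user, item]:.2f}")
```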

Relevance: 100.00%

Abstract:

This paper presents two hyperlink analysis-based algorithms to find relevant pages for a given Web page (URL). The first algorithm comes from the extended co-citation analysis of Web pages; it is intuitive and easy to implement. The second takes advantage of linear algebra theory to reveal deeper relationships among the Web pages and to identify relevant pages more precisely and effectively. The experimental results show the feasibility and effectiveness of the algorithms. These algorithms could be used for various Web applications, such as enhancing Web search. The ideas and techniques in this work should also be helpful to other Web-related research.
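
As a rough illustration of the co-citation idea behind the first algorithm, the following sketch ranks pages by co-citation counts computed from a toy link matrix; the link structure is invented and this is not the authors' implementation.

```python
import numpy as np

# Toy directed link matrix: A[i, j] = 1 if page i links to page j (illustrative only).
A = np.array([[0, 1, 1, 0, 0],
              [0, 0, 1, 1, 0],
              [1, 0, 0, 1, 1],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 0, 0]], dtype=float)

# Pages u and v are co-cited when some page links to both, so the
# co-citation matrix is A^T A (its diagonal holds in-degrees and is ignored below).
cocite = A.T @ A
target = 2                                   # find pages related to page 2
scores = cocite[target].copy()
scores[target] = -np.inf                     # exclude the page itself
ranked = np.argsort(scores)[::-1]
print("pages most co-cited with page", target, ":", ranked[:3])
```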

Relevance: 100.00%

Abstract:

UDDI is a standard for publishing and discovery of web services. UDDI registries provide keyword searches for web services. The search functionality is very simple and fails to account for relationships between web services. In this paper, we propose an algorithm which retrieves closely related web services. The proposed algorithm is based on singular value decomposition (SVD) in linear algebra, which reveals semantic relationships among web services. The preliminary evaluation shows the effectiveness and feasibility of the algorithm.
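
A minimal sketch of the general SVD-based (latent semantic) similarity idea is shown below, assuming a toy keyword-by-service matrix; the matrix and the choice of k = 2 latent dimensions are illustrative, not taken from the paper.

```python
import numpy as np

# Toy keyword-by-service matrix (rows: keywords, columns: services); illustrative only.
X = np.array([[2, 1, 0, 0],
              [1, 2, 0, 0],
              [0, 0, 3, 1],
              [0, 1, 1, 2]], dtype=float)

k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
services = (np.diag(s[:k]) @ Vt[:k]).T        # each row: a service in the latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

query = 0                                     # services most related to service 0
sims = [cosine(services[query], services[j]) for j in range(services.shape[0])]
order = [j for j in np.argsort(sims)[::-1] if j != query]
print("services related to service", query, ":", order)
```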

Relevance: 100.00%

Abstract:

The method of Fields and Backofen has been commonly used to reduce the data obtained by the hot torsion test into flow curves. The method, however, is most suitable for materials with monotonic strain hardening behaviour. Other methods, such as Stüwe's method, tubular specimens, differential testing and the inverse method, suffer from similar drawbacks. It is shown in the current work that, for materials with multiple regimes of hardening, any method based on an assumption of constant hardening indices introduces errors into the flow curve obtained from the hot torsion test. Therefore, such methods do not enable accurate prediction of the onset of recrystallisation, where slow softening occurs. A new method to convert results from the hot torsion test into flow curves, taking into account the variation of constitutive parameters during deformation, is presented. The method represents the torque-twist data by a parametric linear least-squares model in which Euler and hyperbolic coefficients are used as the parameters. A closed-form relationship obtained from the mathematical representation of the data is then employed for flow stress determination. Two different solution strategies, the method of normal equations and singular value decomposition, were used for parametric modelling of the data with hyperbolic basis functions, and the performance of both methods is compared. Experimental data obtained by FHTTM, a flexible hot torsion test machine developed at IROST, for a C–Mn austenitic steel were used to demonstrate the method. The results were compared with those obtained using constant strain and strain rate hardening characteristics.
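
To illustrate the two solution strategies mentioned above, the following sketch fits synthetic torque-twist data with a small hyperbolic basis using both the normal equations and an SVD-based least-squares solver; the data, basis functions and scale factors are assumptions, not the paper's model.

```python
import numpy as np

# Synthetic torque-twist data (illustrative only).
theta = np.linspace(0.1, 5.0, 200)                       # twist
torque = 3.0 + 1.5 * np.sinh(0.4 * theta) - 0.8 * np.cosh(0.3 * theta)
torque += 0.02 * np.random.default_rng(0).standard_normal(theta.size)

# Design matrix with hyperbolic basis functions (a crude stand-in for the
# Euler/hyperbolic parameterisation mentioned above).
B = np.column_stack([np.ones_like(theta), np.sinh(0.4 * theta), np.cosh(0.3 * theta)])

# Strategy 1: normal equations (fast, but squares the condition number).
c_normal = np.linalg.solve(B.T @ B, B.T @ torque)

# Strategy 2: SVD-based least squares (more robust when B is ill-conditioned).
c_svd, *_ = np.linalg.lstsq(B, torque, rcond=None)

print("normal equations:  ", np.round(c_normal, 3))
print("SVD least squares: ", np.round(c_svd, 3))
```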

Relevance: 100.00%

Abstract:

How to learn an overcomplete dictionary for sparse representations of images is an important topic in machine learning, sparse coding, blind source separation, etc. The so-called K-singular value decomposition (K-SVD) method [3] is powerful for this purpose; however, it is too time-consuming to apply. Recently, an adaptive orthogonal sparsifying transform (AOST) method has been developed that learns the dictionary faster. However, the corresponding coefficient matrix may not be as sparse as that of K-SVD. To solve this problem, a non-orthogonal iterative match method is proposed in this paper to learn the dictionary. By sequentially extracting columns of the stacked image blocks, the non-orthogonal atoms of the dictionary are learned adaptively, and the resulting coefficient matrix is sparser. Experimental results show that the proposed method can yield effective dictionaries and that the resulting image representation is sparser than that of AOST.
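
For reference, the sketch below shows the rank-1 SVD atom update at the heart of the K-SVD baseline mentioned above (not the non-orthogonal iterative match method proposed in the paper); the data, dictionary size and sparsity level are illustrative.

```python
import numpy as np

def ksvd_atom_update(Y, D, X, k):
    """One K-SVD-style atom update: refit dictionary column k and its coefficients
    with a rank-1 SVD of the residual restricted to the signals that use atom k."""
    users = np.nonzero(X[k])[0]                 # signals whose code uses atom k
    if users.size == 0:
        return D, X
    E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    D[:, k] = U[:, 0]                           # new unit-norm atom
    X[k, users] = s[0] * Vt[0]                  # new coefficients for that atom
    return D, X

# Toy data: 8-dimensional signals, a 12-atom dictionary and sparse codes (illustrative only).
rng = np.random.default_rng(1)
Y = rng.standard_normal((8, 50))
D = rng.standard_normal((8, 12)); D /= np.linalg.norm(D, axis=0)
X = rng.standard_normal((12, 50)) * (rng.random((12, 50)) < 0.2)
D, X = ksvd_atom_update(Y, D, X, k=3)
print("atom 3 norm after update:", np.linalg.norm(D[:, 3]).round(3))
```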

Relevance: 100.00%

Abstract:

With the rapid development of the Internet, the amount of information on the Web grows explosively, and people often feel puzzled and helpless in finding and obtaining the information they really need. To overcome this problem, recommender systems such as those based on the singular value decomposition (SVD) help users find relevant information, products or services by providing personalized recommendations based on their profiles. SVD is a powerful technique for dimensionality reduction. However, due to its expensive computational requirements and weak performance on large sparse matrices, it has been considered inappropriate for practical applications involving massive data. Thus, to extract the information a user is interested in from a massive amount of data, we propose in this paper a personalized recommendation algorithm, called the ApproSVD algorithm, based on approximating the SVD. The trick behind our algorithm is to sample some rows of a user-item matrix, rescale each row by an appropriate factor to form a relatively smaller matrix, and then reduce the dimensionality of the smaller matrix. Finally, we present an empirical study comparing the prediction accuracy of our proposed algorithm with that of Drineas's LINEARTIMESVD algorithm and the standard SVD algorithm on the MovieLens and Flixster datasets, and show that our method has the best prediction quality. Furthermore, in order to show the superiority of the ApproSVD algorithm, we also conduct an empirical study comparing the prediction accuracy and running time of the ApproSVD algorithm and the incremental SVD algorithm on the MovieLens and Flixster datasets, and demonstrate that our proposed method has better overall performance.
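
A minimal sketch in the spirit of the row-sampling idea described above is given below: rows are sampled with probability proportional to their squared norms, rescaled, and the SVD of the smaller matrix is taken. The sampling scheme, matrix sizes and sample count are assumptions, not the exact ApproSVD procedure.

```python
import numpy as np

def sampled_row_svd(A, c, rng):
    """Sample c rows of A with probability proportional to squared row norms,
    rescale them to keep expectations unbiased, and take the SVD of the small matrix."""
    norms = np.sum(A * A, axis=1)
    p = norms / norms.sum()
    idx = rng.choice(A.shape[0], size=c, replace=True, p=p)
    S = A[idx] / np.sqrt(c * p[idx])[:, None]        # rescaled sampled rows
    _, s, Vt = np.linalg.svd(S, full_matrices=False)
    return s, Vt                                     # approximate right singular vectors of A

rng = np.random.default_rng(0)
A = rng.random((1000, 50))                           # stand-in for a user-item matrix
s_approx, Vt_approx = sampled_row_svd(A, c=100, rng=rng)
s_exact = np.linalg.svd(A, compute_uv=False)
print("top singular values (approx):", np.round(s_approx[:3], 2))
print("top singular values (exact): ", np.round(s_exact[:3], 2))
```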

Relevance: 100.00%

Abstract:

Due to the serious information overload problem on the Internet, recommender systems have emerged as an important tool for recommending more useful information to users by providing personalized services for individual users. However, in the “big data” era, recommender systems face significant challenges, such as how to process massive data efficiently and accurately. In this paper we propose an incremental algorithm based on the singular value decomposition (SVD) with good scalability, called the Incremental ApproSVD, which combines the incremental SVD algorithm with the Approximating the Singular Value Decomposition (ApproSVD) algorithm. Furthermore, a strict error analysis demonstrates the effectiveness of our Incremental ApproSVD algorithm. We then present an empirical study to compare the prediction accuracy and running time of our Incremental ApproSVD algorithm and the incremental SVD algorithm on the MovieLens and Flixster datasets. The experimental results demonstrate that our proposed method outperforms its counterparts.
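
As an illustration of the incremental-SVD ingredient, the following sketch updates a truncated SVD when new columns arrive (a Brand-style update); the matrix sizes and rank are illustrative, and this is not the Incremental ApproSVD algorithm itself.

```python
import numpy as np

def svd_append_columns(U, s, Vt, C, k):
    """Incremental SVD: update a rank-k SVD A ~= U @ diag(s) @ Vt when new
    columns C arrive; returns the truncated SVD of [A, C]."""
    L = U.T @ C                              # projection onto the current subspace
    H = C - U @ L                            # component orthogonal to it
    Q, R = np.linalg.qr(H)
    K = np.block([[np.diag(s), L],
                  [np.zeros((R.shape[0], s.size)), R]])
    Uk, sk, Vtk = np.linalg.svd(K, full_matrices=False)
    U_new = np.hstack([U, Q]) @ Uk
    V_old = np.block([[Vt.T, np.zeros((Vt.shape[1], C.shape[1]))],
                      [np.zeros((C.shape[1], Vt.shape[0])), np.eye(C.shape[1])]])
    Vt_new = (V_old @ Vtk.T).T
    return U_new[:, :k], sk[:k], Vt_new[:k]

# Toy check on a nearly rank-3 matrix: update after 5 new columns arrive.
rng = np.random.default_rng(0)
A = rng.random((40, 3)) @ rng.random((3, 30)) + 0.01 * rng.random((40, 30))
C = rng.random((40, 5))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 3
U2, s2, Vt2 = svd_append_columns(U[:, :k], s[:k], Vt[:k], C, k)
print("updated top singular values:", np.round(s2, 2))
print("direct top singular values: ",
      np.round(np.linalg.svd(np.hstack([A, C]), compute_uv=False)[:k], 2))
```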

Relevance: 100.00%

Abstract:

Microsoft Kinect, which was primarily aimed at the computer gaming industry, has been used in bio-kinematic research implementations. A multi-Kinect system can exploit spatial diversity to increase measurement accuracy. One of the main problems in deploying multi-Kinect systems is estimating the pose, including the position and orientation, of each Kinect. In this paper, a singular value decomposition (SVD) least-squares algorithm is extended to a more generic time-series based approach to solve this pose estimation problem, using the 3D positions of one or more joints in skeletons obtained from a multi-Kinect system. Additionally, computer simulations are performed to demonstrate the use and evaluate the efficiency of the proposed algorithm, which is further validated with a commercial Vicon system.
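
The core SVD least-squares step for aligning two Kinects' coordinate frames can be sketched as a Kabsch-style rigid registration of matched 3D joint positions, as below; the joint data and ground-truth transform are synthetic, and this is not the paper's extended time-series algorithm.

```python
import numpy as np

def rigid_transform_svd(P, Q):
    """Least-squares rotation R and translation t with Q ~= R @ P + t,
    for matched 3D point sets P, Q of shape (3, N), via SVD."""
    p_mean, q_mean = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - p_mean) @ (Q - q_mean).T
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Toy check: joint positions seen by one Kinect, mapped into another Kinect's frame.
rng = np.random.default_rng(0)
P = rng.random((3, 20))                          # e.g. skeleton joints over time
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([[0.5], [-0.2], [1.0]])
Q = R_true @ P + t_true
R_est, t_est = rigid_transform_svd(P, Q)
print("rotation error:", np.linalg.norm(R_est - R_true).round(6))
```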

Relevance: 100.00%

Abstract:

Long-term, off-site human monitoring systems are emerging in response to the skyrocketing expenditure associated with rehabilitation therapies for neurological diseases. Inertial/magnetic sensor modules are well known as a worthy solution to this problem. Much attention and effort are being devoted to minimizing the drift problem of angular rates, yet the remaining kinematic measurements (the earth's magnetic field and gravitational orientation) are by themselves capable of tracking movements by applying the theory developed for solving the historical Wahba's problem. Further, these approaches yield closed-form solutions, which makes them well suited to real-time Mo-Cap systems. This paper examines the feasibility of some typical solutions to Wahba's problem, namely the TRIAD method, Davenport's q method, the singular value decomposition method and the QUEST algorithm, applied to current inertial/magnetic sensor measurements for tracking human arm movements. Further, the theoretical assertions are compared through controlled experiments with both simulated and actual accelerometer and magnetometer measurements.
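
For illustration, the sketch below implements the SVD solution of Wahba's problem for two toy reference vectors (gravity and a magnetic-field direction); the vectors and the test rotation are made up, and the other methods compared in the paper (TRIAD, Davenport's q method, QUEST) are not shown.

```python
import numpy as np

def wahba_svd(body_vecs, ref_vecs, weights=None):
    """SVD solution of Wahba's problem: the rotation R minimising
    sum_i w_i * || ref_i - R @ body_i ||^2 for matched unit vectors."""
    if weights is None:
        weights = np.ones(body_vecs.shape[1])
    B = (ref_vecs * weights) @ body_vecs.T          # attitude profile matrix
    U, _, Vt = np.linalg.svd(B)
    M = np.diag([1.0, 1.0, np.linalg.det(U) * np.linalg.det(Vt)])
    return U @ M @ Vt

# Toy example: gravity and magnetic-field directions (values are illustrative only).
ref = np.column_stack([[0.0, 0.0, 1.0], [0.6, 0.0, 0.8]])      # reference-frame unit vectors
angle = np.deg2rad(20.0)
R_true = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(angle), -np.sin(angle)],
                   [0.0, np.sin(angle),  np.cos(angle)]])
body = R_true.T @ ref                                           # same vectors in the body frame
R_est = wahba_svd(body, ref)
print("attitude error:", np.linalg.norm(R_est - R_true).round(6))
```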

Relevance: 80.00%

Abstract:

Fault-tolerant motion of redundant manipulators can be obtained by joint velocity reconfiguration. For fault-tolerant manipulators, it is beneficial to determine the configurations that can tolerate locked-joint failures with a minimum relative joint velocity jump, because the manipulator can then rapidly reconfigure itself to tolerate the fault. This paper uses properties of the condition number to characterise these optimal configurations for serial manipulators. The relationship between the manipulator's locked-joint failures and the condition number of the Jacobian matrix is established using a matrix perturbation methodology. It is then observed that the condition number provides an upper bound on the required relative joint velocity change for recovering from the faults, which leads to defining the optimal fault-tolerant configuration as the one minimising the condition number. The optimization problem of obtaining the minimum condition number is converted into three standard eigenvalue optimization problems, and a solution for the selected optimization problem is presented. Finally, in order to obtain the optimal fault-tolerant configuration, the proposed method is applied to a 4-DoF planar manipulator.
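
A rough sketch of the kind of condition-number computation involved is given below: the Jacobian of a hypothetical 4-DoF planar arm is formed and its condition number is compared with those of the reduced Jacobians obtained by locking each joint in turn. The link lengths and joint angles are illustrative, and the optimization itself is not implemented.

```python
import numpy as np

def planar_jacobian(q, lengths):
    """2D position Jacobian of a planar serial arm with joint angles q and link lengths."""
    cum = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        J[0, i] = -np.sum(lengths[i:] * np.sin(cum[i:]))
        J[1, i] =  np.sum(lengths[i:] * np.cos(cum[i:]))
    return J

# Hypothetical 4-DoF planar arm.
lengths = np.array([1.0, 0.8, 0.6, 0.4])
q = np.deg2rad([10.0, 30.0, -20.0, 45.0])
J = planar_jacobian(q, lengths)
print("condition number, all joints active:", round(np.linalg.cond(J), 2))
for locked in range(len(q)):
    J_red = np.delete(J, locked, axis=1)      # column removed when the joint is locked
    print(f"joint {locked} locked -> cond(J_reduced) = {np.linalg.cond(J_red):.2f}")
```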

Relevance: 80.00%

Abstract:

This paper considers the sum-rate of wireless broadcast systems with multiple antennas at the base station. In a conventional MIMO-BC system with a large number of users, selecting an optimal subset of users to maximize the overall system capacity is a key design issue. This paper presents a novel approach to investigating the sum-rate using eigenvalue decomposition (EVD). In particular, we derive a lower bound on the sum-rate of a conventional MIMO-BC using an approach completely different from existing ones. The paper formulates the rate maximization problem for any number of users and any number of transmitting antennas using an EVD of the channel matrix. This also shows the impact of channel angle information on the sum-rate of a conventional MIMO-BC. Numerical results confirm the benefits of our technique in various MIMO communication scenarios.
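
As a generic illustration of an EVD-based rate computation (not the bound derived in the paper), the sketch below takes the eigenvalue decomposition of the channel Gram matrix for a random channel and evaluates an equal-power rate over its eigen-directions; the channel, SNR and antenna numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_users = 4, 4                        # base-station antennas, single-antenna users
H = (rng.standard_normal((n_users, n_tx)) +
     1j * rng.standard_normal((n_users, n_tx))) / np.sqrt(2)   # Rayleigh channel rows

snr = 10.0                                  # total transmit SNR (linear)
G = H.conj().T @ H                          # channel Gram matrix
eigvals = np.linalg.eigvalsh(G)             # EVD of the Hermitian Gram matrix

# Equal-power rate over the eigen-directions of G (illustrative, not the paper's bound).
rate = np.sum(np.log2(1.0 + (snr / n_tx) * eigvals.clip(min=0)))
print("eigenvalues of H^H H:", np.round(eigvals, 2))
print("equal-power eigenbeamforming rate: %.2f bits/s/Hz" % rate)
```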

Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is to analyse the interdependencies of the house price growth rates in Australian capital cities.
Design/methodology/approach - A vector autoregression model and variance decomposition are introduced to estimate and interpret the interdependencies among the growth rates of regional house prices in Australia.
Findings - The results suggest the eight capital cities can be divided into three groups: Sydney and Melbourne; Canberra, Adelaide and Brisbane; and Hobart, Perth and Darwin.
Originality/value - Based on a structural vector autoregression model, this research develops an innovative interdependence analysis approach for regional house prices using a variance decomposition method.
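
A toy sketch of a vector autoregression with a forecast-error variance decomposition, in the spirit of the approach described above, is shown below using statsmodels; the city series are simulated, not the Australian house price data used in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated quarterly house-price growth rates for three illustrative cities (not real data).
rng = np.random.default_rng(0)
T = 120
sydney = rng.normal(0.8, 1.0, T)
melbourne = 0.5 * sydney + rng.normal(0.4, 0.8, T)   # loosely tied to Sydney
perth = rng.normal(0.6, 1.2, T)                      # largely independent
data = pd.DataFrame({"Sydney": sydney, "Melbourne": melbourne, "Perth": perth})

# Fit a VAR and compute the forecast-error variance decomposition (FEVD), i.e. how much
# of each city's forecast variance is attributable to shocks in the other cities.
results = VAR(data).fit(maxlags=4, ic="aic")
results.fevd(8).summary()
```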

Relevance: 30.00%

Abstract:

Purpose – This paper develops a new method for decomposing housing market variations to analyse the housing dynamics of the eight Australian capital cities.
Design/methodology/approach – This study reviews prior research on analysing housing market variations and classifies the previous methods into four main models. On this basis, the study develops a new decomposition of the variations, made up of regional information, home-market information and time information. Panel data regression, a unit root test and an F test are adopted to construct the model and interpret the housing market variations of the Australian capital cities.
Findings – This paper suggests that the Australian home-market information has the same elasticity with respect to housing market variations across cities and time. In contrast, the elasticities of the regional information differ, although similarities exist in the west and north of Australia and in the south and east of Australia. The time information contributes differently over the observation period, although similarities are found in certain periods.
Originality/value – This paper introduces variation decomposition into the research on housing market variations and develops a model based on this new decomposition method.
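
The decomposition into regional, home-market and time information can be loosely illustrated with a two-way fixed-effects panel regression, as sketched below; the cities, coefficients and data are simulated, and the specification is an assumption, not the paper's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy panel of quarterly price growth for illustrative cities; all values are simulated.
rng = np.random.default_rng(0)
cities = ["Sydney", "Melbourne", "Perth", "Hobart"]
city_effect = {"Sydney": 1.2, "Melbourne": 1.0, "Perth": 0.6, "Hobart": 0.4}  # regional information
rows = []
for city in cities:
    for period in range(40):
        home_market = rng.normal(1.0, 0.3)                      # common home-market covariate
        growth = (city_effect[city] + 0.05 * period             # regional + time information
                  + 0.8 * home_market + rng.normal(0.0, 0.2))   # true elasticity = 0.8
        rows.append({"city": city, "period": period,
                     "home_market": home_market, "growth": growth})
df = pd.DataFrame(rows)

# Two-way fixed-effects regression: C(city) captures regional information,
# C(period) captures time information, home_market carries the common elasticity.
fit = smf.ols("growth ~ C(city) + C(period) + home_market", data=df).fit()
print("estimated home-market elasticity:", round(fit.params["home_market"], 3))
```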

Relevance: 30.00%

Abstract:

The prevailing approach to the problem of the ontological status of mathematical entities such as numbers and sets is to ask in what sense it is legitimate to ascribe a reference to abstract singular terms; those expressions of our language which, taken at face value, denote abstract objects. On the basis of this approach, neo-Fregean Abstractionists such as Hale and Wright have argued that abstract singular terms may be taken to effect genuine reference towards objects, whereas nominalists such as Field have asserted that these apparent ontological commitments should not be taken at face value. In this article I argue for an intermediate position which upholds the legitimacy of ascribing a reference to abstract singular terms in an attenuated sense relative to the more robust ascription of reference applicable to names denoting concrete entities. In so doing I seek to clear up some confusions regarding the ramifications of such a thin notion of reference for ontological claims about mathematical objects.