109 results for Vector error correction model
in Queensland University of Technology - ePrints Archive
Abstract:
As order dependencies between process tasks can become complex, it is easy to make mistakes in process model design, especially behavioral ones such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with the correction of the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Via this technique, a number of process model alternatives are identified that resolve one or more errors in the original model. The technique is implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
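As a rough illustration of the search heuristic mentioned above, the following is a minimal, generic simulated-annealing loop of the kind the abstract alludes to. It is not the authors' implementation: count_errors and random_edit are hypothetical placeholders standing in for a soundness checker (e.g. a deadlock counter) and a small random structural edit to the process model.

    import math
    import random

    def anneal(model, count_errors, random_edit,
               t_start=1.0, t_end=0.01, cooling=0.95, steps_per_temp=50):
        # Keep track of the best (lowest-error) model seen so far.
        current = best = model
        current_cost = best_cost = count_errors(model)
        t = t_start
        while t > t_end and best_cost > 0:
            for _ in range(steps_per_temp):
                candidate = random_edit(current)
                cost = count_errors(candidate)
                # Always accept an improvement; accept a worse candidate with a
                # probability that shrinks as the temperature drops.
                if cost < current_cost or random.random() < math.exp((current_cost - cost) / t):
                    current, current_cost = candidate, cost
                    if cost < best_cost:
                        best, best_cost = candidate, cost
            t *= cooling
        return best, best_cost

Running the loop repeatedly from the same input model would yield a set of alternative corrected models, since different random edit sequences typically end in different sound variants.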
Abstract:
Error correction is perhaps the most widely used method for responding to student writing. While various studies have investigated the effectiveness of providing error correction, there has been relatively little research incorporating teachers' beliefs, practices, and students' preferences in written error correction. The current study adopted features of an ethnographic research design in order to explore the beliefs and practices of ESL teachers, and investigate the preferences of L2 students regarding written error correction in the context of a language institute situated in the Brisbane metropolitan district. In this study, two ESL teachers and two groups of adult intermediate L2 students were interviewed and observed. The beliefs and practices of the teachers were elicited through interviews and classroom observations. The preferences of L2 students were elicited through focus group interviews. Responses of the participants were encoded and analysed. Results of the teacher interviews showed that teachers believe that providing written error correction has advantages and disadvantages. Teachers believe that providing written error correction helps students improve their proof-reading skills in order to revise their writing more efficiently. However, results also indicate that providing written error correction is very time consuming. Furthermore, teachers prefer to provide explicit written feedback strategies during the early stages of the language course, and move to a more implicit strategy of providing written error correction in order to facilitate language learning. On the other hand, results of the focus group interviews suggest that students regard their teachers' practice of written error correction as important in helping them locate their errors and revise their writing. However, students also feel that the process of providing written error correction is time consuming. Nevertheless, students want and expect their teachers to provide written feedback because they believe that the benefits they gain from receiving feedback on their writing outweigh the apparent disadvantages of their teachers' written error correction strategies.
Abstract:
This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah [1989. The dynamic effects of aggregate demand and supply disturbances. The American Economic Review 79, 655–673], and shows that structural equations with known permanent shocks cannot contain error correction terms, thereby freeing up the latter to be used as instruments in estimating their parameters. The approach is illustrated by a re-examination of the identification schemes used by Wickens and Motto [2001. Estimating shocks and impulse response functions. Journal of Applied Econometrics 16, 371–387], Shapiro and Watson [1988. Sources of business cycle fluctuations. NBER Macroeconomics Annual 3, 111–148], King et al. [1991. Stochastic trends and economic fluctuations. American Economic Review 81, 819–840], Gali [1992. How well does the ISLM model fit postwar US data? Quarterly Journal of Economics 107, 709–735; 1999. Technology, employment, and the business cycle: Do technology shocks explain aggregate fluctuations? American Economic Review 89, 249–271] and Fisher [2006. The dynamic effects of neutral and investment-specific technology shocks. Journal of Political Economy 114, 413–451].
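For context, a generic vector error correction model (VECM) can be written as below; the notation is illustrative and not taken from the paper. The error correction terms are the lagged combinations \beta' y_{t-1}; the paper's point is that structural equations driven by known permanent shocks cannot contain these terms, which frees them up to serve as instruments when estimating those equations.

    % Generic VECM; notation is illustrative only.
    \Delta y_t = \alpha \beta' y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i \, \Delta y_{t-i} + \varepsilon_t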
Abstract:
The world we live in is well labeled for the benefit of humans, but to date robots have made little use of this resource. In this paper we describe a system that allows robots to read and interpret visible text and use it to understand the content of the scene. We use a generative probabilistic model that explains spotted text in terms of arbitrary search terms. This allows the robot to understand the underlying function of the scene it is looking at, such as whether it is a bank or a restaurant. We describe the text spotting engine at the heart of our system, which is able to detect and parse wild text in images, describe the generative model, and present results from images obtained with a robot in a busy city setting.
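As a toy illustration of the idea (not the authors' model), a naive generative scorer over place categories given spotted words could look like the sketch below; the categories, priors and word likelihoods are invented.

    from math import log

    def score_categories(spotted_words, priors, word_likelihoods, smoothing=1e-6):
        # P(category | words) is proportional to P(category) * prod_w P(w | category);
        # work in log space and smooth unseen words.
        scores = {}
        for category, prior in priors.items():
            logp = log(prior)
            for word in spotted_words:
                logp += log(word_likelihoods.get(category, {}).get(word, smoothing))
            scores[category] = logp
        return max(scores, key=scores.get), scores

    # Hypothetical example: words read off a shopfront.
    priors = {"bank": 0.5, "restaurant": 0.5}
    word_likelihoods = {
        "bank":       {"atm": 0.2, "savings": 0.1, "menu": 0.001},
        "restaurant": {"menu": 0.2, "pizza": 0.15, "atm": 0.005},
    }
    print(score_categories(["menu", "pizza"], priors, word_likelihoods))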
Abstract:
At Eurocrypt’04, Freedman, Nissim and Pinkas introduced the fuzzy private matching problem, defined as follows: given two parties, each holding a set of vectors with T integer components, fuzzy private matching is to securely test whether each vector of one set matches any vector of the other set in at least t components, where t < T. In the conclusion of their paper, they asked whether it was possible to design a fuzzy private matching protocol without incurring a communication complexity with the binomial factor (T choose t). We answer their question in the affirmative by presenting a protocol based on homomorphic encryption, combined with the novel notion of a share-hiding error-correcting secret sharing scheme, which we show how to implement with efficient decoding using interleaved Reed-Solomon codes. This scheme may be of independent interest. Our protocol is provably secure against passive adversaries and has better efficiency than previous protocols for certain parameter values.
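The matching predicate at the core of the problem is simple to state in code; the sketch below only expresses the functionality that the protocol computes privately (the values are illustrative).

    def fuzzy_match(u, v, t):
        # Two length-T vectors "fuzzily match" when they agree on at least t < T components.
        if len(u) != len(v):
            raise ValueError("vectors must have the same length T")
        agreements = sum(1 for a, b in zip(u, v) if a == b)
        return agreements >= t

    # Hypothetical example with T = 5 and threshold t = 3.
    print(fuzzy_match([1, 2, 3, 4, 5], [1, 2, 0, 4, 9], t=3))  # True: three components agree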
Abstract:
Integration of biometrics is considered an attractive solution to the issues associated with password-based human authentication, as well as to the secure storage and release of cryptographic keys, which is one of the critical issues in modern cryptography. However, the widespread adoption of bio-cryptographic solutions is somewhat restricted by the fuzziness associated with biometric measurements. Therefore, error control mechanisms must be adopted to ensure that the fuzziness of biometric inputs can be sufficiently countered. In this paper, we outline the existing techniques used in bio-cryptography and explain how they are deployed in different types of solutions. Finally, we elaborate on the important factors to consider when choosing an appropriate error correction mechanism for a particular biometric-based solution.
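As one concrete illustration of how an error-correcting code can absorb biometric noise, the sketch below uses a fuzzy-commitment-style construction with a 3-bit repetition code; it is not necessarily one of the techniques surveyed in the paper, and the key, templates and noise pattern are invented.

    def encode(key_bits, n=3):
        # Repetition code: each key bit becomes n identical code bits.
        return [b for b in key_bits for _ in range(n)]

    def decode(code_bits, n=3):
        # Majority vote over each block of n bits.
        return [int(sum(code_bits[i:i + n]) > n // 2) for i in range(0, len(code_bits), n)]

    def commit(key_bits, template_bits, n=3):
        # XOR-mask the codeword with the enrolment biometric template.
        return [c ^ t for c, t in zip(encode(key_bits, n), template_bits)]

    def release(commitment, query_bits, n=3):
        # XOR with the (noisy) query template; the code corrects the residual errors.
        return decode([c ^ q for c, q in zip(commitment, query_bits)], n)

    key = [1, 0, 1]
    template = [0, 1, 1, 0, 0, 1, 1, 1, 0]
    query    = [0, 1, 1, 0, 1, 1, 1, 1, 0]   # one bit differs from the enrolment template
    print(release(commit(key, template), query) == key)  # True: the repetition code absorbs the error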
Abstract:
An unstructured mesh finite volume discretisation method for simulating diffusion in anisotropic media in two-dimensional space is discussed. This technique is considered as an extension of the fully implicit hybrid control-volume finite-element method and it retains the local continuity of the flux at the control volume faces. A least squares function reconstruction technique together with a new flux decomposition strategy is used to obtain an accurate flux approximation at the control volume face, ensuring that the overall accuracy of the spatial discretisation maintains second order. This paper highlights that the new technique coincides with the traditional shape function technique when the correction term is neglected and that it significantly increases the accuracy of the previous linear scheme on coarse meshes when applied to media that exhibit very strong to extreme anisotropy ratios. It is concluded that the method can be used on both regular and irregular meshes, and appears independent of the mesh quality.
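As an illustration of the least squares reconstruction idea (a generic sketch, not the paper's scheme), the code below fits the gradient at a control-volume centre from neighbouring cell-centre values; the sample points and field are invented.

    import numpy as np

    def least_squares_gradient(x0, u0, neighbour_points, neighbour_values):
        # Fit the gradient g that best explains u_i - u_0 ≈ g . (x_i - x_0)
        # over all neighbouring cell centres, in the least-squares sense.
        dx = np.asarray(neighbour_points, dtype=float) - np.asarray(x0, dtype=float)
        du = np.asarray(neighbour_values, dtype=float) - u0
        grad, *_ = np.linalg.lstsq(dx, du, rcond=None)
        return grad

    # Hypothetical 2-D example: values sampled from u(x, y) = 2x + 3y.
    x0, u0 = (0.0, 0.0), 0.0
    pts = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (-1.0, 0.5)]
    vals = [2.0, 3.0, 5.0, -0.5]
    print(least_squares_gradient(x0, u0, pts, vals))  # approximately [2. 3.]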
Abstract:
This study investigates the short-run dynamics and long-run equilibrium relationship between residential electricity demand and the factors influencing demand - per capita income, price of electricity, price of kerosene oil and price of liquefied petroleum gas - using annual data for Sri Lanka for the period 1960-2007. The study uses unit root, cointegration and error-correction models. The long-run demand elasticities with respect to income, own price and the price of kerosene oil (a substitute) were estimated to be 0.78, -0.62 and 0.14 respectively. The short-run elasticities for the same variables were estimated to be 0.32, -0.16 and 0.10 respectively. Liquefied petroleum (LP) gas is a substitute for electricity only in the short run, with an elasticity of 0.09. The main findings of the paper support the following: (1) increasing the price of electricity is not the most effective tool to reduce electricity consumption; (2) existing subsidies on electricity consumption can be removed without reducing government revenue; (3) the long-run income elasticity of demand shows that any future increase in household incomes is likely to significantly increase the demand for electricity; and (4) any power generation plans which consider only current per capita consumption and population growth should be revised, taking into account potential future income increases, in order to avoid power shortages in the country.
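A single-equation error-correction specification of the kind such studies estimate can be sketched as follows; the variable names, substitute prices and lag structure are illustrative, not the study's exact equations.

    % Long-run (cointegrating) relation: the beta_i are the long-run elasticities
    % (electricity demand E, income Y, own price P^e, kerosene price P^k).
    \ln E_t = \beta_0 + \beta_1 \ln Y_t + \beta_2 \ln P^{e}_t + \beta_3 \ln P^{k}_t + u_t

    % Short-run dynamics: the gamma_i are short-run elasticities and lambda is the
    % speed of adjustment on the lagged error-correction term \hat{u}_{t-1}.
    \Delta \ln E_t = \gamma_0 + \gamma_1 \Delta \ln Y_t + \gamma_2 \Delta \ln P^{e}_t
                   + \gamma_3 \Delta \ln P^{k}_t + \lambda \hat{u}_{t-1} + \epsilon_t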
Abstract:
Secret-sharing schemes describe methods to securely share a secret among a group of participants. A properly constructed secret-sharing scheme guarantees that the share belonging to one participant does not reveal anything about the shares of others or even the secret itself. Besides their obvious use for distributing a secret, secret-sharing schemes have also been used in secure multi-party computations and in redundant residue number systems for error correction codes. In this paper, we propose that the secret-sharing scheme be used as a primitive in a Network-based Intrusion Detection System (NIDS) to detect attacks in encrypted networks. Encrypted networks such as Virtual Private Networks (VPNs) fully encrypt network traffic, which can include both malicious and non-malicious traffic. Traditional NIDS cannot monitor encrypted traffic. Our work uses a combination of Shamir's secret-sharing scheme and randomised network proxies to enable a traditional NIDS to function normally in a VPN environment. In this paper, we introduce a novel protocol that utilises a secret-sharing scheme to detect attacks in encrypted networks.
Abstract:
Secret-sharing schemes describe methods to securely share a secret among a group of participants. A properly constructed secret-sharing scheme guarantees that the share belonging to one participant does not reveal anything about the shares of others or even the secret itself. Besides being used to distribute a secret, secret-sharing schemes have also been used in secure multi-party computations and in redundant residue number systems for error correction codes. In this paper, we propose that the secret-sharing scheme be used as a primitive in a Network-based Intrusion Detection System (NIDS) to detect attacks in encrypted networks. Encrypted networks such as Virtual Private Networks (VPNs) fully encrypt network traffic, which can include both malicious and non-malicious traffic. Traditional NIDS cannot monitor such encrypted traffic. We therefore describe how our work uses a combination of Shamir's secret-sharing scheme and randomised network proxies to enable a traditional NIDS to function normally in a VPN environment.
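As an illustration of the primitive both of the preceding abstracts build on, the sketch below is a bare-bones version of Shamir's secret-sharing scheme over a prime field. The field size, parameters and use of Python's random module are simplifications for illustration, not a secure implementation.

    import random

    PRIME = 2**61 - 1  # a Mersenne prime, large enough for this illustration

    def split(secret, n, t):
        # The secret is the constant term of a random degree-(t-1) polynomial;
        # the shares are its evaluations at x = 1..n.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term from any t shares.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME  # modular inverse (Python 3.8+)
        return secret

    shares = split(123456789, n=5, t=3)
    print(reconstruct(shares[:3]) == 123456789)  # any 3 of the 5 shares suffice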