946 results for Automatic merging of lexical resources


Relevance:

100.00%

Publisher:

Abstract:

One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specific topics (e.g., geometry) drastically reduces the advantages of using e-learning environments on a larger scale, as usually happens in Brazil. This paper describes an algorithm, and a tool based on it, designed for the authoring and automatic checking of geometry exercises. The algorithm dynamically compares the distances between the geometric objects of the student's solution and those of the template solution provided by the author of the exercise. Each solution is a geometric construction, which is considered a function receiving geometric objects (input) and returning other geometric objects (output). Thus, for a given problem, if we know one function (construction) that solves the problem, we can compare it to any other function to check whether they are equivalent. Two functions are equivalent if, and only if, they produce the same output when the same input is applied. If the student's solution is equivalent to the template solution, then we consider the student's solution correct. Our software provides both authoring and checking tools that work directly on the Internet, together with learning management systems. These tools are implemented in the dynamic geometry software iGeom, which has been used in a geometry course since 2004 and has a successful track record in the classroom. Empowered with these new features, iGeom simplifies teachers' tasks, handles non-trivial student solutions and helps to increase student motivation by providing feedback in real time. (c) 2008 Elsevier Ltd. All rights reserved.
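
The equivalence test at the core of this algorithm can be illustrated with a minimal sketch; below, a Python toy version in which constructions are modeled as plain functions and equivalence is approximated by sampling random inputs (the function names and the midpoint example are hypothetical, not taken from iGeom):

```python
import random
import math

def constructions_equivalent(f, g, n_samples=100, tol=1e-9):
    """Probabilistic test of the paper's criterion: two constructions
    are equivalent iff they produce the same output for the same input."""
    for _ in range(n_samples):
        # Sample a random input configuration of two points.
        p = (random.uniform(-10, 10), random.uniform(-10, 10))
        q = (random.uniform(-10, 10), random.uniform(-10, 10))
        # Compare the distance between the two constructions' outputs.
        if math.dist(f(p, q), g(p, q)) > tol:
            return False  # outputs diverge: not equivalent
    return True

# Template solution: midpoint via the usual formula.
def template(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

# Student solution: midpoint computed as p + (q - p)/2.
def student(p, q):
    return (p[0] + (q[0] - p[0]) / 2, p[1] + (q[1] - p[1]) / 2)

print(constructions_equivalent(template, student))  # True
```

Random sampling cannot prove equivalence in general, but it mirrors the definition used in the abstract: same input, same output.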

Relevance:

100.00%

Publisher:

Abstract:

This project is based on Artificial Intelligence (AI) and Digital Image Processing (IP) for automatic condition monitoring of sleepers in the railway track. Rail inspection is a very important task in railway maintenance, both for traffic safety and for preventing dangerous situations. Monitoring railway track infrastructure is an important aspect in which periodical inspection of the rail rolling plane is required. Up to the present day, inspection of the railroad has been carried out manually by trained personnel: a human operator walks along the railway track searching for sleeper anomalies. This way of monitoring is no longer acceptable because of its slowness and subjectivity. Hence, it is desirable to automate such intuitive human skills through the development of more robust and reliable testing methods. Images of wooden sleepers have been used as data for this project. The aim of this project is to present a vision-based technique for inspecting railway sleepers (wooden planks under the railway track) by automatic interpretation of Non-Destructive Test (NDT) data, using AI techniques to determine the results of the inspection.
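
As a purely illustrative sketch of a vision-based condition indicator (not the pipeline actually used in this project), one could score each sleeper image by its edge content, assuming cracked or degraded wood produces more high-frequency texture than sound wood; the score and threshold below are hypothetical:

```python
import numpy as np
from scipy import ndimage

def sleeper_condition_score(image: np.ndarray) -> float:
    """Toy indicator for a grayscale sleeper image: mean Sobel
    edge magnitude as a proxy for crack-like texture."""
    sx = ndimage.sobel(image.astype(float), axis=0)
    sy = ndimage.sobel(image.astype(float), axis=1)
    return float(np.hypot(sx, sy).mean())

def classify(image, threshold=25.0):
    # The threshold would be calibrated on labeled sleeper images.
    return "defective" if sleeper_condition_score(image) > threshold else "sound"

image = np.random.rand(64, 64) * 255   # placeholder for a sleeper photo
print(classify(image))
```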

Relevance:

100.00%

Publisher:

Abstract:

Condition monitoring of wooden railway sleepers is generally carried out by visual inspection and, if necessary, some impact acoustic examination is carried out intuitively by skilled personnel. In this work, a pattern recognition solution has been proposed to automate the process and achieve robust results. The study presents a comparison of several pattern recognition techniques together with various nonstationary feature extraction techniques for the classification of impact acoustic emissions. Pattern classifiers such as the multilayer perceptron, learning vector quantization and Gaussian mixture models are combined with nonstationary feature extraction techniques such as the Short Time Fourier Transform, Continuous Wavelet Transform, Discrete Wavelet Transform and Wigner-Ville Distribution. Given the presence of several different feature extraction and classification techniques, data fusion has been investigated, mainly on two levels: the feature level and the classifier level. Fusion at the feature level demonstrated the best results, with an overall accuracy of 82% when compared to the human operator.
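
A minimal sketch of feature-level fusion, assuming Python with SciPy, PyWavelets and scikit-learn, and using only two of the transforms mentioned (STFT and DWT); the feature summaries, network size and placeholder data are illustrative choices, not the study's actual configuration:

```python
import numpy as np
import pywt                              # PyWavelets, for DWT features
from scipy.signal import stft
from sklearn.neural_network import MLPClassifier

def stft_features(x, fs=44100):
    # Short Time Fourier Transform: summarize the magnitude spectrogram.
    _, _, Z = stft(x, fs=fs, nperseg=256)
    mag = np.abs(Z)
    return np.array([mag.mean(), mag.std(), mag.max()])

def dwt_features(x, wavelet="db4", level=4):
    # Discrete Wavelet Transform: energy per decomposition level.
    coeffs = pywt.wavedec(x, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def fused_features(x):
    # Feature-level fusion: concatenate vectors from both transforms
    # into a single input for one classifier.
    return np.concatenate([stft_features(x), dwt_features(x)])

# Placeholder impact-acoustic signals and sound/defective labels.
X_raw = [np.random.randn(4096) for _ in range(20)]
y = np.random.randint(0, 2, 20)
X = np.vstack([fused_features(x) for x in X_raw])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```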

Relevance:

100.00%

Publisher:

Abstract:

Managers’ conceptions of the importance of human resources are essential for creating ‘attractive workplaces’. This paper examines an intervention method aimed at creating insight among managers in small and medium-sized enterprises (SMEs) concerning the potential of human resources. The intervention method is called Focus Group Attractive Work (FGAW) and was conducted at eight enterprises in Sweden. Based on the analysis, it is concluded that the intervention method seems to be thought-provoking and to facilitate changes in managers’ conceptions of the importance of human resources, albeit to different degrees. 

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes some forms of linguistic manipulation in Japanese newspapers when reporting on North Korea and its nuclear tests. The focus lies on lexical ambiguity in headlines and journalists' voices in the body of the articles, which results in manipulation of the minds of the readers. The study is based on a corpus of nine articles from two of Japan's largest newspapers, Yomiuri Online and Asahi Shimbun Digital. The linguistic phenomena that contribute to creating manipulation are divided into Short Term Memory impact or Long Term Memory impact, and examples are discussed under each of the categories. The main results of the study are that headlines in Japanese newspapers do not make use of an ambiguous, double-grounded structure. However, the articles are filled with explicit and implied attitudes, as well as attributed material from people of high social status, which suggests that manipulation of the long term memory is a tool used in Japanese media.

Relevance:

100.00%

Publisher:

Abstract:

Sam Atmore, Associate Director of Media Resources, reading Earth by David Brin (PS3552 R4825 E27 1990). Added to gallery 11/1/2010.

Relevance:

100.00%

Publisher:

Abstract:

Lucas (2000) estimates that the US welfare costs of inflation are around 1% of GDP. This measurement is consistent with a specific distorting channel in terms of the Bailey triangle under the demand for monetary base schedule (outside money): the displacement of resources from the production of consumption goods to household transaction time à la Baumol. Here, we also consider several new types of distortions in the manufacturing and banking industries. Our new evidence shows that both banks and firms demand special occupational employments to avoid the inflation tax. We define the concept of "the float labor": the occupational employments that are affected by the inflation rates. More administrative workers are hired relative to blue-collar workers for producing consumption goods. This new phenomenon makes the manufacturing industry more roundabout. To take this new stylized fact and others into account, we redo at the same time both "The model 5: A Banking Sector - 2" formulated by Lucas (1993) and "The Competitive Banking System" proposed by Yoshino (1993). This modelling allows us to better characterize the new types of misallocations. We find that the maximum value of the resources wasted by the US economy occurred in the years 1980-81, after the second oil shock. For these years, we estimate the excess resources allocated to every specific distorting channel: i) the US commercial banks spent additional resources of around 2% of GDP; ii) between 2.4% and 4.1% of GDP was used for the firm floating time; and iii) between 3.1% and 4.5% of GDP was allocated to household transaction time. The Bailey triangle under the demand for monetary base schedule represented around 1% of GDP, which is consistent with Lucas (2000). We estimate that the total US welfare costs of inflation were around 10% of GDP in terms of consumption goods foregone. The big difference between our results and Lucas (2000) is mainly due to the Harberger triangle in the market for loans (inside money), which forms part of the household transaction time, of the firm float labor and of the distortion in the banking industry. This triangle arises due to the widening interest rate spread in the presence of a distorting inflation tax and under a fractional reserve system. The Harberger triangle can represent 80% of the total welfare costs of inflation, while the remaining percentage is split almost equally between the Bailey triangle and the resources used for bank services. Finally, we formulate several theorems in terms of the optimal non-neutral monetary policy so as to compare with classical monetary theory.
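
For reference, the Bailey-triangle welfare measure underlying the 1%-of-GDP figure can be written as the area under the inverse money demand schedule net of the seigniorage rectangle; this is the standard textbook formulation, not necessarily the exact specification used in the paper:

```latex
% Bailey triangle: welfare cost at nominal interest rate i, where
% m(x) is the real demand for monetary base at nominal rate x.
w(i) = \int_{0}^{i} m(x)\,dx \;-\; i\,m(i)
```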

Relevance:

100.00%

Publisher:

Abstract:

This work aims to develop an intelligent system for detecting workpiece burn in the tangential surface grinding process, using a multilayer perceptron neural network trained to generalize the process and, consequently, obtain the burn threshold. In general, the occurrence of burn in the grinding process can be detected by the DPO and FKS parameters. However, these parameters are not effective under the machining conditions used in this work. The acoustic emission and electric power signals of the grinding wheel drive motor are the input variables, and the output variable is the occurrence of burn. In the experimental work, one type of steel (quenched ABNT 1045) and one type of grinding wheel, called TARGA, model ART 3TG80.3 NVHB, were employed.
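
A minimal sketch of such a classifier, assuming Python with scikit-learn; the two input features and the placeholder readings are hypothetical stand-ins for the measured acoustic emission and motor power signals, not data from the experiments:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Each sample: [acoustic emission level, wheel drive motor power].
# Placeholder values; real data would come from grinding tests
# on quenched ABNT 1045 steel.
X = np.array([[0.2, 1.1], [0.3, 1.2], [1.4, 2.8], [1.6, 3.1]])
y = np.array([0, 0, 1, 1])   # 0 = no burn, 1 = burn occurred

# Small multilayer perceptron trained to separate the two classes.
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
mlp.fit(X, y)
print(mlp.predict([[1.5, 2.9]]))  # expected: [1], i.e., burn predicted
```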

Relevance:

100.00%

Publisher:

Abstract:

Introduction: The objective of this study was to analyze the spatial behavior of the occurrence of trachoma cases detected in the City of Bauru, State of São Paulo, Brazil, in 2006, in order to use the information collected to set priority areas for the optimization of health resources. Methods: The trachoma cases identified in 2006 were georeferenced. The data evaluated were: the schools attended by the studied trachoma cases, data from the 2000 Census, census tract, type of housing, water supply conditions, distribution of income and level of education of household heads. Descriptive spatial analysis and kernel density estimates were produced using the Google Earth® and TerraView® software. Each area was studied by interpolating the density surfaces of the events to facilitate the recognition of clusters. Results: Of the 66 cases detected, only one (1.5%) was not a resident of the city's outskirts. A positive association was detected between trachoma cases and the percentage of household heads with income below three minimum wages and schooling under eight years of education. Conclusions: The spatial distribution of trachoma cases coincided with the areas of greatest social inequality in the City of Bauru. The micro-areas identified are those that should be prioritized in the rationalization of health resources. There is the possibility of using the detected trachoma cases as a performance indicator for micro-area priority health programs.
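
A minimal sketch of the kernel density step, assuming Python with SciPy; the coordinates below are random placeholders for the 66 georeferenced cases, not the study's data:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Placeholder case locations as (longitude, latitude) columns,
# loosely centered on Bauru for illustration only.
cases = np.random.randn(2, 66) * 0.01 + np.array([[-49.06], [-22.31]])

kde = gaussian_kde(cases)            # Gaussian kernel density estimate

# Evaluate the density on a grid to reveal clusters / priority areas.
lon = np.linspace(cases[0].min(), cases[0].max(), 100)
lat = np.linspace(cases[1].min(), cases[1].max(), 100)
grid = np.vstack([g.ravel() for g in np.meshgrid(lon, lat)])
density = kde(grid).reshape(100, 100)

# The densest cell marks the strongest cluster of cases.
print("highest-density cell:", np.unravel_index(density.argmax(), density.shape))
```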

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

This work deals with noise removal by the use of an edge-preserving method whose parameters are automatically estimated, for any application, by simply providing information about the standard deviation of the noise level we wish to eliminate. The desired noiseless image u(x), in a Partial Differential Equation based model, can be viewed as the solution of an evolutionary differential equation u_t(x) = F(u_{xx}, u_x, u, x, t), which means that the true solution will be reached when t → ∞. In practical applications we should stop the time t at some moment during this evolutionary process. This work presents a sufficient condition, related to the time t and to the standard deviation σ of the noise we wish to remove, which gives a constant T such that u(x, T) is a good approximation of u(x). The approach focuses on edge preservation during the noise elimination process as its main characteristic. The balance between edge points and interior points is carried out by a function g which depends on the initial noisy image u(x, t_0), the standard deviation of the noise we want to eliminate and a constant k. The estimation of the parameter k is also presented in this work, thus making the proposed model automatic. The model's feasibility and the choice of the optimal time scale are made evident throughout the various experimental results.
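
A representative member of this family of models (not necessarily the exact equation of this work) is the Perona-Malik anisotropic diffusion equation, where an edge-stopping function g plays exactly the balancing role described above:

```latex
% Perona-Malik anisotropic diffusion: diffusion is suppressed where
% the gradient |\nabla u| is large (edges), controlled by a constant k.
u_t = \nabla\cdot\big( g(|\nabla u|)\,\nabla u \big),
\qquad
g(s) = \frac{1}{1 + (s/k)^{2}},
\qquad u(x, 0) = u_0(x)
```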

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)