895 results for FEC using Reed-Solomon-like codes
Abstract:
A simple, rapid, and low-cost coulometric method for the direct detection of glyphosate and aminomethylphosphonic acid (AMPA) in water samples, using anion-exchange chromatography and coulometric detection with a copper electrode, is presented. Under optimized conditions, the limits of detection (LODs) (S/N = 3) were 0.038 µg ml⁻¹ for glyphosate and 0.24 µg ml⁻¹ for AMPA, without any preconcentration. The calibration curves were linear and showed excellent correlation coefficients. The method was successfully applied to the determination of glyphosate and AMPA in water samples without any extraction, clean-up, or preconcentration step. No interferents were found in the water, and recoveries were practically 100%.
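As a side note on the S/N = 3 detection limits quoted above, such an LOD is commonly estimated from the standard deviation of blank replicates and the slope of the linear calibration curve. The sketch below is a hypothetical illustration of that calculation (all numbers invented, not the paper's data):

```python
import numpy as np

# Hypothetical S/N = 3 detection-limit estimate:
# LOD = 3 * (std. dev. of blank replicates) / (calibration slope).
blank_signal = np.array([0.21, 0.19, 0.22, 0.20, 0.18, 0.21])  # blank replicates (a.u.)
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])                     # standards (ug/ml)
signal = np.array([0.9, 4.4, 8.7, 17.1, 43.2])                 # detector response (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)  # linear calibration curve
noise = blank_signal.std(ddof=1)
lod = 3 * noise / slope
print(f"slope={slope:.2f}, LOD={lod:.3f} ug/ml")
```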
Abstract:
The ubiquity and power of personal digital devices make them attractive tools for STEM instructors who would like to stimulate active learning. These devices offer both abundant pedagogical opportunities and worrisome challenges. We will discuss our two years of experience in using mobile devices to teach biology in a community college setting, as well as our observations on the best ways to organize digital-based activities to facilitate student active learning.
Abstract:
Automated timetabling and scheduling is one of the hardest problem areas, because of the constraints that must be satisfied to obtain a feasible and optimized schedule; the problem has already been proven NP-complete [1]. The basic idea behind this study is to investigate the performance of a Genetic Algorithm on a general scheduling problem under predefined constraints, to check the validity of the results, and then to carry out a comparative analysis with other available approaches such as Tabu search, simulated annealing, direct and indirect heuristics [2], and expert systems. It is observed that the Genetic Algorithm is a good solution technique for solving such problems, and the subsequent analysis supports this argument. The program is written in C++, and the analysis is done by varying several parameters.
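The study's program is in C++ and its constraint set is not given; purely as a sketch of the technique the abstract names, here is a minimal genetic algorithm for a toy timetabling problem, with invented events, rooms, and conflict constraints:

```python
import random

SLOTS, ROOMS = 5, 3                   # toy timetable: 5 time slots x 3 rooms
EVENTS = list(range(10))              # 10 events to place
CONFLICTS = {(0, 1), (2, 3), (4, 5)}  # event pairs that must not share a slot

def random_schedule():
    return [(random.randrange(SLOTS), random.randrange(ROOMS)) for _ in EVENTS]

def fitness(sched):
    penalty = 0
    # hard constraint: no two events in the same slot and room
    penalty += len(sched) - len(set(sched))
    # hard constraint: conflicting events must be in different slots
    penalty += sum(1 for a, b in CONFLICTS if sched[a][0] == sched[b][0])
    return -penalty                   # higher is better; 0 means feasible

def crossover(p1, p2):
    cut = random.randrange(1, len(EVENTS))
    return p1[:cut] + p2[cut:]

def mutate(sched, rate=0.1):
    return [(random.randrange(SLOTS), random.randrange(ROOMS))
            if random.random() < rate else gene for gene in sched]

pop = [random_schedule() for _ in range(50)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == 0:
        break                         # feasible schedule found
    elite = pop[:10]
    pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                   for _ in range(40)]
print("best penalty:", -fitness(pop[0]))
```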
Abstract:
The motivation for this thesis work is the need to improve the reliability of equipment and the quality of service to railway passengers, as well as the requirement for cost-effective and efficient condition-maintenance management in rail transportation. This thesis develops a fusion of several machine vision analysis methods to achieve high performance in the automation of wooden rail track inspection.

Condition monitoring in rail transport is done manually by a human operator, relying on inference and assumptions to reach conclusions. Condition monitoring allows maintenance to be scheduled, or other actions to be taken, before a failure occurs. Manual or automated condition monitoring of materials in fields of public transportation such as railways, aerial navigation, and traffic safety, where safety is of prime importance, requires non-destructive testing (NDT).

In general, wooden railway sleeper inspection is done manually by a human operator who moves along the rail, gathering information by visual and auditory analysis to examine sleepers for cracks. Inspectors working on the lines visually judge the quality of each sleeper. Manual inspection requires considerable effort, is error-prone, and is made difficult even for a human operator by frequent changes in the inspected material. In this project a machine vision system modeled on the manual visual analysis is developed, using digital cameras and image-processing software to perform similar inspections. The system classifies the condition of the material by examining individual pixels of images, processing them, and drawing conclusions with the assistance of knowledge bases and extracted features.

A pattern recognition approach is developed from the methodological knowledge of the manual procedure, using a non-destructive testing method to identify the flaws missed in manual condition monitoring of sleepers. A test vehicle is designed to capture sleeper images in a manner similar to visual inspection by a human operator, and the captured images of the wooden sleepers provide the raw data for the pattern recognition approach. The data from the NDT method are further processed and appropriate features are extracted; the aim of this data collection is to achieve reliable, high-accuracy classification. A key idea is to use an unsupervised classifier based on the extracted features to discriminate the condition of wooden sleepers into either good or bad; a self-organising map (SOM) is used as the classifier.

To achieve greater integration, the data collected by the machine vision system were combined through fusion, examined at two levels: sensor-level fusion and feature-level fusion. As the goal was to reduce human error in classifying rail sleepers as good or bad, the results obtained by feature-level fusion, compared with the actual classification, were satisfactory.
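The thesis uses a self-organising map as the unsupervised good/bad classifier; the exact features are not given. Below is a minimal sketch with invented stand-in features (with only two units and no neighbourhood term, the map degenerates to online k-means, which is enough to show the idea):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for sleeper feature vectors (e.g. crack-related texture
# features): two loose clusters playing the roles of "good" and "bad".
good = rng.normal(0.2, 0.05, size=(50, 4))
bad = rng.normal(0.8, 0.05, size=(50, 4))
data = np.vstack([good, bad])

# Two-unit self-organising map: one prototype per condition class.
weights = rng.random((2, 4))

def bmu(x):
    """Index of the best-matching unit (closest prototype)."""
    return np.argmin(np.linalg.norm(weights - x, axis=1))

for epoch in range(100):
    lr = 0.5 * (1 - epoch / 100)              # decaying learning rate
    for x in rng.permutation(data):
        w = bmu(x)
        weights[w] += lr * (x - weights[w])   # pull winner toward sample

labels = np.array([bmu(x) for x in data])
print("cluster sizes:", np.bincount(labels))  # ideally 50 / 50
```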
Abstract:
Parkinson's disease (PD) is a degenerative illness whose cardinal symptoms include rigidity, tremor, and slowness of movement. In addition to its widely recognized effects, PD can have a profound effect on speech and voice. The speech symptoms most commonly demonstrated by patients with PD are reduced vocal loudness, monopitch, disruptions of voice quality, and an abnormally fast rate of speech; this cluster of symptoms is often termed hypokinetic dysarthria. The disease can be difficult to diagnose accurately, especially in its early stages; for this reason, automatic techniques based on Artificial Intelligence should increase diagnostic accuracy and help doctors make better decisions. The aim of this thesis work is to predict PD from audio files collected from various patients. The audio files are preprocessed to obtain the features; the preprocessed data contain 23 attributes and 195 instances, with on average six voice recordings per person. Using a data-compression technique such as the Discrete Cosine Transform (DCT), the number of instances can be reduced. After data compression, attribute selection is done using several of WEKA's built-in methods, such as ChiSquared, GainRatio, and InfoGain; after identifying the important attributes, we evaluate them one by one using stepwise regression. Based on the selected attributes, we process the data in WEKA using a cost-sensitive classifier with various algorithms such as Multi-Pass LVQ, Logistic Model Tree (LMT), and K-Star. The classification results average about 80%; using the selected features, approximately 95% classification accuracy for PD is achieved. This shows that, using the audio dataset, PD can be predicted with a high level of accuracy.
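As a hedged sketch of the pipeline the abstract describes (DCT compression, attribute selection, classification), here is a version using open-source stand-ins rather than the WEKA tools it names (f_classif and logistic regression in place of ChiSquared/GainRatio and LVQ/LMT/K-Star), on synthetic data of the same shape:

```python
import numpy as np
from scipy.fft import dct
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Stand-in for the voice dataset: 195 instances x 22 features + PD label.
X = rng.normal(size=(195, 22))
y = rng.integers(0, 2, size=195)

# Compress one subject's ~6 recordings: keep low-order DCT coefficients.
recordings = rng.normal(size=(6, 22))
compressed = dct(recordings, axis=0, norm='ortho')[:2]  # keep 2 of 6 rows
print("compressed shape:", compressed.shape)

# Select the most informative attributes, then classify.
X_sel = SelectKBest(f_classif, k=10).fit_transform(X, y)
scores = cross_val_score(LogisticRegression(max_iter=1000), X_sel, y, cv=5)
print("mean CV accuracy:", scores.mean().round(3))
```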
Abstract:
The rapid development of data transfer over the internet has made it easier to send data accurately and quickly to a destination. There are many transmission media for transferring data, such as e-mail; at the same time, valuable information can be modified and misused through hacking. To transfer data securely to the destination without any modification, there are approaches such as cryptography and steganography. This paper deals with image steganography, along with different security issues and a general overview of cryptography, steganography, and digital watermarking approaches. The problem of copyright violation of multimedia data has increased due to the enormous growth of computer networks, which provide fast and error-free transmission of unauthorized, possibly manipulated copies of multimedia information. To be effective for copyright protection, a digital watermark must be robust: difficult to remove from the object in which it is embedded, despite a variety of possible attacks. To send the message safely and securely, we use invisible watermarking that embeds the message with the LSB (Least Significant Bit) steganographic technique. The standard LSB technique embeds the message in every pixel; the contribution of the proposed watermarking scheme is to embed the message only in the image edges. Even if an attacker knows that the system uses the LSB technique, the correct message cannot be recovered. To make the system robust and secure, we add a cryptographic algorithm, the Vigenère square, so the message is transmitted as ciphertext, an added advantage of the proposed system. The standard Vigenère square works only with lower-case or upper-case letters; the proposed algorithm extends the Vigenère square with digits, so the key can combine characters and numbers. With these modifications to the existing algorithm and the combination of cryptography and steganography, we develop a secure and strong watermarking method. The performance of this watermarking scheme has been analyzed by evaluating the robustness of the algorithm with PSNR (Peak Signal to Noise Ratio) and MSE (Mean Square Error) against image quality for a large amount of data. The proposed scheme achieves a high PSNR of 89 dB with a small MSE of 0.0017. The proposed watermarking system thus appears secure and robust for hiding information in any digital system, because it combines the properties of both steganography and cryptography.
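A minimal sketch of the two ingredients the paper combines: a Vigenère square extended with digits, and LSB embedding restricted to an edge mask, with MSE/PSNR computed as in the evaluation. The 64x64 image, key, and mask below are invented; a real system would derive the mask from an edge detector such as Sobel or Canny:

```python
import numpy as np

# Extended Vigenere square over letters plus digits, so keys may mix
# characters and numbers (the extension the paper describes).
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
M = len(ALPHABET)

def vigenere(text, key, sign=1):
    out = []
    for i, ch in enumerate(text):
        t = ALPHABET.index(ch)
        k = ALPHABET.index(key[i % len(key)])
        out.append(ALPHABET[(t + sign * k) % M])
    return "".join(out)

cipher = vigenere("MEETAT9PM", "KEY42")
assert vigenere(cipher, "KEY42", sign=-1) == "MEETAT9PM"  # decrypts back

# LSB embedding of the ciphertext bits, restricted to "edge" pixels
# (a hypothetical boolean mask stands in for real edge detection).
img = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
edge_mask = np.zeros_like(img, bool)
edge_mask[::7, :] = True                      # stand-in for detected edges

bits = np.unpackbits(np.frombuffer(cipher.encode(), dtype=np.uint8))
slots = np.flatnonzero(edge_mask)[: bits.size]
stego = img.copy().ravel()
stego[slots] = (stego[slots] & 0xFE) | bits   # overwrite least significant bit
stego = stego.reshape(img.shape)

mse = np.mean((img.astype(float) - stego.astype(float)) ** 2)
psnr = 10 * np.log10(255**2 / mse) if mse else float("inf")
print(f"MSE={mse:.4f}, PSNR={psnr:.1f} dB")
```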
Abstract:
Background. Through a national policy agreement, over 167 million Euros will be invested in the Swedish National Quality Registries (NQRs) between 2012 and 2016. One of the policy agreement's intentions is to increase the use of NQR data for quality improvement (QI). However, the evidence is fragmented as to how the use of medical registries and the like leads to quality improvement, and little is known about non-clinical use. The aim was therefore to investigate the perspectives of Swedish politicians and administrators on quality improvement based on national registry data. Methods. Politicians and administrators from four county councils were interviewed. A qualitative content analysis guided by the Consolidated Framework for Implementation Research (CFIR) was performed. Results. The politicians' and administrators' perspectives on the use of NQR data for quality improvement were mainly assigned to three of the five CFIR domains. In the domain of intervention characteristics, data reliability and access within a reasonable time were not considered entirely satisfactory, making it difficult for the politico-administrative leaderships to initiate, monitor, and support timely QI efforts. Still, politicians and administrators trusted the idea of using the NQRs as a base for quality improvement. In the domain of inner setting, the organizational structures were not sufficiently developed to utilize the advantages of the NQRs, and readiness for implementation appeared inadequate for two reasons. Firstly, the resources for data analysis and quality improvement were not considered sufficient at the politico-administrative or clinical level. Secondly, deficiencies in leadership engagement at multiple levels were described, and there was a lack of consensus on the politicians' role and level of involvement. Regarding the domain of outer setting, there was a lack of communication and cooperation between the county councils and the national NQR organizations. Conclusions. The Swedish experiences show that a government-supported national system of well-funded, well-managed, and reputable national quality registries needs favorable local politico-administrative conditions to be used for quality improvement; such conditions are not yet in place, according to local politicians and administrators.
Abstract:
In this paper we present an approach to information flow analysis for a family of languages. We start with a simple imperative language. We present an information flow analysis using a flow logic. The paper contains detailed correctness proofs for this analysis. We next extend the analysis to a restricted form of Idealised Algol, a call-by-value higher-order extension of the simple imperative language (the key restriction being the lack of recursion). The paper concludes with a discussion of further extensions, including a probabilistic extension of Idealised Algol.
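The flows such an analysis must reject are easy to illustrate. The toy checker below is not the paper's flow logic, only a sketch of the underlying idea: variables carry security levels, and an assignment is rejected when the expression's level, or the program-counter level inside a branch on secret data, exceeds the target's level:

```python
# Minimal sketch of explicit/implicit flow checking in a tiny
# imperative language (illustrative; not the paper's flow logic).
LOW, HIGH = 0, 1
levels = {"l": LOW, "h": HIGH}

def check_assign(target, expr_vars, pc=LOW):
    """Reject if the expression's level or the program-counter level
    exceeds the target's level (pc models implicit flow in branches)."""
    expr_level = max([levels[v] for v in expr_vars] + [pc])
    if expr_level > levels[target]:
        raise ValueError(f"insecure flow into '{target}'")

for desc, args in [
    ("h := l        (Low into High)", ("h", ["l"], LOW)),
    ("l := h        (explicit flow)", ("l", ["h"], LOW)),
    ("if h: l := 1  (implicit flow)", ("l", [], HIGH)),
]:
    try:
        check_assign(*args)
        print(desc, "-> accepted")
    except ValueError:
        print(desc, "-> rejected")
```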
Abstract:
The idea of organizing a cooperative market on Waterville Main Street was proposed by Aime Schwartz in the fall of 2008. The Co-op would entail an open market located on Main Street providing fresh, local produce and crafts to town locals. Through shorter delivery distances and agreements with local farmers, the Co-op should theoretically offer consumers lower prices on produce than can be found in conventional grocery stores, as well as an opportunity to support local agriculture. One of the tasks involved in organizing the Co-op is to source all of the produce from among the hundreds of farmers located in Maine. The purpose of this project is to show how Geographic Information System (GIS) tools can be used to help the Co-op and other businesses a) locate nearby farms that carry desired produce and products, and b) determine which farms are closest to the business site. Using GIS for this purpose makes sourcing produce suppliers easier and more efficient and reduces the workload on business planners. GIS Network Analyst is a tool that provides network-based spatial analysis and can be used in conjunction with traditional GIS technologies to determine not only the geometric distance between points but also the distance over existing networks (such as roads). We used Network Analyst to find the closest produce suppliers to the Co-op for specific produce items and to compute how far away they are over existing roads. This enables business planners to sort potential suppliers by distance before contacting individual farmers, allowing for more efficient use of their time and a faster planning process.
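ArcGIS Network Analyst is proprietary, so as a hypothetical open-source illustration of the same closest-facility idea (invented road network, farms, and products), network distance rather than straight-line distance can be computed with Dijkstra's algorithm:

```python
import networkx as nx

# Toy road network: edges weighted by road distance in km.
roads = nx.Graph()
roads.add_weighted_edges_from([
    ("coop", "A", 4.0), ("A", "farm1", 3.5),
    ("coop", "B", 6.0), ("B", "farm2", 2.5),
    ("A", "B", 2.0), ("B", "farm3", 9.0),
])

farms = {"farm1": {"kale", "eggs"}, "farm2": {"kale"}, "farm3": {"apples"}}
want = "kale"

# Distance over the road network (not straight-line) to each candidate.
dist = nx.single_source_dijkstra_path_length(roads, "coop", weight="weight")
candidates = sorted((dist[f], f) for f, goods in farms.items() if want in goods)
print("closest supplier of", want, "->", candidates[0])
```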
Abstract:
A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model using grey numbers is presented. With the grey numbers technique, uncertainty is characterized by an interval: once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers, whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated so that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of observed discharge values while being at the same time as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. However, these times can be significantly reduced by a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual rainfall-runoff grey model is calibrated, and the uncertainty bands obtained downstream of both the calibration and the validation processes are compared with those obtained using a well-established approach for characterizing uncertainty, such as GLUE. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
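The core mechanism, propagating interval-valued (grey) parameters through the model so that the output becomes a band, can be sketched in a few lines. The linear-reservoir step and all numbers below are invented, not the paper's model:

```python
# Grey number as a closed interval with interval arithmetic.
class Grey:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Grey(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Grey(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo:.2f}, {self.hi:.2f}]"

# Toy linear-reservoir step Q = k * S with a grey recession parameter k;
# the envelope of the grey discharges is the uncertainty band.
# (Storage depletion is omitted to keep the sketch short.)
k = Grey(0.18, 0.22)            # calibrated grey parameter
storage = Grey(40.0, 40.0)      # crisp initial storage as a degenerate grey
for t, rain in enumerate([5.0, 0.0, 12.0]):
    storage = storage + Grey(rain, rain)
    print(f"t={t}: grey discharge {k * storage}")
```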
Brazilian international and inter-state trade flows: an exploratory analysis using the gravity model
Abstract:
Recent efforts toward a world with freer trade, such as the WTO/GATT or regional Preferential Trade Agreements (PTAs), were put in doubt after McCallum's (1995) finding of a large border effect between US states and Canadian provinces. Since then, there has been a great deal of research on this topic employing the gravity equation. This dissertation has two goals. The first is to review comprehensively the recent literature on the gravity equation, including its usages, econometric specifications, and the efforts to provide it with microeconomic foundations. The second is the estimation of the Brazilian border effect (or 'home-bias trade puzzle') using inter-state and international trade flow data, with a pooled cross-section Tobit model. The lowest border effect estimated was 15, which implies that Brazilian states trade among themselves 15 times more than they trade with foreign countries. Further research using industry-disaggregated data is needed to qualify the estimated border effect with respect to which part of it can be attributed to actual trade costs and which part is the outcome of the endogenous location problem of the firm.
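To make the reported number concrete: in a log-linear gravity equation with a home dummy for intra-national flows, the border effect is the exponential of the dummy's coefficient. The sketch below recovers an effect of about 15 from synthetic data by OLS (the dissertation uses a Tobit model to handle zero flows; OLS is only a stand-in here, and all data are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Synthetic gravity data:
# ln T = b0 + b1*ln(Yi) + b2*ln(Yj) - b3*ln(D) + delta*home + noise,
# where home = 1 for intra-national (state-to-state) flows.
lnYi, lnYj = rng.normal(10, 1, n), rng.normal(10, 1, n)
lnD = rng.normal(6, 0.5, n)
home = rng.integers(0, 2, n).astype(float)
true_delta = np.log(15)      # coefficient consistent with the reported effect
lnT = 1 + 0.9*lnYi + 0.8*lnYj - 1.1*lnD + true_delta*home + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), lnYi, lnYj, lnD, home])
beta, *_ = np.linalg.lstsq(X, lnT, rcond=None)
print("estimated border effect:", np.exp(beta[4]).round(1))  # ~15
```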
Abstract:
AIRES, Kelson R. T.; ARAÚJO, Hélder J.; MEDEIROS, Adelardo A. D. Plane Detection Using Affine Homography. In: CONGRESSO BRASILEIRO DE AUTOMÁTICA, 2008, Juiz de Fora, MG: Anais... do CBA 2008.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The present study investigates the Internet as technological interaction in the school environment, as a resource in the teaching-learning process. It aims to discuss the lack of synchronicity between proposals for educational access to the Internet and the types of access and interaction practised by youngsters. For this research I used a qualitative, descriptive, and explanatory approach focused on a group of youngsters from eleven to fifteen years of age at a catholic school belonging to a group of private schools in the city of Natal, Rio Grande do Norte state. The methodology combines focus groups and participant observation, discourse analysis, and ethnography, considering facts and data of pedagogical practice concerning the theme, as well as an attempt to know the youngsters' everyday life at school and their relationship with juvenile cultures. Two moments of the focus group are recognized: the first related to Internet use as technological interaction; the second concerning the way the Internet is problematized as technological interaction in classroom learning. In its contact with the youngsters, the study discusses the concepts of media environments, culture, identity, network, consumption, and citizenship. It recognizes that it is relevant for the school to consider the Internet a pedagogical tool, directed not just at research but above all as a learning environment and a means of constructing learning collaboratively. It points out the need to bring school and media environments closer together, re-evaluating pedagogical practice and offering a new evaluation proposal (self-evaluation). It suggests a renewal of the teacher's pedagogical practice in the classroom and in the use of the Internet, valuing the connection between technological interaction and communication as elements that motivate students' construction of learning and their effective participation in decisions involving citizenship. It prioritizes educational work directed at establishing a dialogic relationship between codes, learning, and contents, leading to mastery of new findings in the media environment and enabling the development of abilities and performances directed at the recognition and critical consumption of media information.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)