942 results for errors and erasures decoding


Relevance:

100.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
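To make the FOLCSL idea above concrete, here is a minimal Python sketch of the kind of checker the translator could synthesize: it reads an event trace and verifies a single first-order invariant. The event format and the invariant are hypothetical, since the abstract does not give FOLCSL's concrete syntax.

```python
# Minimal sketch of a checker that the FOLCSL translator might synthesize.
# The event format and the invariant are hypothetical illustrations.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Event:
    time: int        # simulation timestamp
    kind: str        # e.g. "issue", "commit"
    tag: int         # identifier linking related events

def check_issue_before_commit(trace: Iterable[Event]) -> bool:
    """Invariant (informally): forall x. commit(x) implies an earlier issue(x)."""
    issued = set()
    for ev in sorted(trace, key=lambda e: e.time):
        if ev.kind == "issue":
            issued.add(ev.tag)
        elif ev.kind == "commit" and ev.tag not in issued:
            print(f"Invariant violated at t={ev.time}: commit({ev.tag}) without issue")
            return False
    return True

if __name__ == "__main__":
    trace = [Event(0, "issue", 1), Event(3, "commit", 1), Event(5, "commit", 2)]
    print(check_issue_before_commit(trace))   # False: commit(2) was never issued
```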

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, technological advancements have pushed industry and research towards the automation of various processes. Automation reduces costs and improves product quality, and for this reason companies are pushing research to investigate new technologies. The agriculture industry has always looked towards automating its processes, from product processing to storage. In recent years, the automation of the harvesting and cultivation phases has also become attractive, driven by advances in autonomous driving. Nevertheless, ADAS systems alone are not enough: merging different technologies is the path to full automation of agricultural processes. For example, sensors that estimate the physical and chemical properties of products can be used to evaluate the ripeness of fruit. The fusion of these technologies therefore plays a key role in industrial process automation. This dissertation treats both ADAS systems and sensors for precision agriculture. Several measurement procedures for characterizing commercial 3D LiDARs are proposed and tested to address the growing need for comparison tools; axial errors and transversal errors are investigated. Moreover, a measurement method and setup for evaluating the effect of fog on 3D LiDARs is proposed. Each measurement procedure presented has been tested, and the results highlight the versatility and effectiveness of the proposed approaches. Regarding precision agriculture sensors, a measurement approach for estimating the moisture content and density of crops directly in the field is presented. The approach employs a near-infrared (NIR) spectrometer together with Partial Least Squares (PLS) statistical analysis. The approach and the model are described together with a first laboratory prototype used to evaluate the NIRS approach. Finally, a prototype for in-field analysis is built and tested. The test results are promising, showing that the proposed approach is suitable for moisture content and density estimation.
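As an illustration of the NIRS-plus-PLS idea, the following sketch fits a Partial Least Squares regression to stand-in spectra with scikit-learn; the data, number of latent variables, and preprocessing are placeholders, not the dissertation's calibration.

```python
# Hedged sketch of the NIRS + Partial Least Squares approach described above.
# Spectra, wavelengths and component count are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 256                     # e.g. NIR absorbance channels
X = rng.normal(size=(n_samples, n_wavelengths))         # stand-in spectra
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.1, size=n_samples)  # stand-in moisture values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5)   # latent variables would be chosen by cross-validation in practice
pls.fit(X_tr, y_tr)
print("R^2 on held-out spectra:", pls.score(X_te, y_te))
```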

Relevance:

100.00%

Publisher:

Abstract:

Vision systems are powerful tools playing an increasingly important role in modern industry for detecting errors and maintaining product standards. With the wider availability of affordable industrial cameras, computer vision algorithms are increasingly applied to monitor industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster reconfiguration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect possible anomalies, with execution times compatible with the production specifications. Further constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of all the trials conducted to reach the final performance. Transfer learning, which alleviates the need for a large amount of training data, combined with data augmentation methods based on the generation of synthetic images, was used to effectively increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, designed for vial counting and discrepancy detection, respectively. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.
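A minimal sketch of the transfer-learning-plus-augmentation recipe described above, using a pretrained torchvision backbone with a new classification head; the backbone, augmentations, and class labels are illustrative assumptions rather than the thesis setup.

```python
# Transfer-learning sketch: frozen pretrained backbone, new head, simple augmentation.
# All choices here are illustrative, not the configuration used in the thesis.
import torch.nn as nn
from torchvision import models, transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomAffine(degrees=5, translate=(0.02, 0.02)),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
])

model = models.resnet18(weights="IMAGENET1K_V1")   # ImageNet-pretrained backbone
for p in model.parameters():
    p.requires_grad = False                        # freeze pretrained features
model.fc = nn.Linear(model.fc.in_features, 2)      # new head, e.g. "pack OK" vs "discrepancy"
# Only model.fc (and optionally the last block) would then be fine-tuned on the annotated images.
```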

Relevance:

100.00%

Publisher:

Abstract:

In the last few years there has been great development of technologies such as quantum computers and quantum communication systems, due to their enormous potential and the growing number of applications. However, physical qubits suffer from many nonidealities, such as measurement errors and decoherence, that cause failures in the quantum computation. This work shows how concepts from classical information theory can be exploited to build quantum error-correcting codes by adding redundant qubits. In particular, the threshold theorem states that the decoding failure rate can be made arbitrarily small, provided the physical error rate is below a given accuracy threshold. The focus is on codes belonging to the family of topological codes, such as the toric, planar and XZZX surface codes. First, they are compared from a theoretical point of view in order to show their advantages and disadvantages. The algorithms behind the minimum-weight perfect matching decoder, the most popular decoder for such codes, are then presented. The last section is dedicated to analysing the performance of these topological codes under different error channel models, showing interesting results. In particular, while the error correction capability of surface codes decreases in the presence of biased errors, XZZX codes possess intrinsic symmetries that allow them to improve their performance when one kind of error occurs more frequently than the others.
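The matching step of the minimum-weight perfect matching decoder mentioned above can be sketched as follows: syndrome defects become graph nodes, edge weights approximate the length of the shortest error chain, and a minimum-weight perfect matching pairs them up. The defect coordinates and the Manhattan-distance metric are toy assumptions; production decoders use the actual lattice geometry and dedicated Blossom implementations.

```python
# Toy minimum-weight perfect matching over syndrome defects.
# Negating weights turns networkx's max-weight matching into a min-weight one.
import networkx as nx

defects = [(0, 1), (0, 4), (3, 1), (3, 3)]          # positions of flagged stabilizers

def distance(a, b):
    # Manhattan distance as a stand-in for the shortest error-chain length
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

g = nx.Graph()
for i, u in enumerate(defects):
    for j, v in enumerate(defects):
        if i < j:
            g.add_edge(i, j, weight=-distance(u, v))

pairs = nx.max_weight_matching(g, maxcardinality=True)
print(pairs)   # e.g. {(0, 1), (2, 3)} -> apply Pauli corrections along each matched path
```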

Relevance:

100.00%

Publisher:

Abstract:

In the field of industrial automation, there is an increasing need for optimal control systems with low tracking errors and low power and energy consumption. The motors considered here are mainly Permanent Magnet Synchronous Motors (PMSMs), controlled by three different controllers: a position controller, a speed controller, and a current controller. In this thesis we therefore tune the gains of the first two controllers, using the TwinCAT 3 software to find the best set of parameters. Starting from the default parameters recommended by TwinCAT, two main methods were used and then compared: the Ziegler-Nichols method, a tabular method, and advanced tuning, TwinCAT's software-based auto-tuning method. In order to analyse which set of parameters was best, several experiments were performed for each case using the Motion Control Function Blocks. Moreover, some machines, such as large robotic arms, have vibration problems. To analyse them in detail, the Bode Plot tool was used, which highlights the frequencies at which resonance and anti-resonance peaks occur. This tool also makes it easier to decide which filters to apply, and where, in order to improve the control.
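For reference, the Ziegler-Nichols table mentioned above maps the ultimate gain Ku and oscillation period Tu observed at the stability limit to controller gains; the sketch below uses the classic closed-loop rules with illustrative numbers, not the TwinCAT axis parameters from the thesis.

```python
# Classic closed-loop Ziegler-Nichols tuning rules.
# Ku and Tu are the ultimate gain and period found at the stability limit.
def ziegler_nichols(ku: float, tu: float, controller: str = "PI"):
    if controller == "P":
        return {"Kp": 0.50 * ku}
    if controller == "PI":
        return {"Kp": 0.45 * ku, "Ti": tu / 1.2}
    if controller == "PID":
        return {"Kp": 0.60 * ku, "Ti": tu / 2.0, "Td": tu / 8.0}
    raise ValueError(controller)

print(ziegler_nichols(ku=8.0, tu=0.05, controller="PID"))
# {'Kp': 4.8, 'Ti': 0.025, 'Td': 0.00625}  (illustrative values only)
```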

Relevance:

100.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance:

100.00%

Publisher:

Abstract:

In 2003, Francisco de Oliveira published an article entitled "O Ornitorrinco" ("The Platypus"), in which he offered critical considerations on the socio-political conjuncture of that historical moment. The article is permeated by a parallel between Darwinian evolutionism and the author's view of contemporary Brazilian society. However, in drawing this analogy he commits a series of theoretical errors concerning evolutionary theory. These errors consist, to a large extent, in an improper substitution of what became known as Social Darwinism for neo-Darwinian theory as understood by its current proponents. The present work identifies these errors and contextualizes them within neo-Darwinian theory. In addition, we provide a historical overview of the formation of evolutionary thought to emphasize that the association between biology and Social Darwinism is more complex than is generally assumed.

Relevance:

100.00%

Publisher:

Abstract:

The VISTA near-infrared survey of the Magellanic System (VMC) will provide deep YJK(s) photometry reaching stars at the oldest turn-off point throughout the Magellanic Clouds (MCs). As part of the preparation for the survey, we aim to assess the accuracy in the star formation history (SFH) that can be expected from VMC data, in particular for the Large Magellanic Cloud (LMC). To this aim, we first simulate VMC images containing not only the LMC stellar populations but also the foreground Milky Way (MW) stars and background galaxies. The simulations cover the whole range of density of LMC field stars. We then perform aperture photometry over these simulated images, assess the expected levels of photometric errors and incompleteness, and apply the classical technique of SFH recovery based on the reconstruction of colour-magnitude diagrams (CMDs) via the minimisation of a chi-squared-like statistic. We verify that the foreground MW stars are accurately recovered by the minimisation algorithms, whereas the background galaxies can be largely eliminated from the CMD analysis owing to their particular colours and morphologies. We then evaluate the expected errors in the recovered star formation rate as a function of stellar age, SFR(t), starting from models with a known age-metallicity relation (AMR). It turns out that, for a given sky area, the random errors for ages older than ~0.4 Gyr seem to be independent of the crowding. This can be explained by a counterbalancing effect between the loss of stars from a decrease in the completeness and the gain of stars from an increase in the stellar density. For a spatial resolution of ~0.1 deg^2, the random errors in SFR(t) will be below 20% for this wide range of ages. On the other hand, due to the lower stellar statistics for stars younger than ~0.4 Gyr, the outer LMC regions will require larger areas to achieve the same level of accuracy in the SFR(t). If we consider the AMR as unknown, the SFH-recovery algorithm is able to accurately recover the input AMR, at the price of an increase in the random errors in the SFR(t) by a factor of about 2.5. Experiments of SFH recovery performed for varying distance modulus and reddening indicate that these parameters can be determined with (relative) accuracies of Delta(m-M)_0 ~ 0.02 mag and Delta E(B-V) ~ 0.01 mag for each individual field over the LMC. The propagation of these errors into the SFR(t) implies systematic errors below 30%. This level of accuracy in the SFR(t) can reveal significant imprints of the dynamical evolution of this unique and nearby stellar system, as well as possible signatures of the past interaction between the MCs and the MW.
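A heavily simplified sketch of the CMD-fitting step: the observed Hess diagram is modelled as a non-negative combination of partial models, one per age bin, and the fitted coefficients play the role of SFR(t). The matrices below are random placeholders, and a plain non-negative least-squares fit stands in for the chi-squared-like minimisation used in the paper.

```python
# Simplified SFH-recovery sketch: fit non-negative age-bin coefficients to a mock Hess diagram.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_cmd_bins, n_age_bins = 500, 12
partial_models = rng.poisson(5.0, size=(n_cmd_bins, n_age_bins)).astype(float)  # one column per age bin
true_sfr = rng.uniform(0.0, 2.0, size=n_age_bins)
observed = rng.poisson(partial_models @ true_sfr)        # mock observed Hess diagram

sfr_fit, residual = nnls(partial_models, observed.astype(float))
print("recovered SFR(t) per age bin:", np.round(sfr_fit, 2))
```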

Relevance:

100.00%

Publisher:

Abstract:

Using a sample of 68.3×10⁶ K_L → π⁰π⁰π⁰ decays collected in 1996-1999 by the KTeV (E832) experiment at Fermilab, we present a detailed study of the K_L → π⁰π⁰π⁰ Dalitz plot density. We report the first observation of interference from K_L → π⁺π⁻π⁰ decays in which the π⁺π⁻ pair rescatters to π⁰π⁰ in a final-state interaction. This rescattering effect is described by the Cabibbo-Isidori model, and it depends on the difference in pion scattering lengths between the isospin I=0 and I=2 states, a₀ − a₂. Using the Cabibbo-Isidori model, and fixing (a₀ − a₂)m_π⁺ = 0.268 ± 0.017 as measured by the CERN-NA48 collaboration, we present the first measurement of the K_L → π⁰π⁰π⁰ quadratic slope parameter that accounts for the rescattering effect: h₀₀₀ = (+0.59 ± 0.20(stat) ± 0.48(syst) ± 1.06(ext)) × 10⁻³, where the uncertainties are from data statistics, KTeV systematic errors, and external systematic errors. Fitting for both h₀₀₀ and a₀ − a₂, we find h₀₀₀ = (−2.09 ± 0.62(stat) ± 0.72(syst) ± 0.28(ext)) × 10⁻³ and m_π⁺(a₀ − a₂) = 0.215 ± 0.014(stat) ± 0.025(syst) ± 0.006(ext); our value for a₀ − a₂ is consistent with that from NA48.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a framework for building medical training applications using virtual reality, together with a tool that supports the class instantiation of this framework. The main purpose is to make it easier to build virtual reality applications in the medical training area, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiating the framework classes allows such tools to be implemented quickly, thus reducing errors and lowering costs thanks to the use of open-source tools. Using the instantiation tool, the process of building applications is fast and easy, so programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters of the chosen functionalities, as well as to store these parameters for future use. In order to verify the effectiveness of the framework, some case studies are presented.
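Purely as a hypothetical illustration of the kind of class instantiation such a tool performs, the sketch below assembles a training application from optional functionalities whose parameters can be edited and stored for reuse; all class and parameter names are invented for this example and do not come from the paper.

```python
# Hypothetical instantiation sketch: configure functionalities and persist their parameters.
import json
from dataclasses import dataclass, asdict, field

@dataclass
class Deformation:
    stiffness: float = 0.8

@dataclass
class CollisionDetection:
    method: str = "bounding-box"

@dataclass
class BiopsyTrainingApp:
    deformation: Deformation = field(default_factory=Deformation)
    collision: CollisionDetection = field(default_factory=CollisionDetection)
    stereoscopy: bool = True

    def save(self, path: str) -> None:
        with open(path, "w") as f:
            json.dump(asdict(self), f, indent=2)   # store parameters for future use

app = BiopsyTrainingApp(deformation=Deformation(stiffness=0.5))
app.save("biopsy_app.json")
```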

Relevance:

100.00%

Publisher:

Abstract:

In this study, we tested the hypothesis that interceptive movements are controlled on the basis of an expectancy of the time to target arrival. The study assessed the temporal errors and kinematics of interceptive movements towards a moving virtual target. The initial target velocity was kept unchanged in part of the trials; in the others it was decreased 300 ms before the due time of target arrival at the interception position, increasing the time to target arrival by 100 ms. Different probabilities of velocity decrease, ranging from 25 to 100%, were compared. The results revealed that while errors increased between probabilities of 25 and 75% for unchanged target velocity, the opposite relationship was observed for the target velocity decrease. Kinematic analysis indicated that movement timing adjustments to the target velocity decrease were made online. These results support the view that visuomotor integration in the interception of moving targets is mediated by an internal forward model whose weights can be flexibly adjusted according to the expectancy of the time to target arrival.
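As a toy illustration (not the authors' model) of the expectancy idea: the predicted time to target arrival can be thought of as a blend of the constant-velocity estimate and the estimate under a velocity decrease, weighted by the probability of the decrease in the block of trials. The numbers below only mirror the 300 ms / 100 ms figures given in the abstract.

```python
# Toy expectancy-weighted prediction of time to target arrival (illustrative only).
def expected_time_to_arrival(t_unchanged: float,
                             extra_if_decrease: float,
                             p_decrease: float) -> float:
    return (1.0 - p_decrease) * t_unchanged + p_decrease * (t_unchanged + extra_if_decrease)

for p in (0.25, 0.50, 0.75, 1.00):
    # 300 ms remain when the decrease may occur; the decrease adds 100 ms
    print(p, round(expected_time_to_arrival(0.300, 0.100, p), 3))
```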

Relevance:

100.00%

Publisher:

Abstract:

This investigation aimed to assess the extent to which memory from practice under a specific condition of target displacement modulates the temporal errors and movement timing of interceptive movements. We compared two groups practicing with certainty about future target velocity, one with unchanged target velocity and the other with a target velocity decrease. Following practice, both experimental groups were probed in the situations of unchanged target velocity and target velocity decrease, either under certainty or under uncertainty about target velocity. Results from practice showed a similar improvement of temporal accuracy in both groups, revealing that a target velocity decrease did not disturb temporal movement organization when it was fully predictable. Analysis of temporal errors in the probing trials indicated that both groups had higher timing accuracy with the velocity decrease than with unchanged velocity. An effect of practice was detected as increased temporal accuracy of the velocity-decrease group in situations of decreased velocity; a trend consistent with the expected effect of practice was observed for temporal errors in the unchanged-velocity group and, at a descriptive level, in movement initiation. An additional point of theoretical interest was the fast adaptation of both groups to a target velocity pattern different from that practiced. These points are discussed from the perspective of the integration of vision and motor control by means of an internal forward model of external motion.

Relevance:

100.00%

Publisher:

Abstract:

The use of peripheral vision to organize and reorganize an interceptive action was investigated in young adults. Temporal errors and kinematic variables were evaluated in the interception of a virtual moving target, in situations in which its initial velocity was kept unchanged or was unexpectedly decreased. Target approach was observed either through continuous visual pursuit (focal vision) or by keeping the visual focus at the origin of the trajectory or at the contact spot (peripheral vision). Results showed that visual focus at the contact spot led to temporal errors similar to those with focal vision, although with a distinct kinematic profile, whereas focus at the origin led to impoverished performance.

Relevance:

100.00%

Publisher:

Abstract:

There are quite a few typographical errors and omissions in Michael Lattke's ARC-funded German commentary on the Odes of Solomon. Most of them were discovered by Marianne Ehrhardt while she translated the manuscript into English for the Hermeneia commentary series: Lattke, Michael (2009). Odes of Solomon: A Commentary. Fortress Press, Minneapolis, MN.

Relevance:

100.00%

Publisher:

Abstract:

Objective: Ligation of the sphenopalatine artery is used to treat severe nasal haemorrhage. The objective of this study was to investigate the numerical variation of the sphenopalatine foramen (SPF), its relation to the ethmoidal crest of the palatine bone in the lateral nasal wall, its distance from the anterior nasal spine, and the angle between this distance and the anterior nasal floor. Material and Methods: Fifty-four hemiskulls were submitted to anatomical study and measurement using the Image Tool 3.0 software. Results: The SPF was single in 87% of the specimens, and more than one orifice was present in 13%. It was possible to establish a relation with the ethmoidal crest, which is a surgical reference for locating the SPF. The mean values of the measurements were significantly higher in the hemifaces than in the hemiskulls, ranging from 54 to 63 mm, and the angulation ranged from 20 to 32 degrees. Conclusions: In most of the specimens studied, the SPF was single and located in the superior nasal meatus. The distances measured suggest that these values can be used as references for guiding the endoscope during ligation or endonasal cauterization of the branches of the sphenopalatine artery, preventing possible errors and complications.