914 results for Errors and blunders, Literary.
Abstract:
The ability of neural networks to realize complex nonlinear functions makes them attractive for system identification. This paper describes a novel method that uses artificial neural networks to solve robust parameter estimation problems for nonlinear models with unknown-but-bounded errors and uncertainties. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. These parameters guarantee the network's convergence to its equilibrium points, and a solution of the robust estimation problem with unknown-but-bounded error corresponds to an equilibrium point of the network. Simulation results are presented to illustrate the proposed approach.
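The abstract does not give the energy function or the valid-subspace equations, so the following is only a minimal sketch of the kind of discrete-time Hopfield-style iteration involved, with an illustrative weight matrix `W` and bias `b` standing in for the internal parameters the paper computes.

```python
import numpy as np

# Minimal sketch of a Hopfield-style iteration, assuming a generic energy
# E(v) = -0.5 v^T W v - b^T v that is minimized by repeated activation updates.
# W and b stand in for the internal parameters that, in the paper, are obtained
# with the valid-subspace technique; the values below are illustrative only.
def hopfield_iterate(W, b, v0, n_iter=1000, tol=1e-9):
    v = v0.astype(float)
    for _ in range(n_iter):
        v_new = np.tanh(W @ v + b)           # smooth activation update
        if np.linalg.norm(v_new - v) < tol:  # reached an equilibrium point
            return v_new
        v = v_new
    return v

# Toy example: a symmetric W guarantees an associated energy function exists.
W = np.array([[0.0, 0.5], [0.5, 0.0]])
b = np.array([0.1, -0.2])
v_eq = hopfield_iterate(W, b, v0=np.zeros(2))
print(v_eq)  # in the paper's setting, the equilibrium point encodes the estimate
```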
Abstract:
New formulations, techniques and devices have made dental whitening safer and more effective. Despite this, whitening levels are still verified by visual comparison, an empirical, subjective method that is prone to errors and depends on individual interpretation. Normally, the whitening result is expressed as the shift between the initial and final color, taking as reference the shades of a color scale ordered from darkest to lightest. Although it is the most widely used scale, the ordering of the Vita Classical® (Vita) shade guide recommended by the manufacturer proves inadequate for evaluating whitening. Using digital images and the OER algorithm (ordering of the reference scale), developed specifically for ScanWhite©, the shades of the Vita Classical® scale were ordered. For this purpose, the mean values of the R, G and B color channels of the middle portion of the crowns were adopted as the reference for evaluation. The images were taken with a Sony Cyber-shot DSC F828 camera. The results of the computational ordering were compared with the sequence proposed by the manufacturer and with that obtained by visual evaluation, carried out by 10 volunteers under standardized illumination conditions. Statistical analysis demonstrated significant differences between the orderings.
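The abstract describes ordering shade-guide crowns from the mean R, G, B of their middle region, but does not specify the OER ranking criterion; the sketch below therefore sorts by a standard luminance of the mean RGB, which is only an illustrative assumption, not the ScanWhite© algorithm.

```python
import numpy as np

# Sketch: order shade-guide crowns from darkest to lightest using the mean
# R, G, B of each crown's middle region. Sorting by the Rec. 601 luminance of
# the mean RGB is an assumption; the actual OER criterion is not given here.
def mean_rgb(crown_img):
    """crown_img: HxWx3 uint8 array cropped to the middle portion of a crown."""
    return crown_img.reshape(-1, 3).mean(axis=0)

def order_scale(crown_images, labels):
    lum = []
    for img in crown_images:
        r, g, b = mean_rgb(img)
        lum.append(0.299 * r + 0.587 * g + 0.114 * b)  # perceived-lightness proxy
    order = np.argsort(lum)                            # darkest first
    return [labels[i] for i in order]

# usage (hypothetical crops and labels):
# order_scale([img_a1, img_b2, img_c4], ["A1", "B2", "C4"])
```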
Abstract:
Alternant codes over arbitrary finite commutative local rings with identity are constructed in terms of parity-check matrices. The derivation is based on the factorization of x^s − 1 over the unit group of an appropriate extension of the finite ring. An efficient decoding procedure, which makes use of the modified Berlekamp-Massey algorithm to correct errors and erasures, is presented. Furthermore, we address the construction of BCH codes over Z_m under the Lee metric.
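For reference, the classical parity-check matrix of an alternant code over a field has the form below; the paper's construction adapts this setting by taking the locators in the unit group of an extension of the finite local ring, so the display is only the familiar field-case shape, not the ring-theoretic derivation itself.

```latex
% Classical alternant parity-check matrix of redundancy r, with distinct
% locators \alpha_1,\dots,\alpha_n and nonzero multipliers y_1,\dots,y_n
% (field case shown; the paper takes the \alpha_j in the unit group of a
% suitable extension of the finite local ring).
H =
\begin{pmatrix}
 y_1               & y_2               & \cdots & y_n               \\
 y_1\alpha_1       & y_2\alpha_2       & \cdots & y_n\alpha_n       \\
 \vdots            & \vdots            &        & \vdots            \\
 y_1\alpha_1^{r-1} & y_2\alpha_2^{r-1} & \cdots & y_n\alpha_n^{r-1}
\end{pmatrix}
```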
Abstract:
This paper by R. E. Catai, E. C. Bianchi, P. R. de Águia and M. C. Alves reports the results of an analysis of roundness errors, residual stresses, and SEM micrographs of VC131 steel. The analysis involved workpieces ground with two types of cutting fluid: a synthetic cutting fluid and an emulsive oil. In this study, the cutting parameters were kept constant while the type of cutting fluid was varied. The amount of cutting fluid injected into the process was also varied, with the aim of identifying the ideal amount required to obtain good results without causing structural damage to the workpiece. The analyses of roundness errors, residual stresses, and SEM micrographs revealed that, of the two cutting fluids, the emulsive oil produced more favorable residual stresses owing to its greater lubricating power.
Abstract:
In this paper we discuss the potential of the new Galileo signals for pseudorange-based surveying and mapping in open areas under optimal reception conditions (open-sky scenarios) and suboptimal ones (multipath created by moderate to thick tree coverage). The paper reviews the main features of the Galileo E5 AltBOC and E1 CBOC signals; describes the simulation strategy, models and algorithms used to generate realistic E5 and E1 pseudoranges with and without multipath sources; describes the ionosphere modeling strategy, models and algorithms; and presents and discusses the expected positioning accuracy and precision results. According to the simulations performed, pseudoranges can be extracted from the Galileo E5 AltBOC signals with tracking errors (1-σ level) ranging from 0.02 m (open-sky scenarios) to 0.08 m (tree-covered scenarios), whereas for the Galileo E1 CBOC signals the tracking errors range from 0.25 m to 2.00 m, respectively. With these tracking errors and with explicit estimation of the ionosphere parameters, the simulations indicate real-time, open-sky, cm-level horizontal positioning precisions and dm-level vertical ones, and dm-level accuracies for both the horizontal and vertical position components.
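As a rough sanity check (not the paper's simulation), positioning precision scales approximately as the dilution of precision times the ranging error, σ_pos ≈ DOP × σ_range; with typical open-sky DOP values (assumed below, not taken from the paper), a 0.02 m E5 tracking error plausibly yields cm-level horizontal precision.

```python
# Back-of-envelope relation sigma_pos ~ DOP * sigma_range. The HDOP/VDOP
# values are typical open-sky assumptions, not figures from the paper.
def position_sigma(sigma_range_m, hdop=1.0, vdop=1.8):
    return hdop * sigma_range_m, vdop * sigma_range_m

h_e5, v_e5 = position_sigma(0.02)   # E5 AltBOC, open sky: ~2 cm / ~4 cm
h_e1, v_e1 = position_sigma(0.25)   # E1 CBOC, open sky: ~0.25 m / ~0.45 m
print(h_e5, v_e5, h_e1, v_e1)
```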
Abstract:
Butyltin (BTs) quantification in environmental matrices can be affected by interfering species found primarily in complex matrices, such as sediment and biota tissues. This study investigated matrix effects in analytical procedures for butyltin (TBT, DBT and MBT) quantification and speciation in sediments and in two fish tissues (gill and liver) by gas chromatography with pulsed flame photometric detection (GC-PFPD). Unlike the sediment samples, the tissues exhibited a significant matrix effect; thus, quantification should be performed using a calibration curve prepared over the matrix (matrix-matched calibration) to avoid quantification errors and loss of analytical accuracy. © 2013 Sociedade Brasileira de Química.
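A minimal sketch of matrix-matched calibration, the approach the abstract recommends: the curve is built from standards spiked into blank matrix extract so the matrix effect is embedded in the slope. All concentrations and peak areas below are hypothetical.

```python
import numpy as np

# Sketch of matrix-matched calibration for GC-PFPD quantification.
# Standards are spiked into blank tissue extract; values are hypothetical.
spiked_conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # ng g^-1 TBT added
peak_area   = np.array([12.0, 55.0, 98.0, 190.0, 375.0])  # detector response

slope, intercept = np.polyfit(spiked_conc, peak_area, 1)  # linear calibration

def quantify(sample_area):
    """Convert a sample's peak area to concentration using the matrix curve."""
    return (sample_area - intercept) / slope

print(quantify(140.0))  # estimated TBT concentration in the tissue extract
```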
Abstract:
To assist cattle producers in the transition from microsatellite (MS) to single nucleotide polymorphism (SNP) genotyping for parental verification, we previously devised an effective and inexpensive method to impute MS alleles from SNP haplotypes. While the reported method was verified with only a limited data set (N = 479) from Brown Swiss, Guernsey, Holstein, and Jersey cattle, some of the MS-SNP haplotype associations were concordant across these phylogenetically diverse breeds. This implied that some haplotypes predate modern breed formation and remain in strong linkage disequilibrium. To expand the utility of MS allele imputation across breeds, MS and SNP data from more than 8000 animals representing 39 breeds (Bos taurus and B. indicus) were used to predict 9410 SNP haplotypes, incorporating an average of 73 SNPs per haplotype, for which alleles from 12 MS markers could be accurately imputed. Approximately 25% of the MS-SNP haplotypes were present in multiple breeds (N = 2 to 36 breeds). These shared haplotypes allowed for MS imputation in breeds that were not represented in the reference population with only a small increase in Mendelian inheritance inconsistencies. Our reported reference haplotypes can be used for any cattle breed, and the reported methods can be applied to any species to aid the transition from MS to SNP genetic markers. While ~91% of the animals with imputed alleles for 12 MS markers had ≤1 Mendelian inheritance conflicts with their parents' reported MS genotypes, this figure was 96% for our reference animals, indicating potential errors in the reported MS genotypes. The workflow we suggest autocorrects for genotyping errors and rare haplotypes by MS genotyping animals whose imputed MS alleles fail parentage verification and then incorporating those animals into the reference dataset.
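A minimal sketch of the kind of Mendelian inheritance check used to count conflicts for codominant MS markers: an offspring must share at least one allele with each reported parent at every marker. The genotype data below are hypothetical.

```python
# Per-marker Mendelian consistency check for codominant microsatellite
# genotypes: an offspring must share at least one allele with each reported
# parent. Genotypes are (allele, allele) tuples; the data are hypothetical.
def mendelian_conflicts(offspring, dam, sire):
    """Return the number of markers at which the trio is inconsistent."""
    conflicts = 0
    for marker in offspring:
        o, d, s = set(offspring[marker]), set(dam[marker]), set(sire[marker])
        if not (o & d) or not (o & s):   # no shared allele with a parent
            conflicts += 1
    return conflicts

calf = {"BM1824": (178, 182), "BM2113": (125, 135)}
dam  = {"BM1824": (178, 180), "BM2113": (125, 127)}
sire = {"BM1824": (182, 190), "BM2113": (131, 139)}
print(mendelian_conflicts(calf, dam, sire))  # 1 conflict (at BM2113)
```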
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In this action research study, I investigated the careless errors made by my seventh-grade mathematics students on their homework and tests. Beyond analyzing the types of careless errors and the frequency at which they were made, I also analyzed my students’ attitudes toward reviewing their work before they turn it in and self-reflection about the quality of work that they were producing. I found that many students did not know how to review their test before turning it in; no one had ever taught them how to do so. However, when students were given tools to help them with this task, they were able to make strides towards reducing the number of careless errors that they made and began to turn in high quality work that demonstrated their understanding of the content that had been taught. As a result of this research, I plan to teach my students how to go back over their homework and tests before turning them in. I also intend to continue to use the tools that I have produced to encourage students to self-reflect on the work that they have done. Assessment is such an important piece of educating my students and the careless errors made on these assessments needed to be addressed.
Abstract:
The flow around a smooth, fixed circular cylinder over a large range of Reynolds numbers is considered in this paper. In order to investigate this canonical case, we perform CFD calculations and apply verification and validation (V&V) procedures to draw conclusions regarding the numerical error and, afterwards, assess the modeling errors and the capabilities of this (U)RANS method to solve the problem. Eight Reynolds numbers between Re = 10 and Re = 5 × 10^5 will be presented with, at least, four geometrically similar grids and five discretizations in time for each case (when unsteady), together with strict control of iterative and round-off errors, allowing a consistent verification analysis with uncertainty estimation. Two-dimensional RANS calculations, steady or unsteady, laminar or turbulent, are performed. The original 1994 k-ω SST turbulence model by Menter is used to model turbulence. The validation procedure is performed by comparing the numerical results with an extensive set of experimental results compiled from the literature. [DOI: 10.1115/1.4007571]
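The verification step estimates the discretization error from solutions on geometrically similar grids. Below is a minimal Richardson-extrapolation sketch with a constant refinement ratio; the paper applies a more elaborate V&V procedure with uncertainty estimation, and the numbers here are illustrative only.

```python
import math

# Richardson extrapolation from three geometrically similar grids
# (fine f1, medium f2, coarse f3) with constant refinement ratio r,
# assuming monotonic convergence. Values are illustrative.
def richardson(f1, f2, f3, r):
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order
    f_ext = f1 + (f1 - f2) / (r**p - 1.0)               # extrapolated value
    err_f1 = abs(f1 - f_ext)                            # error estimate, fine grid
    return p, f_ext, err_f1

p, f_ext, err = richardson(f1=1.195, f2=1.210, f3=1.240, r=2.0)
print(p, f_ext, err)   # observed order, extrapolated value, fine-grid error
```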
Abstract:
Background: Considering the increasing use of polymyxins to treat infections due to multidrug-resistant Gram-negative bacteria in many countries, it is important to evaluate different susceptibility testing methods for this class of antibiotic. Methods: The susceptibility of 109 carbapenem-resistant P. aeruginosa isolates to polymyxins was tested by comparing broth microdilution (reference method), disc diffusion, and Etest, using the new interpretative breakpoints of the Clinical and Laboratory Standards Institute. Results: Twenty-nine percent of the isolates belonged to an endemic clone, and these strains were therefore excluded from the analysis. Among the 78 strains evaluated, only one isolate was resistant to polymyxin B by the reference method (MIC: 8.0 μg/mL). Very major and major error rates of 1.2% and 11.5% were detected when comparing polymyxin B disc diffusion with broth microdilution (reference method). Agreement within one twofold dilution between Etest and broth microdilution was 33% for polymyxin B and 79.5% for colistin. One major error and 48.7% minor errors were found when comparing the polymyxin B Etest with broth microdilution, and only 6.4% minor errors with colistin. The concordance between Etest and broth microdilution (reference method) was 100% for colistin and 90% for polymyxin B. Conclusion: Resistance to polymyxins appears to be rare among hospital carbapenem-resistant P. aeruginosa isolates over a six-year period. Our results showed, using the new CLSI criteria, that disc diffusion susceptibility testing does not produce major errors (false-resistant results) for colistin; on the other hand, it showed a high frequency of minor errors and one very major error for polymyxin B. Etest presented better results for colistin than for polymyxin B. Until these results are reproduced with a large number of polymyxin-resistant P. aeruginosa isolates, susceptibility to polymyxins should be confirmed by a reference method.
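A minimal sketch of how very major, major and minor error rates are tallied when a test method is compared against the reference categories (very major = false susceptible, major = false resistant, minor = any disagreement involving the intermediate category). Expressing the rates over all isolates tested is an assumption that matches the figures quoted above; the category pairs are hypothetical.

```python
# Categorical error counting against the reference method (broth microdilution).
def error_rates(pairs):
    """pairs: list of (reference_category, test_category), each 'S', 'I' or 'R'."""
    n = len(pairs)
    very_major = sum(1 for ref, tst in pairs if ref == "R" and tst == "S")
    major      = sum(1 for ref, tst in pairs if ref == "S" and tst == "R")
    minor      = sum(1 for ref, tst in pairs
                     if ref != tst and "I" in (ref, tst))
    # Rates expressed over all isolates tested (an assumption; other
    # denominators restricted to resistant/susceptible isolates also exist).
    return {k: 100.0 * v / n for k, v in
            {"very_major": very_major, "major": major, "minor": minor}.items()}

print(error_rates([("S", "S"), ("S", "R"), ("R", "S"), ("S", "I")]))
```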
Abstract:
Marine soft-bottom systems show high variability across multiple spatial and temporal scales. Both natural and anthropogenic sources of disturbance act together in affecting benthic sedimentary characteristics and species distribution. The description of such spatial variability is required to understand the ecological processes behind it. However, in order to obtain better estimates of spatial patterns, methods that take into account the complexity of the sedimentary system are required. This PhD thesis aims to give a significant contribution both to improving the methodological approaches to the study of biological variability in soft-bottom habitats and to increasing knowledge of the effects that different processes (both natural and anthropogenic) can have on the benthic communities of a large area of the North Adriatic Sea. Beta diversity is a measure of the variability in species composition, and Whittaker's index has become the most widely used measure of beta diversity. However, application of the Whittaker index to soft-bottom assemblages of the Adriatic Sea highlighted its sensitivity to rare species (species recorded in a single sample). This over-weighting of rare species induces biased estimates of heterogeneity, making it difficult to compare assemblages containing a high proportion of rare species. In benthic communities, the unusually large number of rare species is frequently attributed to a combination of sampling errors and insufficient sampling effort. In order to reduce the influence of rare species on the measure of beta diversity, I have developed an alternative index based on simple probabilistic considerations. It turns out that this probability index is an ordinary Michaelis-Menten transformation of Whittaker's index but behaves more favourably when species heterogeneity increases. The suggested index therefore seems appropriate when comparing patterns of complexity in marine benthic assemblages. Although the new index makes an important contribution to the study of biodiversity in sedimentary environments, it remains to be seen which processes, and at what scales, influence benthic patterns. The ability to predict the effects of ecological phenomena on benthic fauna depends strongly on the spatial and temporal scales of variation. Once defined, implicitly or explicitly, these scales influence the questions asked, the methodological approaches and the interpretation of results. Problems often arise when representative samples are not taken and results are over-generalized, as can happen when results from small-scale experiments are used for resource planning and management. Such issues, although globally recognized, are far from being resolved in the North Adriatic Sea. This area is potentially affected by both natural (e.g. river inflow, eutrophication) and anthropogenic (e.g. gas extraction, trawling) sources of disturbance. Although a few studies in this area have aimed at understanding which of these processes mainly affect the macrobenthos, they were conducted at a small spatial scale, as they were designed to examine local changes in benthic communities or in particular species. However, in order to better describe all the putative processes occurring in the entire area, a high sampling effort performed at a large spatial scale is required. The sedimentary environment of the western part of the Adriatic Sea was extensively studied in this thesis.
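For reference, Whittaker's index and the general shape of a Michaelis-Menten-type transformation of it are given below; the abstract states that the proposed probability index is of this type but does not give the constant, so the second expression is only the generic form, not the thesis's exact index.

```latex
% Whittaker's beta diversity: total (gamma) species richness over the mean
% per-sample (alpha) richness.
\beta_W \;=\; \frac{\gamma}{\bar{\alpha}}
% Generic Michaelis--Menten-type transformation of \beta_W; the constant k
% (and the exact form used in the thesis) is not specified in the abstract.
\beta^{*} \;=\; \frac{\beta_W}{k + \beta_W}
```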
I have described, in detail, spatial patterns both in terms of sedimentary characteristics and of macrobenthic organisms, and have suggested putative processes (natural or of human origin) that might affect the benthic environment of the entire area. In particular, I have examined the effect of offshore gas platforms on benthic diversity and tested their effect against a background of natural spatial variability. The results obtained suggest that natural processes in the North Adriatic, such as river outflow and eutrophication, show an inter-annual variability that might have important consequences for benthic assemblages, affecting for example their spatial pattern moving away from the coast and along a north-to-south gradient. Depth-related factors, such as food supply, light, temperature and salinity, play an important role in explaining large-scale benthic spatial variability (i.e., affecting both the abundance patterns and beta diversity). Nonetheless, at a more local scale, effects probably related to organic enrichment or pollution from the Po river input have been observed. All these processes, together with a few human-induced sources of variability (e.g. fishing disturbance), have a greater effect on macrofauna distribution than any effect related to the presence of gas platforms. The main effect of gas platforms is restricted mainly to small spatial scales and is related to a change in habitat complexity due to natural dislodgement, or cleaning from the structures, of the mussels that colonize their legs. The accumulation of mussels on the sediment plausibly affects benthic infauna composition. All the components of the study presented in this thesis highlight the need to carefully consider methodological aspects related to the study of sedimentary habitats. With particular regard to the North Adriatic Sea, a multi-scale analysis along natural and anthropogenic gradients was useful for detecting the influence of all the processes affecting the sedimentary environment. In the future, applying a similar approach may lead to an unambiguous assessment of the state of the benthic community in the North Adriatic Sea. Such an assessment may be useful in understanding whether any anthropogenic source of disturbance has a negative effect on the marine environment and, if so, in planning sustainable strategies for the proper management of the affected area.
Abstract:
This thesis focuses on the study of techniques for the reliable transmission of multimedia content in streaming and broadcasting applications, targeting video content in particular. The design of efficient error-control mechanisms to enhance the reliability of video transmission systems has been addressed by considering cross-layer and multi-layer/multi-dimensional channel coding techniques to cope with bit errors as well as packet erasures. Mechanisms for unequal time interleaving have been designed as a viable solution to reduce the impact of errors and erasures by acting on the time diversity of the data flow, thus enhancing robustness against correlated channel impairments. In order to account for the factors that affect the physical-layer channel in the evaluation of FEC scheme performance, an ad hoc error-event model has been devised. In addition, the impact of error correction/protection techniques on the quality perceived by consumers of video services, and techniques for objective/subjective quality evaluation, have been studied. The applicability and value of the proposed techniques have been tested by considering practical constraints and requirements of real system implementations.
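A minimal block time-interleaver sketch showing the basic mechanism the thesis builds on: symbols written row-by-row and read column-by-column spread a burst of channel erasures across the de-interleaved stream. The uniform depth and width below are illustrative; they are not the unequal interleaving profiles designed in the thesis.

```python
# Uniform block interleaver: write by rows into a depth x width matrix,
# read by columns, so a burst of erasures becomes non-contiguous after
# de-interleaving. Illustrative only; the thesis designs *unequal* profiles.
def interleave(symbols, depth, width):
    assert len(symbols) == depth * width
    rows = [symbols[i * width:(i + 1) * width] for i in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(symbols, depth, width):
    return interleave(symbols, width, depth)  # transposes back

data = list(range(12))                 # 12 symbols, depth=3, width=4
tx = interleave(data, 3, 4)
tx[2:5] = ["X", "X", "X"]              # burst of 3 erasures on the channel
rx = deinterleave(tx, 3, 4)
print(rx)                              # erasures are now spread apart
```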
Abstract:
This thesis investigates interactive scene reconstruction and understanding using RGB-D data only. Indeed, we believe that, in the near future, depth cameras will remain a cheap and low-power 3D sensing alternative that is also suitable for mobile devices. Therefore, our contributions build on top of state-of-the-art approaches to achieve advances in three main challenging scenarios, namely mobile mapping, large-scale surface reconstruction and semantic modeling. First, we will describe an effective approach to Simultaneous Localization And Mapping (SLAM) on platforms with limited resources, such as a tablet device. Unlike previous methods, dense reconstruction is achieved by reprojection of RGB-D frames, while local consistency is maintained by deploying relative bundle adjustment principles. We will show quantitative results comparing our technique to the state of the art as well as detailed reconstructions of various environments ranging from rooms to small apartments. Then, we will address large-scale surface modeling from depth maps exploiting parallel GPU computing. We will develop a real-time camera tracking method based on the popular KinectFusion system and an online surface alignment technique capable of counteracting drift errors and closing small loops. We will show very high quality meshes outperforming existing methods on publicly available datasets as well as on data recorded with our RGB-D camera, even in complete darkness. Finally, we will move to our Semantic Bundle Adjustment framework to effectively combine object detection and SLAM in a unified system. Though the mathematical framework we will describe is not restricted to a particular sensing technology, in the experimental section we will refer, again, only to RGB-D sensing. We will discuss successful implementations of our algorithm, showing the benefit of joint object detection, camera tracking and environment mapping.
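Dense reconstruction by reprojection of RGB-D frames starts from back-projecting each valid depth pixel through the pinhole intrinsics. The sketch below shows only that first step, with Kinect-like placeholder intrinsics that are assumptions, not calibration values from the thesis.

```python
import numpy as np

# Back-projection of a depth map: each valid pixel (u, v) with depth z maps to
# X = z * K^-1 [u, v, 1]^T. Intrinsics below are Kinect-like placeholders.
fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5

def depth_to_points(depth_m):
    """depth_m: HxW array of metric depths (0 where invalid) -> Nx3 points."""
    h, w = depth_m.shape
    v, u = np.mgrid[0:h, 0:w]
    z = depth_m
    x = (u - cx) / fx * z
    y = (v - cy) / fy * z
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[depth_m.reshape(-1) > 0]           # drop invalid measurements

cloud = depth_to_points(np.full((480, 640), 1.5))  # flat wall 1.5 m away
print(cloud.shape)
```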
Abstract:
In the past two decades, the work of a growing number of researchers in robotics has focused on a particular group of machines belonging to the family of parallel manipulators: cable robots. Although these robots share several theoretical elements with the better-known parallel robots, they still present completely (or partly) unsolved issues. In particular, the study of their kinematics, already a difficult subject for conventional parallel manipulators, is further complicated by the non-linear nature of cables, which can exert only tensile forces. The work presented in this thesis therefore focuses on the study of the kinematics of these robots and on the development of numerical techniques able to address some of the related problems. Most of the work is devoted to the development of an interval-analysis-based procedure for the solution of the direct geometric problem of a generic cable manipulator. This technique, besides allowing a rapid solution of the problem, also guarantees the results obtained against rounding and elimination errors and can take into account any uncertainties in the model of the problem. The developed code has been tested with the help of a small manipulator, whose realization is described in this dissertation together with the auxiliary work done during its design and simulation phases.
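The guarantee against rounding errors mentioned above rests on interval arithmetic: every operation returns an interval that encloses all possible real results. The sketch below illustrates only that basic property; it is not the thesis's interval-analysis solver, and directed (outward) rounding of the bounds is omitted for brevity.

```python
# Minimal interval-arithmetic sketch: each operation returns an interval that
# encloses every possible real result. A production implementation would also
# round the bounds outward to account for floating-point rounding.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        prods = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
        return Interval(min(prods), max(prods))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Example: an uncertain cable length times an uncertain direction cosine
# (hypothetical values) yields a guaranteed enclosure of the product.
length = Interval(1.98, 2.02)
cosine = Interval(0.45, 0.55)
print(length * cosine)
```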