20 results for Gradient descent algorithms


Relevance: 20.00%

Abstract:

Acute otitis media (AOM) is the most prevalent bacterial infection among children. Tympanometry and spectral gradient acoustic reflectometry (SG-AR) are adjunctive diagnostic tools to pneumatic otoscopy. The aim was to investigate the diagnostic accuracy and success rates of tympanometry and SG-AR performed by physicians and nurses. The study populations comprised 515 (I-II), 281 (III), and 156 (IV) outpatients aged 6 to 35 months. Physicians performed 4246 tympanometric (I) and SG-AR (II) examinations. Nurses performed 1782 (III) and 753 (IV) examinations at symptomatic and asymptomatic visits, respectively. Pneumatic otoscopy by the physician was the diagnostic standard. The accuracy of test results by physicians or nurses (I-IV) and the proportion of visits with accurate exclusive test results from both ears (III-IV) were analyzed. A type B tympanogram and SG-AR level 5 (<49°) predicted middle ear effusion (MEE). At asymptomatic visits, type A and C1 tympanograms (peak pressure > -200 daPa) and SG-AR level 1 (>95°) indicated a healthy middle ear. The negative predictive values of type A and C1 tympanograms obtained by nurses in excluding AOM at symptomatic visits and MEE at asymptomatic visits were 94% and 95%, respectively. Nurses obtained a type A or C1 tympanogram from both ears at 94/459 (20%) of symptomatic and 81/196 (41%) of asymptomatic visits. SG-AR level 1 was rarely obtained from both ears. Type A and C1 tympanograms were accurate in excluding AOM at symptomatic and MEE at asymptomatic visits. However, nurses obtained these tympanograms from both ears at only one fifth of symptomatic visits and less than half of asymptomatic visits.
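The negative predictive values quoted above follow the standard definition NPV = TN / (TN + FN), the fraction of negative test results that are truly negative. A minimal sketch of the arithmetic, using hypothetical counts for illustration (not the study's raw data):

```python
def npv(true_negatives: int, false_negatives: int) -> float:
    """Negative predictive value: share of negative test results
    that correspond to truly disease-free cases."""
    return true_negatives / (true_negatives + false_negatives)

# Hypothetical counts chosen only to reproduce a 94% NPV:
# 94 negative tympanograms without AOM, 6 with AOM missed.
print(round(npv(94, 6), 2))  # 0.94
```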

Relevance: 20.00%

Abstract:

Many industrial applications need object recognition and tracking capabilities. The algorithms developed for these purposes are computationally expensive. Yet real-time performance, high accuracy, and low power consumption are essential measures of such a system. When all these requirements are combined, hardware acceleration of these algorithms becomes a feasible solution. The purpose of this study is to analyze the current state of these hardware acceleration solutions: which algorithms have been implemented in hardware, and what modifications have been made to adapt these algorithms to hardware.

Relevance: 20.00%

Abstract:

Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of CAD models is invariably presented in the form of a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
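One of the simplest families of mesh simplification algorithms the survey covers is vertex clustering: snap every vertex to a uniform grid, merge vertices that land in the same cell, and discard triangles that collapse. A minimal sketch (the function name and the choice of the first vertex as the cell representative are illustrative; production methods typically average positions or use quadric error metrics):

```python
def simplify_vertex_clustering(vertices, faces, cell=1.0):
    """Vertex-clustering simplification: snap each vertex to a uniform
    grid cell, merge vertices in the same cell, and drop triangles
    that become degenerate after merging."""
    clusters = {}       # grid cell -> index of its representative vertex
    remap = []          # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        key = (int(x // cell), int(y // cell), int(z // cell))
        if key not in clusters:
            clusters[key] = len(new_vertices)
            new_vertices.append((x, y, z))   # first vertex represents the cell
        remap.append(clusters[key])
    new_faces = []
    for a, b, c in faces:
        fa, fb, fc = remap[a], remap[b], remap[c]
        if fa != fb and fb != fc and fa != fc:   # keep non-degenerate triangles
            new_faces.append((fa, fb, fc))
    return new_vertices, new_faces
```

The grid cell size directly trades geometric fidelity for triangle count, which is why clustering is popular on memory-constrained mobile devices despite its lower output quality compared to iterative edge-collapse methods.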

Relevance: 20.00%

Abstract:

This thesis introduces an extension of Chomsky's context-free grammars equipped with operators for referring to the left and right contexts of strings. The new model is called grammars with contexts. The semantics of these grammars are given in two equivalent ways: by language equations and by logical deduction, where a grammar is understood as a logic for the recursive definition of syntax. The motivation for grammars with contexts comes from an extensive example that completely defines the syntax and static semantics of a simple typed programming language.

Grammars with contexts maintain the most important practical properties of context-free grammars, including a variant of the Chomsky normal form. For grammars with one-sided contexts (that is, either left or right), there is a cubic-time tabular parsing algorithm, applicable to an arbitrary grammar. The time complexity of this algorithm can be improved to quadratic, provided that the grammar is unambiguous, that is, it allows only one parse for every string it defines. A tabular parsing algorithm for grammars with two-sided contexts has fourth-power time complexity. For these grammars there is a recognition algorithm that uses a linear amount of space.

For certain subclasses of grammars with contexts there are low-degree polynomial parsing algorithms. One of them is an extension of the classical recursive descent for context-free grammars; the version for grammars with contexts still works in linear time, like its prototype. Another algorithm, with time complexity varying from linear to cubic depending on the particular grammar, adapts deterministic LR parsing to the new model. If all context operators in a grammar define regular languages, then such a grammar can be transformed into an equivalent grammar without any context operators. This allows one to represent the syntax of languages more succinctly by utilizing context specifications.

Linear grammars with contexts turned out to be non-trivial already over a one-letter alphabet. This fact leads to some undecidability results for this family of grammars.
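The "classical recursive descent" that the thesis extends is the standard technique of one recognizer function per nonterminal. A minimal sketch for the ordinary context-free grammar S → a S b | ε (the language aⁿbⁿ), without the context operators the thesis adds; the function names are illustrative:

```python
def recognize_anbn(s: str) -> bool:
    """Recursive-descent recognizer for the context-free grammar
    S -> a S b | epsilon, i.e. the language { a^n b^n : n >= 0 }.
    Runs in linear time, like the classical recursive descent that
    grammars with contexts extend."""
    def S(i: int) -> int:
        # Try production S -> a S b.
        if i < len(s) and s[i] == 'a':
            j = S(i + 1)
            if j < len(s) and s[j] == 'b':
                return j + 1
        # Otherwise fall back to production S -> epsilon.
        return i
    return S(0) == len(s)
```

The recognizer accepts exactly when the start nonterminal consumes the whole input; this grammar is deterministic, so no backtracking is needed.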

Relevance: 20.00%

Abstract:

The increasing performance of computers has made it possible to solve algorithmically problems for which manual and possibly inaccurate methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure in different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted for serial operation or for a small number of CPU cores, whereas one, together with its further developments, is also suitable for parallel machines such as GPUs.
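At its core, computing one fetch length is a ray-casting query: from a study point, shoot a ray in the given direction and take the distance to the nearest shoreline segment it crosses. A simplified sketch of that per-ray computation (function name and the flat 2D coordinate model are assumptions; the thesis's contribution is doing this efficiently for millions of points against millions of polygon vertices, not this brute-force loop):

```python
import math

def fetch_distance(point, direction_deg, segments):
    """Distance from `point` along `direction_deg` to the nearest
    intersection with any shoreline segment, or None if the ray
    hits no segment. Brute-force: checks every segment."""
    px, py = point
    theta = math.radians(direction_deg)
    dx, dy = math.cos(theta), math.sin(theta)
    best = None
    for (x1, y1), (x2, y2) in segments:
        ex, ey = x2 - x1, y2 - y1
        denom = dx * ey - dy * ex
        if abs(denom) < 1e-12:
            continue  # ray parallel to this segment
        # Solve point + t*(dx,dy) = (x1,y1) + u*(ex,ey) for t, u.
        t = ((x1 - px) * ey - (y1 - py) * ex) / denom
        u = ((x1 - px) * dy - (y1 - py) * dx) / denom
        if t > 0 and 0.0 <= u <= 1.0 and (best is None or t < best):
            best = t
    return best
```

With millions of study points, several directions each, and millions of segments, this O(points × directions × segments) loop is exactly what the thesis's exact and approximate algorithms are designed to avoid.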