838 results for Automated algorithms


Relevance: 20.00%

Abstract:

Our objective is to evaluate the accuracy of three algorithms in differentiating the origins of outflow tract ventricular arrhythmias (OTVAs). This study involved 110 consecutive patients with OTVAs whose standard 12-lead surface electrocardiogram (ECG) showed typical left bundle branch block morphology with an inferior axis. All ECG tracings were retrospectively analyzed using three recently published ECG algorithms: 1) the transitional zone (TZ) index, 2) the V2 transition ratio, and 3) the V2 R wave duration and R/S wave amplitude indices. Considering all patients, the V2 transition ratio had the highest sensitivity (92.3%), while the R wave duration and R/S wave amplitude indices in V2 had the highest specificity (93.9%); the latter also had the largest area under the ROC curve (0.925). In patients with left ventricular (LV) rotation, the V2 transition ratio had the highest sensitivity (94.1%), while the R wave duration and R/S wave amplitude indices in V2 had the highest specificity (87.5%); here the former had the largest area under the ROC curve (0.892). All three published ECG algorithms are effective in differentiating the origin of OTVAs, with the V2 transition ratio and the V2 R wave duration and R/S wave amplitude indices being the most sensitive and the most specific algorithms, respectively. Among all patients, the V2 R wave duration and R/S wave amplitude algorithm had the largest area under the ROC curve, but in patients with LV rotation the V2 transition ratio algorithm did.
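The abstract gives only the names of the three criteria; as a minimal sketch, the V2 transition ratio can be computed from four amplitude measurements as below. The cutoff shown (0.60 or above favoring a left-sided origin) is the commonly cited one from the original V2 transition ratio report; the amplitude values here are hypothetical.

```python
def percent_r_wave(r_mv: float, s_mv: float) -> float:
    """R-wave percentage in lead V2: R / (R + S), amplitudes in mV (absolute values)."""
    return r_mv / (r_mv + s_mv)

def v2_transition_ratio(r_va: float, s_va: float, r_sinus: float, s_sinus: float) -> float:
    """Percent R wave during the arrhythmia divided by percent R wave in sinus rhythm."""
    return percent_r_wave(r_va, s_va) / percent_r_wave(r_sinus, s_sinus)

# Hypothetical amplitudes (mV) measured from lead V2 tracings.
ratio = v2_transition_ratio(r_va=0.4, s_va=1.2, r_sinus=0.2, s_sinus=1.4)
print(f"V2 transition ratio = {ratio:.2f}")  # >= 0.60 is commonly read as favoring an LV origin
```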

Relevance: 20.00%

Abstract:

Many industrial applications need object recognition and tracking capabilities. The algorithms developed for these purposes are computationally expensive, yet real-time performance, high accuracy and low power consumption are essential measures of such a system. When all these requirements are combined, hardware acceleration of these algorithms becomes a feasible solution. The purpose of this study is to analyze the current state of these hardware acceleration solutions: which algorithms have been implemented in hardware, and what modifications have been made to adapt these algorithms to hardware.

Relevance: 20.00%

Abstract:

The aim of the study was to identify the factors underlying the successful deployment of automated ordering systems in the retail sector, and to find a solution for successfully deploying such systems in this environment. The study analyzed the deployment of the system, and the results of that deployment, in over one hundred stores. For the study, experts on the ordering system and its rollout were interviewed, both from within the company and outside it. In addition, questionnaires were sent to the stores that had adopted the system, and the responses were analyzed in groups on the basis of data from the automated ordering system. As a result, a specific set of background factors that must be taken into account during deployment was identified, and, based on the findings, a model was developed for deploying similar systems in the retail sector.

Relevance: 20.00%

Abstract:

Simplification of highly detailed CAD models is an important step when CAD models are visualized or otherwise utilized in augmented reality applications. Without simplification, CAD models may cause severe processing and storage issues, especially on mobile devices. In addition, simplified models may have other advantages, such as better visual clarity or improved reliability when used for visual pose tracking. The geometry of a CAD model is invariably represented as a 3D mesh. In this paper, we survey mesh simplification algorithms in general and focus especially on algorithms that can be used to simplify CAD models. We test some commonly known algorithms with real-world CAD data and characterize some new CAD-related simplification algorithms that have not been covered in previous mesh simplification reviews.
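The abstract does not single out one algorithm; as an illustration of the simplest family of mesh simplification methods, here is a sketch of uniform vertex clustering (CAD pipelines more often use quadric edge collapse). It assumes the mesh arrives as NumPy vertex and face arrays; all names are illustrative.

```python
import numpy as np

def vertex_clustering(vertices: np.ndarray, faces: np.ndarray, cell_size: float):
    """Simplify a triangle mesh by snapping vertices to a uniform grid.

    vertices: (V, 3) float array; faces: (F, 3) int array of vertex indices.
    Returns a new (vertices, faces) pair with degenerate triangles removed.
    """
    # Assign each vertex to a grid cell and merge vertices sharing a cell.
    cells = np.floor(vertices / cell_size).astype(np.int64)
    unique_cells, remap = np.unique(cells, axis=0, return_inverse=True)
    # Representative position: mean of the vertices that fell into each cell.
    counts = np.bincount(remap, minlength=len(unique_cells)).astype(float)
    new_vertices = np.zeros((len(unique_cells), 3))
    for dim in range(3):
        new_vertices[:, dim] = np.bincount(
            remap, weights=vertices[:, dim], minlength=len(unique_cells)) / counts
    # Re-index faces and drop triangles that collapsed onto a repeated vertex.
    new_faces = remap[faces]
    keep = ((new_faces[:, 0] != new_faces[:, 1]) &
            (new_faces[:, 1] != new_faces[:, 2]) &
            (new_faces[:, 0] != new_faces[:, 2]))
    return new_vertices, new_faces[keep]
```

A coarser `cell_size` merges more vertices and yields a smaller mesh, which is the single quality/size knob this approach offers.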

Relevance: 20.00%

Abstract:

This research investigates what benefits employees expect an organization to gain from organizing data governance, and how doing so benefits the implementation of automated marketing capabilities. The quality and usability of data are crucial for organizations to meet various business needs, and organizations have ever more data and technology available that can be utilized, for example, in automated marketing. Data governance addresses the organization of decision rights and accountabilities for the management of an organization's data assets. Automated marketing means sending the right message, to the right person, at the right time, automatically. The research is a single case study conducted in a Finnish ICT company that was starting to organize data governance and implement automated marketing capabilities at the time of the research. The empirical material consists of interviews with the case company's employees, and content analysis is used to interpret the interviews in order to answer the research questions. The theoretical framework of the research is derived from the morphology of data governance. The findings indicate that employees expect the organization of data governance, among other things, to improve customer experience, improve sales, provide the ability to identify an individual customer's life situation, ensure that data is handled according to regulations, and improve operational efficiency. The organization of data governance is also expected to solve problems in customer data quality that currently hinder the implementation of automated marketing capabilities.

Relevance: 20.00%

Abstract:

The growing interest in the use of dietary fiber in food has created the need for precise tools for describing its physical properties. This research examined two dietary fibers, from oats and beets respectively, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted, and the particles were measured at two points: dry and after water soaking. The highest water holding capacity (7.00 g water/g solid) was achieved by the smaller-sized oat fiber; conversely, for beet fiber the water holding capacity was highest (4.20 g water/g solid) in the larger particle size. For the same fiber source, there was evidence that water absorption increases as particle size decreases. Very strong correlations were found between particle shape parameters, such as fiber length, straightness and width, and the hydration properties measured conventionally. Regression analysis made it possible to assess whether automated static image analysis could be an efficient tool for describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model, which was verified against conventional WHC measurement results.
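The abstract does not specify the regression model; as a sketch only, an ordinary least-squares fit of conventionally measured WHC against image-derived shape descriptors could look like the following. All numbers and variable names are hypothetical placeholders, not the study's data.

```python
import numpy as np

# Hypothetical image-analysis descriptors, one entry per fiber sample.
length = np.array([1.8, 1.5, 1.2, 0.9, 0.7, 0.5])          # mean fiber length, mm
width = np.array([0.30, 0.27, 0.25, 0.22, 0.20, 0.18])     # mean fiber width, mm
straightness = np.array([0.91, 0.89, 0.88, 0.85, 0.83, 0.80])
whc = np.array([4.1, 4.5, 4.9, 5.8, 6.4, 7.0])             # measured WHC, g water / g solid

# Ordinary least squares: WHC ~ b0 + b1*length + b2*width + b3*straightness.
X = np.column_stack([np.ones_like(length), length, width, straightness])
coef, residuals, *_ = np.linalg.lstsq(X, whc, rcond=None)

# Goodness of fit on the same data (a real validation would use held-out samples).
predicted = X @ coef
r2 = 1 - np.sum((whc - predicted) ** 2) / np.sum((whc - whc.mean()) ** 2)
print("coefficients:", coef.round(3), "R^2 =", round(r2, 3))
```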

Relevance: 20.00%

Abstract:

Sugarcane monosaccharides are reducing sugars, and the classical analytical methodologies (Lane-Eynon, Benedict, complexometric-EDTA, Luff-Schoorl, Musson-Walker, Somogyi-Nelson) are based on the reduction of copper ions in alkaline solutions. In Brazil, certain factories use Lane-Eynon, others use the equipment referred to as "REDUTEC", and still others analyze reducing sugars based on a mathematical model. The objective of this paper is to understand the relationship between the variations in millivolts, mass and reducing sugar content during the analysis process, and to generate an automatic model for this process. The work herein uses the "REDUTEC" equipment, a digital balance, a peristaltic pump, a digital camcorder, and mathematical and graphics software. We conclude that millivolts, mass and reducing sugar content exhibit a good mathematical correlation, and the model generated was benchmarked against low-concentration reducing sugars (<0.3%). Using the model created herein, reducing sugar analyses can be automated on the new equipment.
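The abstract does not describe how the millivolt signal is processed; purely as an illustration, the sketch below locates a titration endpoint as the steepest change in a recorded potential-versus-mass curve, the kind of step an automated version of the analysis would need. The signal and all values here are synthetic.

```python
import numpy as np

# Synthetic recording: dispensed sample mass (g) and cell potential (mV).
mass = np.linspace(0.0, 5.0, 101)
mv = 220.0 - 180.0 / (1.0 + np.exp(-(mass - 3.2) / 0.15))  # sigmoid drop near the endpoint

# Endpoint = steepest potential change with respect to mass (first-derivative method).
dmv_dmass = np.gradient(mv, mass)
endpoint_idx = int(np.argmax(np.abs(dmv_dmass)))
endpoint_mass = mass[endpoint_idx]

# A calibration factor (determined experimentally) would then convert the mass
# dispensed at the endpoint into a reducing sugar content.
print(f"endpoint at {endpoint_mass:.2f} g dispensed, slope {dmv_dmass[endpoint_idx]:.1f} mV/g")
```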

Relevance: 20.00%

Abstract:

We describe the clinical course of a case of peritonitis caused by Salmonella sp. following an episode of intestinal salmonellosis, together with a brief review of the literature.

Relevance: 20.00%

Abstract:

The development of new procedures for quickly obtaining accurate information on the physiological potential of seed lots is essential for developing quality control programs for the seed industry. In this study, the effectiveness of an automated system of seedling image analysis (Seed Vigor Imaging System - SVIS) in determining the physiological potential of sun hemp seeds, and its relationship with electrical conductivity tests, was evaluated. SVIS evaluations were performed three and four days after sowing, and data on the vigor index and on the length and uniformity of seedling growth were collected. The electrical conductivity test was performed on replicates of 50 seeds placed in containers with 75 mL of deionised water at 25 ºC, with readings taken after 1, 2, 4, 8 and 16 hours of imbibition. Electrical conductivity measurements at 4 or 8 hours and the use of SVIS on 3-day-old seedlings can effectively detect differences in vigor between sun hemp seed lots.
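The abstract does not give the SVIS formula; as a rough sketch, a composite vigor index can combine seedling growth and growth uniformity extracted from the images. The weights and lengths below are illustrative placeholders, not the system's actual parameters.

```python
import numpy as np

def vigor_index(lengths_mm: np.ndarray, max_length_mm: float,
                w_growth: float = 0.7, w_uniformity: float = 0.3) -> float:
    """Composite vigor score (0-100) from the seedling lengths of one seed lot.

    growth: mean length relative to a reference maximum;
    uniformity: 1 minus the coefficient of variation of the lengths.
    The weights are illustrative; published SVIS weightings vary by study.
    """
    growth = np.mean(lengths_mm) / max_length_mm
    uniformity = 1.0 - np.std(lengths_mm) / np.mean(lengths_mm)
    return 100.0 * (w_growth * growth + w_uniformity * max(uniformity, 0.0))

# Hypothetical seedling lengths (mm) measured from images on day 3.
print(round(vigor_index(np.array([41.0, 38.5, 44.2, 36.8, 40.3]), max_length_mm=60.0), 1))
```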

Relevance: 20.00%

Abstract:

The increasing performance of computers has made it possible to solve algorithmically problems for which manual, and possibly inaccurate, methods were previously used. Nevertheless, one must still pay attention to the performance of an algorithm if huge datasets are used or if the problem is computationally difficult. Two geographic problems are studied in the articles included in this thesis. In the first problem the goal is to determine distances from points, called study points, to shorelines in predefined directions. Together with other information, mainly related to wind, these distances can be used to estimate wave exposure at different areas. In the second problem the input consists of a set of sites where water quality observations have been made and of the results of the measurements at the different sites. The goal is to select a subset of the observational sites in such a manner that water quality is still measured with sufficient accuracy when monitoring at the other sites is stopped to reduce economic cost. Most of the thesis concentrates on the first problem, known as the fetch length problem. The main challenge is that the two-dimensional map is represented as a set of polygons with millions of vertices in total, and the distances may also be computed for millions of study points in several directions. Efficient algorithms are developed for the problem, one of them approximate and the others exact except for rounding errors. The solutions also differ in that three of them are targeted at serial operation or a small number of CPU cores, whereas one, together with its further developments, is also suitable for parallel machines such as GPUs.
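The thesis's efficient algorithms are not reproduced in the abstract; as a baseline for the fetch length problem, the sketch below intersects a ray from a study point with every shoreline segment and keeps the nearest hit, the naive per-query approach that the efficient algorithms would improve on. The function and argument names are illustrative.

```python
import math

def fetch_length(point, direction_rad, edges):
    """Distance from a study point to the nearest shoreline edge in one direction.

    Brute-force baseline: intersect the ray with every edge and keep the
    closest hit. edges is a list of ((x1, y1), (x2, y2)) shoreline segments;
    returns math.inf if the ray never reaches a shoreline.
    """
    px, py = point
    dx, dy = math.cos(direction_rad), math.sin(direction_rad)
    best = math.inf
    for (ax, ay), (bx, by) in edges:
        ex, ey = bx - ax, by - ay           # edge vector
        denom = dx * ey - dy * ex           # cross(ray direction, edge)
        if abs(denom) < 1e-12:              # ray parallel to edge: no single hit
            continue
        wx, wy = ax - px, ay - py
        t = (wx * ey - wy * ex) / denom     # distance along the ray
        u = (wx * dy - wy * dx) / denom     # fractional position along the edge
        if t >= 0.0 and 0.0 <= u <= 1.0:    # hit in front of the point, within the edge
            best = min(best, t)
    return best

# One study point, one direction (due east), one shoreline segment 5 units away.
print(fetch_length((0.0, 0.0), 0.0, [((5.0, -1.0), (5.0, 1.0))]))  # -> 5.0
```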

Relevance: 20.00%

Abstract:

Today, user experience and usability are becoming major design issues in software applications, as many processes are being adapted to new technologies. The study of user experience and usability should therefore be included in every software development project, and both should be tested to obtain traceable results. Faced with the different testing methods available for evaluating these concepts, a non-expert may be unsure which option to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process created by Seffah et al., together with a supporting software tool for carrying out these user experience and usability testing methods.

Relevance: 20.00%

Abstract:

This research addressed the role of explicit algorithms and episodic contexts in the acquisition of computational procedures for regrouping in subtraction. Three groups of students having difficulty learning to subtract with regrouping were taught procedures for doing so through an explicit algorithm, an episodic context, or an examples approach. It was hypothesized that an explicit algorithm represented in a flow chart format would facilitate the acquisition and retention of specific procedural steps relative to the other two conditions, whereas the use of paragraph stories to create episodic context was expected to facilitate the retrieval of algorithms, particularly in a mixed presentation format. The subjects were tested on similar, near, and far transfer questions over a four-day period; near and far transfer algorithms were introduced on Day Two. The results suggested that both explicit algorithms and episodic context facilitate performance on questions requiring subtraction with regrouping. However, the differential effects of the two approaches on near and far transfer questions were not as easy to identify. Explicit algorithms may facilitate the acquisition of specific procedural steps while at the same time inhibiting the application of those steps to transfer questions. Similarly, the value of episodic context in cuing the retrieval of an algorithm may be limited by the subject's ability to identify and classify a new question as an exemplar of a particular episodically defined problem type or category. The implications of these findings for the teaching of mathematics to students with learning problems are discussed in detail.
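The flow chart itself is not reproduced in the abstract; as a minimal sketch, the explicit algorithm for subtraction with regrouping can be spelled out step by step, processing columns right to left and borrowing whenever the top digit is too small.

```python
def subtract_with_regrouping(minuend: int, subtrahend: int) -> int:
    """Column subtraction with explicit regrouping (borrowing), right to left.

    Mirrors the procedural steps a flow-chart algorithm would make explicit;
    assumes minuend >= subtrahend >= 0.
    """
    top = [int(d) for d in str(minuend)]
    bottom = [int(d) for d in str(subtrahend).rjust(len(top), "0")]
    result = []
    borrow = 0
    for t, b in zip(reversed(top), reversed(bottom)):
        t -= borrow                  # apply any borrow taken by the previous column
        if t < b:                    # regroup: borrow ten from the next column
            t += 10
            borrow = 1
        else:
            borrow = 0
        result.append(t - b)         # subtract within the column
    digits = "".join(str(d) for d in reversed(result)).lstrip("0")
    return int(digits or "0")

assert subtract_with_regrouping(503, 267) == 236  # requires two regroupings
```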

Relevance: 20.00%

Abstract:

Several automated reversed-phase HPLC methods have been developed to determine trace concentrations of carbamate pesticides (which are of concern in Ontario environmental samples) in water, utilizing two solid sorbent extraction techniques. One of the methods is known as 'on-line pre-concentration'. This technique involves passing 100 milliliters of sample water through a 3 cm pre-column, packed with 5 micron ODS sorbent, at flow rates varying from 5-10 mL/min. By the use of a valve apparatus, the HPLC system is then switched to a gradient mobile phase program consisting of acetonitrile and water. The analytes, Propoxur, Carbofuran, Carbaryl, Propham, Captan, Chloropropham, Barban, and Butylate, which are pre-concentrated on the pre-column, are eluted and separated on a 25 cm C-8 analytical column and determined by UV absorption at 220 nm. The total analytical time is 60 minutes, and the pre-column can be used repeatedly for the analysis of as many as thirty samples. The method is highly sensitive, as 100 percent of the analytes present in the sample can be injected into the HPLC. No breakthrough of any of the analytes was observed, and the minimum detectable concentrations range from 10 to 480 ng/L. The developed method is totally automated for the analysis of one sample. When the above mobile phase is modified with a buffer solution, Aminocarb, Benomyl, and its degradation product, MBC, can also be detected along with the above pesticides, with baseline resolution for all of the analytes. The method can also be easily modified to determine Benomyl and MBC both as solute and as particulate matter. By using a commercially available solid phase extraction cartridge, in lieu of a pre-column, for the extraction and concentration of analytes, a completely automated method has been developed with the aid of the Waters Millilab Workstation. Sample water is loaded at 10 mL/min through a cartridge, and the concentrated analytes are eluted from the sorbent with acetonitrile. The resulting eluate is blown down under nitrogen, made up to volume with water, and injected into the HPLC. The total analytical time is 90 minutes. Fifty percent of the analytes present in the sample can be injected into the HPLC, and recoveries for the above eight pesticides ranged from 84 to 93 percent. The minimum detectable concentrations range from 20 to 960 ng/L. The developed method is totally automated for the analysis of up to thirty consecutive samples, and it has proven to be applicable both to purer water samples and to untreated lake water samples.

Relevance: 20.00%

Abstract:

The (n, k)-star interconnection network was proposed in 1995 as an attractive alternative to the n-star topology in parallel computation. The (n, k)-star has significant advantages over the n-star, which itself was proposed as an attractive alternative to the popular hypercube. The major advantage of the (n, k)-star network is its scalability, which makes it more flexible than the n-star as an interconnection network. In this thesis, we focus on finding graph-theoretical properties of the (n, k)-star as well as on developing parallel algorithms that run on this network. The basic topological properties of the (n, k)-star are studied first; these are useful since they can be used to develop efficient algorithms on this network. We then study the (n, k)-star network from an algorithmic point of view, investigating both fundamental and application algorithms for basic communication, prefix computation, and sorting. A literature review of the state of the art in relation to the (n, k)-star network, as well as some open problems in this area, is also provided.
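For concreteness, here is a small sketch of the standard (n, k)-star construction: vertices are the k-permutations of n symbols, and each vertex is adjacent to the vertices obtained by swapping its first symbol with the symbol in position i (an i-edge) or by replacing its first symbol with a symbol not present in the permutation (a 1-edge). The resulting graph is regular of degree n - 1.

```python
from itertools import permutations

def nk_star_neighbors(v: tuple, n: int) -> list:
    """Neighbors of vertex v in the (n, k)-star graph S(n, k).

    Vertices are k-permutations of {1, ..., n}. Edges:
      * i-edges: swap the first symbol with the symbol in position i (2 <= i <= k);
      * 1-edges: replace the first symbol with any symbol not present in v.
    """
    k = len(v)
    neighbors = []
    for i in range(1, k):                        # i-edges
        u = list(v)
        u[0], u[i] = u[i], u[0]
        neighbors.append(tuple(u))
    unused = set(range(1, n + 1)) - set(v)       # 1-edges
    for s in sorted(unused):
        neighbors.append((s,) + v[1:])
    return neighbors

# S(4, 2): 4!/(4-2)! = 12 vertices, each of degree n - 1 = 3.
vertices = list(permutations(range(1, 5), 2))
assert len(vertices) == 12
assert all(len(nk_star_neighbors(v, 4)) == 3 for v in vertices)
```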