916 results for Visual search method
Abstract:
The number of digital images has been increasing exponentially in the last few years. People have problems managing their image collections and finding a specific image. An automatic image categorization system could help them manage their images and find specific ones. In this thesis, an unsupervised visual object categorization system was implemented to categorize a set of unknown images. Because the system is unsupervised, it does not need labelled training images, which would have to be obtained manually; therefore, the number of possible categories and images can be huge. The system implemented in the thesis extracts local features from the images. These local features are used to build a codebook, and the local features and the codebook are then used to generate a feature vector for each image. Images are categorized based on these feature vectors. The system is able to categorize any given set of images based on their visual appearance: images that have similar image regions are grouped into the same category, so, for example, images that contain cars are assigned to the same cluster. The unsupervised visual object categorization system can be used in many situations, e.g., in an Internet search engine, where the system can categorize images for a user, who can then easily find a specific type of image.
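For illustration, a minimal sketch of the pipeline this abstract describes (local features, codebook, per-image feature vectors, clustering). The feature detector (ORB), codebook size and clustering algorithm (k-means) are assumptions for the sketch, not the thesis implementation.

```python
# Minimal bag-of-visual-words sketch: local features -> codebook -> histograms -> clusters.
# ORB, k-means and the parameter values are illustrative assumptions, not the thesis setup.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def categorize_images(paths, codebook_size=200, n_categories=10):
    orb = cv2.ORB_create()
    descriptors_per_image = []
    for p in paths:
        img = cv2.imread(p, cv2.IMREAD_GRAYSCALE)
        _, desc = orb.detectAndCompute(img, None)
        descriptors_per_image.append(desc if desc is not None else np.empty((0, 32), np.uint8))

    # Build the codebook from all local descriptors.
    all_desc = np.vstack([d for d in descriptors_per_image if len(d)]).astype(np.float32)
    codebook = KMeans(n_clusters=codebook_size, n_init=5, random_state=0).fit(all_desc)

    # One normalized histogram of codeword occurrences per image.
    feats = np.zeros((len(paths), codebook_size))
    for i, desc in enumerate(descriptors_per_image):
        if len(desc):
            words = codebook.predict(desc.astype(np.float32))
            counts = np.bincount(words, minlength=codebook_size)
            feats[i] = counts / counts.sum()

    # Unsupervised categorization of the feature vectors.
    return KMeans(n_clusters=n_categories, n_init=5, random_state=0).fit_predict(feats)
```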
Abstract:
Research on language equations has been active during the last decades. Compared to equations on words, equations on languages are much more difficult to solve: even very simple equations that are easy to solve for words can be very hard for languages. In this thesis we study two such equations, namely the commutation and conjugacy equations. We study these equations in some restricted special cases and compare some of the results to the solutions of the corresponding equations on words. For both equations we study the maximal solutions, the centralizer and the conjugator. We present a fixed point method that can be used to search for these maximal solutions and analyze the reasons why this method is not successful for all languages. We also give several examples to illustrate the behaviour of this method.
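As a small illustrative check of the two equations on finite languages only (the thesis concerns general languages, where these questions are far harder), both XL = LX and XZ = ZY can be tested directly by comparing concatenations:

```python
# Illustrative check of the commutation equation XL = LX and the conjugacy
# equation XZ = ZY on small *finite* languages; the general (infinite) case
# studied in the thesis is much harder and is not handled here.

def concat(A, B):
    """Concatenation of two finite languages (sets of words)."""
    return {a + b for a in A for b in B}

def commutes(X, L):
    return concat(X, L) == concat(L, X)

def conjugate_via(X, Y, Z):
    """True if XZ = ZY, i.e. Z conjugates X to Y."""
    return concat(X, Z) == concat(Z, Y)

if __name__ == "__main__":
    L = {"ab", "abab"}
    X = {"", "ab"}
    print(commutes(X, L))                         # True: both sides are {ab, abab, ababab}
    print(conjugate_via({"ab"}, {"ba"}, {"a"}))   # True: both sides are {aba}
```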
Abstract:
Robotic platforms have advanced greatly in terms of their remote sensing capabilities, including obtaining optical information using cameras. Alongside these advances, visual mapping has become a very active research area, as it facilitates the mapping of areas inaccessible to humans. This requires efficient processing of the data to increase the final mosaic quality and the computational efficiency. In this paper, we propose an efficient image mosaicing algorithm for large-area visual mapping in underwater environments using multiple underwater robots. Our method identifies overlapping image pairs in the trajectories carried out by the different robots during the topology estimation process, which is a cornerstone for efficiently mapping large areas of the seafloor. We present comparative results based on challenging real underwater datasets that simulate multi-robot mapping.
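As a rough sketch of one common way to decide whether two images overlap (feature matching plus a geometric check); the match-count threshold and the homography verification are illustrative assumptions, not the topology-estimation procedure of the paper:

```python
# Sketch of overlap detection between two images via ORB feature matching.
# The thresholds and the RANSAC homography check are illustrative assumptions.
import cv2
import numpy as np

def images_overlap(img_a, img_b, min_inliers=25):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return False

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    if len(matches) < min_inliers:
        return False

    # Geometric verification: count RANSAC inliers of a homography fit.
    src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H is not None and int(mask.sum()) >= min_inliers
```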
Abstract:
In this paper, a simple and rapid method for evaluating the corrosion of galvanized steel sheet in a CuSO4 solution is presented as an experimental proposal for teaching corrosion. Galvanized steel corrosion occurs in tanks and tubing that carry natural or industrial waters containing soluble copper compounds; this was the rationale for choosing a Cu2+ ion solution as the oxidizing agent. The principle of the method is based on visual colorimetry, since the oxidant used has an intense blue color. Thus, a change in its concentration as a result of the corrosive process can be followed by a change in the color intensity of the solution, thereby allowing evaluation of the corrosion rate.
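A small worked calculation of the kind this setup allows, assuming the drop in Cu2+ concentration has already been read off a colour-intensity calibration: each mole of Cu2+ consumed corresponds to one mole of zinc dissolved (Zn + Cu2+ -> Zn2+ + Cu). The numerical values below are hypothetical.

```python
# Hypothetical worked example: estimating the zinc corrosion rate from the
# measured drop in Cu2+ concentration (Zn + Cu2+ -> Zn2+ + Cu, 1:1 molar ratio).
M_ZN = 65.38          # g/mol, molar mass of zinc

def corrosion_rate(delta_cu_mol_per_l, volume_l, area_m2, time_h):
    """Zinc mass loss per exposed area and time, in g m^-2 h^-1."""
    moles_cu_consumed = delta_cu_mol_per_l * volume_l
    zinc_mass_loss_g = moles_cu_consumed * M_ZN      # 1 mol Zn per mol Cu2+
    return zinc_mass_loss_g / (area_m2 * time_h)

# Example (made-up values): 0.010 mol/L drop in 0.250 L over a 2.0 h test
# on a 0.0025 m^2 (25 cm^2) galvanized coupon.
print(corrosion_rate(0.010, 0.250, 0.0025, 2.0))     # ~32.7 g m^-2 h^-1
```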
Abstract:
This thesis presents two graphical user interfaces for the project DigiQ - Fusion of Digital and Visual Print Quality, a project for computationally modeling the subjective human experience of print quality by measuring the image with certain metrics. After presenting the user interfaces, methods for reducing the computation time of several of the metrics and of the image registration process required to compute them are described, together with details of their performance. The weighted sample method for the image registration process was able to significantly decrease the calculation times, at the cost of some error. The random sampling method for the metrics greatly reduced calculation time while maintaining excellent accuracy, but worked with only two of the metrics.
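A minimal sketch of the random-sampling idea: evaluate a pixel-wise metric on a random subset of positions instead of every pixel of the registered image. The metric shown (mean absolute difference) and the sample size are placeholders, not the DigiQ metrics themselves.

```python
# Sketch of random sampling for a pixel-wise metric: score a random subset of
# positions instead of the full image. The metric and sample size are placeholders.
import numpy as np

def sampled_metric(reference, scanned, n_samples=50_000, seed=0):
    rng = np.random.default_rng(seed)
    h, w = reference.shape[:2]
    ys = rng.integers(0, h, n_samples)
    xs = rng.integers(0, w, n_samples)
    diff = reference[ys, xs].astype(float) - scanned[ys, xs].astype(float)
    return float(np.abs(diff).mean())
```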
Abstract:
The aim of this study is to investigate consumer search behavior in high-involvement purchases. The results of this research provide a descriptive analysis of the information search phase, which is part of the decision-making process. The study focuses on the customer's choice of information sources, the motivation behind it, and the different factors that influence search behavior. Particular attention is paid to purchase categorization and the differences in information search between products and services. A qualitative research method was chosen for this study, and the data were gathered through ten theme interviews. Each interview participant describes his/her own search behavior in a product case and a service case. The results indicate that consumer search behavior varies according to purchase categorization and demographic, individual and situational factors. Moreover, the above-mentioned factors influence the purpose and position of the information search phase in a five-step decision-making model.
Abstract:
The problem of understanding how humans perceive the quality of a reproduced image is of interest to researchers of many fields related to vision science and engineering: optics and material physics, image processing (compression and transfer), printing and media technology, and psychology. A measure for visual quality cannot be defined without ambiguity because it is ultimately the subjective opinion of an “end-user” observing the product. The purpose of this thesis is to devise computational methods to estimate the overall visual quality of prints, i.e. a numerical value that combines all the relevant attributes of the perceived image quality. The problem is limited to consider the perceived quality of printed photographs from the viewpoint of a consumer, and moreover, the study focuses only on digital printing methods, such as inkjet and electrophotography. The main contributions of this thesis are two novel methods to estimate the overall visual quality of prints. In the first method, the quality is computed as a visible difference between the reproduced image and the original digital (reference) image, which is assumed to have an ideal quality. The second method utilises instrumental print quality measures, such as colour densities, measured from printed technical test fields, and connects the instrumental measures to the overall quality via subjective attributes, i.e. attributes that directly contribute to the perceived quality, using a Bayesian network. Both approaches were evaluated and verified with real data, and shown to predict well the subjective evaluation results.
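A crude stand-in for the first approach (quality as a visible difference between the reproduced image and the ideal reference): both images are blurred to coarsely mimic viewing conditions before a pixel-wise difference is pooled into a single number. This greatly simplifies the perceptual difference model of the thesis and is shown only to fix the idea.

```python
# Crude stand-in for the "visible difference" approach: blur both images to
# coarsely mimic the viewing distance, then pool a pixel-wise difference into
# one score. The actual thesis model is perceptual and far more detailed.
import numpy as np
from scipy.ndimage import gaussian_filter

def visible_difference_score(reference, reproduction, sigma=2.0):
    ref = gaussian_filter(reference.astype(float), sigma)
    rep = gaussian_filter(reproduction.astype(float), sigma)
    return float(np.sqrt(np.mean((ref - rep) ** 2)))   # lower = closer to the ideal
```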
Abstract:
The concept of Process Management has been used by managers and consultants who seek to improve both operational and managerial industrial processes. Its strength lies in focusing on the external client and on optimizing the internal processes in order to fulfill the client's needs. As the needs of the internal clients are addressed, a set of improvements takes place. The Taguchi method, because it calls for knowledge sharing between design engineers and the people engaged in the process, is a candidate for process management implementation. The objective of this paper is to propose such an application, aiming at improvements in the reliability of the results provided by the robust design of the Taguchi method.
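For reference, the standard Taguchi signal-to-noise ratios that robust design maximizes, for the three classical response types; the paper itself does not list these formulas, and the example values are placeholders.

```python
# Standard Taguchi signal-to-noise (S/N) ratios used in robust design;
# shown for reference only, with placeholder response values.
import numpy as np

def sn_larger_the_better(y):
    y = np.asarray(y, float)
    return -10 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):
    y = np.asarray(y, float)
    return -10 * np.log10(np.mean(y**2))

def sn_nominal_the_best(y):
    y = np.asarray(y, float)
    return 10 * np.log10(y.mean()**2 / y.var(ddof=1))

print(sn_larger_the_better([42.0, 45.5, 43.8]))   # higher S/N = more robust response
```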
Abstract:
The theoretical part of the study focused on business process management and business process modeling; the goal was to find a new business process modeling method for an electrical accessories manufacturing enterprise. The aim was to identify a few candidate business process modeling methods from which the company could choose the best one for its needs. The study was carried out as qualitative research, with an action study and a case study as the most important ways to collect data. In the empirical part of the study, examples of the company's processes modeled with the new modeling method, as well as the process modeling process itself, are presented. The new way of modeling processes especially improves the visual presentation of the processes and improves the understanding of how employees should work in the organizational interfaces of a process and in the interfaces between different processes. The result of the study is a new, unified way to model the company's processes, which makes the process models easier to understand and create. This improved readability makes it possible to reduce the costs caused by the unclear old process models.
Abstract:
Yandex is the dominant search engine in Russia, followed by the world leader Google. This study focuses on the performance differences between the two in search advertising in the context of tourism, by running two identical campaigns and measuring KPIs, such as CPA (cost per action), on both campaigns. Search engine advertising is a new and fast-changing form of advertising, which should be studied frequently in order to keep up with the changes. The research was done as an experimental study in cooperation with a Finnish tourism company, and the data were gathered from the clickstream rather than from questionnaires, which is the method recommended in the literature. The results of the study suggest that Yandex.Direct performed better in the selected niche and that individual campaign planning for Yandex.Direct and Google AdWords is an important part of optimizing search advertising in Russia.
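The basic KPI arithmetic behind such a comparison uses generic definitions only; the figures below are placeholders, not data from the study.

```python
# Generic search-advertising KPI definitions; the numbers are placeholders.
def kpis(cost, impressions, clicks, conversions):
    return {
        "CTR": clicks / impressions,          # click-through rate
        "CPC": cost / clicks,                 # cost per click
        "CR": conversions / clicks,           # conversion rate
        "CPA": cost / conversions,            # cost per action
    }

print(kpis(cost=500.0, impressions=40_000, clicks=800, conversions=32))
# {'CTR': 0.02, 'CPC': 0.625, 'CR': 0.04, 'CPA': 15.625}
```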
Abstract:
We studied the distribution of NADPH-diaphorase activity in the visual cortex of normal adult New World monkeys (Saimiri sciureus) using the malic enzyme "indirect" method. NADPH-diaphorase neuropil activity had a heterogeneous distribution. In coronal sections, it had a clear laminar pattern that was coincident with Nissl-stained layers. In tangential sections, we observed blobs in the supragranular layers of V1 and stripes throughout the entire V2. We quantified and compared the tangential distribution of NADPH-diaphorase and cytochrome oxidase blobs in adjacent sections of the supragranular layers of V1. Although their spatial distributions were rather similar, the two enzymes did not always overlap. The histochemical reaction also revealed two different types of stained cells: a slightly stained subpopulation and a subgroup of deeply stained neurons resembling a Golgi impregnation. These neurons were sparsely spined non-pyramidal cells. Their dendritic arbors were very well stained, but their axons were not always evident. In the gray matter, heavily stained neurons showed different dendritic arbor morphologies. However, most of the strongly reactive cells lay in the subjacent white matter, where they presented a more homogeneous morphology. Our results demonstrate that the pattern of NADPH-diaphorase activity is similar to that previously described in Old World monkeys.
Abstract:
This master's thesis investigates how analysing the behaviour of an online store's visitor flow can support well-founded decisions about the relevant items and their parameters in a situation where more extensive historical data on realized sales is lacking. Based on the literature review, a solution model was formed that rests on constructing and testing potential demand drivers. The driver selected on the basis of the test series is used to estimate item demand, so that it can be used in place of realized sales, for example in a Pareto analysis. In this way, attention can be focused on a limited number of high-importance items and on their detailed parameters that matter in the customer's purchase decisions. In addition, items can be identified whose problem is either poor online visibility or incompatibility with customer needs. The drivers are tested by examining the agreement of cumulative distribution functions, which consists of three consecutive steps: visual inspection, a two-sample, two-sided Kolmogorov-Smirnov goodness-of-fit test, and a Pearson correlation test. The model and the demand driver produced with it were tested in an online store aimed at boating consumers, where it identified, at the top of the Pareto distribution, a large number of items whose parameters contained factors unfavourable to sales. At the other end of the distribution, hundreds of items were identified whose problem is apparently either poor online visibility or incompatibility with customer needs.
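A minimal sketch of the two statistical steps of the driver test, using SciPy's two-sample KS test and Pearson correlation on item-level paired series (candidate driver value and a reference series per item); the variable names, pairing and significance level are assumptions, and the visual CDF inspection of the first step is simply a plot and is not shown.

```python
# Minimal sketch of the driver-testing steps: compare a candidate demand driver
# against a reference series per item with a two-sample, two-sided
# Kolmogorov-Smirnov test and a Pearson correlation test.
from scipy.stats import ks_2samp, pearsonr

def test_demand_driver(driver_values, reference_values, alpha=0.05):
    ks_stat, ks_p = ks_2samp(driver_values, reference_values)
    r, r_p = pearsonr(driver_values, reference_values)   # requires paired, equal-length series
    return {
        "ks_ok": ks_p > alpha,          # distributions not significantly different
        "corr_ok": r_p < alpha and r > 0,
        "ks_stat": ks_stat,
        "pearson_r": r,
    }
```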
Abstract:
Several methods are used to estimate the anaerobic threshold (AT) during exercise. The aim of the present study was to compare the AT obtained by a graphic visual method for the estimation of ventilatory and metabolic variables (gold standard) to a bi-segmental linear regression mathematical model of Hinkley's algorithm applied to heart rate (HR) and carbon dioxide output (VCO2) data. Thirteen young (24 ± 2.63 years old) and 16 postmenopausal (57 ± 4.79 years old) healthy and sedentary women were submitted to a continuous ergospirometric incremental test on an electromagnetically braked cycle ergometer with 10 to 20 W/min increases until physical exhaustion. The ventilatory variables were recorded breath-to-breath and HR was obtained beat-to-beat in real time. Data were analyzed by the nonparametric Friedman test and the Spearman correlation test with the level of significance set at 5%. Power output (W), HR (bpm), oxygen uptake (VO2; mL kg-1 min-1), VO2 (mL/min), VCO2 (mL/min), and minute ventilation (VE; L/min) data observed at the AT level were similar for both methods and groups studied (P > 0.05). The VO2 (mL kg-1 min-1) data showed a significant correlation (P < 0.05) between the gold standard method and the mathematical model when applied to HR (rs = 0.75) and VCO2 (rs = 0.78) data for the subjects as a whole (N = 29). The proposed mathematical method for the detection of changes in the response patterns of VCO2 and HR was adequate and promising for AT detection in young and middle-aged women, representing a semi-automatic, non-invasive and objective AT measurement.
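A sketch of the bi-segmental idea: fit two straight lines to the response (e.g. VCO2 or HR against time or work rate) and place the breakpoint where the combined squared error is smallest. This illustrates the principle only and is not Hinkley's algorithm as implemented in the study.

```python
# Sketch of bi-segmental linear regression: try every candidate breakpoint,
# fit one line to each segment, and keep the breakpoint with the smallest
# total squared error. Illustrative only.
import numpy as np

def bisegmental_breakpoint(x, y, min_points=5):
    x, y = np.asarray(x, float), np.asarray(y, float)
    best_i, best_sse = None, np.inf
    for i in range(min_points, len(x) - min_points):
        sse = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            coef = np.polyfit(xs, ys, 1)
            sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if sse < best_sse:
            best_i, best_sse = i, sse
    return best_i   # index of the estimated threshold in the series
```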
Abstract:
Freezing of gait (FOG) can be assessed by clinical and instrumental methods. Clinical examination has the advantage of being available to most clinicians; however, it requires experience and may not reveal FOG even in cases confirmed by the medical history. Instrumental methods have the advantage that they may be used for ambulatory monitoring. The aim of the present study was to describe and evaluate a new instrumental method based on a force-sensitive resistor and Pearson's correlation coefficient (Pcc) for the assessment of FOG. Nine patients with Parkinson's disease in the "on" state walked through a corridor, passed through a doorway and made a U-turn. We analyzed 24 FOG episodes by computing the Pcc between one "regular/normal" step and the rest of the steps. The Pcc reached ±1 for "normal" locomotion, while the correlation diminished due to the lack of periodicity during FOG episodes. Gait was assessed in parallel with video. The FOG episodes determined from the video were all detected with the proposed method. The computed duration of the FOG episodes was compared with those estimated from the video. The method was sensitive to various types of freezing, although no differences between the freezing types were detected. The study showed that Pcc analysis permitted the computerized detection of FOG in a simple manner analogous to human visual judgment, and its automation may be useful in clinical practice to provide a record of the history of FOG.
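A minimal sketch of the correlation idea: correlate one "normal" reference step of the force-sensor signal with every other step of equal length; regular gait gives a correlation near ±1, while freezing breaks the periodicity and the correlation drops. The segmentation into steps and the flagging threshold are assumptions, not values from the study.

```python
# Minimal sketch: correlate a "normal" reference step with every other step;
# a drop in |r| marks a candidate FOG episode. The 0.5 threshold is assumed.
import numpy as np

def flag_possible_fog(reference_step, steps, threshold=0.5):
    flags = []
    for step in steps:
        n = min(len(reference_step), len(step))
        r = np.corrcoef(reference_step[:n], step[:n])[0, 1]
        flags.append(abs(r) < threshold)   # low correlation -> candidate FOG
    return flags
```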
Abstract:
A method for determining aflatoxins B1 (AFB1), B2 (AFB2), G1 (AFG1) and G2 (AFG2) in maize with Florisil clean-up was optimised for one-dimensional thin layer chromatography (TLC) analysis with visual and densitometric quantification. Aflatoxins were extracted with chloroform:water (30:1, v/v), purified through Florisil cartridges, separated on a TLC plate, and detected and quantified by visual and densitometric analysis. The in-house method performance characteristics were determined using spiked and naturally contaminated maize samples, and certified reference material. The mean recoveries for the aflatoxins were 94.2, 81.9, 93.5 and 97.3% in the ranges of 1.0 to 242 µg/kg for AFB1, 0.3 to 85 µg/kg for AFB2, 0.6 to 148 µg/kg for AFG1 and 0.6 to 140 µg/kg for AFG2, respectively. The correlation values between visual and densitometric analysis for spiked samples were higher than 0.99 for AFB1, AFB2 and AFG1, and 0.98 for AFG2. The mean relative standard deviations (RSD) for spiked samples were 16.2, 20.6, 12.8 and 16.9% for AFB1, AFB2, AFG1 and AFG2, respectively. The RSD of the method for a naturally contaminated sample (n = 5) was 16.8% for AFB1 and 27.2% for AFB2. The limits of detection (LD) of the method were 0.2, 0.1, 0.1 and 0.1 µg/kg and the limits of quantification (LQ) were 1.0, 0.3, 0.6 and 0.6 µg/kg for AFB1, AFB2, AFG1 and AFG2, respectively.
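The routine arithmetic behind the reported validation figures (recovery and relative standard deviation) is shown below; the example values are placeholders, not the paper's raw data.

```python
# Routine arithmetic behind recovery and RSD figures; example numbers are placeholders.
import numpy as np

def recovery_percent(measured, spiked):
    return 100.0 * measured / spiked

def rsd_percent(replicates):
    replicates = np.asarray(replicates, float)
    return 100.0 * replicates.std(ddof=1) / replicates.mean()

print(recovery_percent(measured=9.4, spiked=10.0))          # 94.0 %
print(round(rsd_percent([9.1, 10.4, 8.7, 9.9, 9.5]), 1))    # ~7.0 % for 5 replicates
```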