990 results for Google API
Abstract:
This thesis examines professional translators' information seeking when only online sources are available. The study looks at where and how professional translators search for information on the internet while translating a source text from English into Finnish. In addition, the study aims to show that information-seeking skills and source criticism are translation competences that should be both maintained and taught as part of translator training. The research material was collected empirically using three methods. The translation process and the information seeking that took place during it were recorded with the Camtasia screen-recording software and the Translog-II keystroke-logging program. In addition, the participating translators filled in two questionnaires, the first containing background questions and the second retrospective questions about the process itself. The questionnaires were implemented with the Webropol survey tool. Material was collected from a total of five test sessions. The study examined in closer detail the information-seeking actions of three professional translators by isolating from the translation processes those pauses during which the translators searched for information on the internet. Regarding the online sources used, the results matched those of earlier studies: the most frequently used were Google, Wikipedia, and various online dictionaries. However, this study revealed that professional translators' information-seeking patterns vary depending on both the translator's field of specialization and the level of their information-seeking skills. When forced to work outside their familiar working environment and their own field of specialization, some professional translators also resort to the more rudimentary information-seeking techniques that translation students have commonly been observed to use. The results also revealed that information seeking can take up to 70 percent of the total time spent on the translation process, depending on the translator's prior knowledge of the source text's subject area and the efficiency of their information seeking. Based on these results, professional translators, too, should develop their information-seeking skills to keep their translation process efficient. In addition, translators should remember to evaluate their information sources critically: source criticism is needed especially when using online sources. For this reason, information-seeking skills and source criticism should be taught and practiced already as part of translator training. Translators should also not leave their information seeking to online sources alone, but continue to make use of printed sources as well as human sources.
Abstract:
The isolation and identification of microorganisms in raw milk is of public health interest because, depending on the species isolated, targeted actions can be taken to improve its quality. Milk spoilage is mainly a consequence of the growth of psychrotrophic microorganisms, which produce heat-stable lipases and proteases that are not denatured during pasteurization, imparting rancid and bitter flavors and odors, respectively. Thus, the objective of this work was to isolate and identify the main genera of bacteria belonging to the family Enterobacteriaceae, oxidase-positive Gram-negative bacteria, and the genera Staphylococcus and Enterococcus, as well as to assess lipase and protease activity, in milk from 16 rural properties in the municipality of Boa Esperança-MG. Gram-negative bacteria were isolated on eosin methylene blue (EMB) agar and Hektoen Enteric agar. Staphylococci were isolated on Baird-Parker agar and Enterococcus on KF agar. Colonies of interest were collected and subjected to Gram staining and to catalase and oxidase tests. After these procedures, the selected isolates were identified using Bactray I, II, and III; Api 20 Strep; and tests suggested by Bergey's Manual of Determinative Bacteriology. Serological identification of Enterococcus was performed using Prolex. The milk from the 16 properties contained strains of fecal microorganisms such as Escherichia coli and Lancefield group D Enterococcus. Oxidase-positive Gram-negative bacteria were identified on five properties. Staphylococcus were found on 10 properties. The milk collected on the investigated farms carries microorganisms that compromise its quality. All groups of microorganisms tested showed lipase and protease activity.
Abstract:
Probiotics are supplementary foods based on microbial strains that improve animal health beyond basic nutrition. Probiotics are consumed orally and are considered normal inhabitants of the intestines, able to survive enzymatic and biliary secretions. Kefir is a probiotic originating from the old continent, fermented by several bacteria and yeasts encapsulated in a polysaccharide matrix resembling jelly grains. Kefir is also available as a sourish fermented product, in either sugary or milky suspensions containing vitamins, amino acids, peptides, carbohydrates, ethanol, and volatile compounds. Kefir is known to have a microbial content that differs depending on the country and the fermentative substrates, which causes distinct probiotic effects. In this sense, the purpose of this work was to isolate, identify, and quantify the microbial content of a native sugary kefir sample (fermented suspension and lyophilized natural grains). Serial dilutions were plated on Rogosa agar (AR) and De Man, Rogosa and Sharpe (MRS) agar, for Lactobacillus; Brain Heart Infusion (BHI), for total bacteria; Sabouraud Dextrose Agar (SDA), for yeasts and filamentous fungi; Thioglycolate Agar (TA), for Streptococcus, Acetobacteria, and Leuconostoc; and Coconut Water Agar (CWA) and CWA supplemented with yeast extract (CWAY), for various genera. Genera and species of all strains were identified through biochemical reactions and specific API systems. The microbial profile of this kefir differed from that of grains from other sources, despite the presence of similar microorganisms as well as others that have not been reported yet. The data obtained with the CWA and CWAY media suggest that both substrates are suitable alternative media for culturing kefir strains.
Abstract:
Yellowfin tuna has a high level of free histidine in its muscle, which can lead to histamine formation by microorganisms if temperature abuse occurs during handling and further processing. The objective of this study was to measure histamine levels in damaged and undamaged thawed muscle, to determine the effect of physical damage on the microbial count and on histamine formation during the initial steps of canning, and to isolate and identify the main histamine-forming microorganisms present in the flesh of yellowfin tuna. Total mesophilic and psychrophilic microorganisms were determined using the standard plate method. The presence of histamine-forming microorganisms was determined on a modified Niven's agar. Strains were further identified using the API 20E kit for Enterobacteriaceae and Gram-negative bacilli. Physically damaged tuna did not show higher microbiological contamination than undamaged muscle. The most active histamine-forming microorganism present in tuna flesh was Morganella morganii. Other decarboxylating microorganisms present were Enterobacter agglomerans and Enterobacter cloacae. Physical damage to tuna during catching and handling did not increase the level of histamine or the number of microorganisms present in tuna meat during frozen transportation, but damaged fish showed a higher risk of histamine-forming microorganism growth during processing.
Abstract:
The emergence of depth sensors has made it possible to track not only monocular cues but also the actual depth values of the environment. This is especially useful in augmented reality solutions, where the position and orientation (pose) of the observer need to be determined accurately. This allows virtual objects to be placed in the user's view through, for example, the screen of a tablet or augmented reality glasses (e.g. Google Glass). Although early 3D sensors have been physically quite large, their size is decreasing, and eventually a 3D sensor could perhaps be embedded, for example, in augmented reality glasses. The wider subject area considered in this review is 3D SLAM methods, which take advantage of the 3D information provided by modern RGB-D sensors such as Microsoft Kinect. A review of SLAM (Simultaneous Localization and Mapping) and 3D tracking in augmented reality is therefore a timely subject. We also try to identify the limitations and possibilities of different tracking methods, and how they should be improved in order to allow efficient integration of these methods into the augmented reality solutions of the future.
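The review itself is narrative, but the core computation behind RGB-D tracking can be made concrete. Below is a minimal, generic sketch (not taken from the review) of the rigid-transform estimation step shared by most RGB-D SLAM pipelines: the Kabsch/Procrustes solution for aligning two 3D point sets, assuming correspondences are already known (in a real pipeline they come from feature matching or ICP).

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with R @ P_i + t ≈ Q_i,
    given corresponding 3D points P, Q of shape (n, 3) (Kabsch algorithm)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)               # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Illustrative check with a synthetic camera motion between two "frames".
rng = np.random.default_rng(0)
P = rng.normal(size=(100, 3))
angle = 0.1
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0,              0,             1]])
Q = P @ R_true.T + np.array([0.05, 0.0, 0.02])
R, t = rigid_transform(P, Q)
assert np.allclose(R, R_true, atol=1e-8)
```

Chaining such frame-to-frame estimates gives the observer pose, which is exactly what the SLAM methods surveyed in the review refine globally.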
Abstract:
Introduction: Chronic kidney disease (CKD) and smoking are public health problems. Objective: To analyze smoking as a risk factor for CKD progression. Methods: A systematic review was conducted in the Medline, LILACS, SciELO, Google Scholar, Trials.gov, and Embase databases, covering articles published up to February 2013. Cohort studies, clinical trials, and case-control studies were included, conducted in humans aged ≥ 18 years with smoking as a risk factor for CKD progression. Studies that did not mention smoking and CKD in the title or that proposed smoking-cessation interventions were excluded. Results: Of the 94 citations, 12 articles were selected. Of these, six were multicenter studies conducted in developed countries and four were randomized. Males predominated (51%-76%). Progression associated with smoking was found in 11 studies. Consumption of ≥ 15 pack-years was found to increase the risk of CKD progression. Conclusion: Smoking is a risk factor for CKD progression.
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to conventional belief, there exists an almighty lie detection method that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria: accuracy (the probability of detecting deception successfully), ease of use (how easily the method can be applied correctly), the time required to apply the method reliably, the absence of any need for special equipment, and the unobtrusiveness of the method? To answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? And what kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing studies and other papers on the topic. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain firsthand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence, and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information (Content, Discourse, and Nonverbal Communication) can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies are built around a scenario in which roughly half of the assessed people are entirely truthful and the other half are liars presenting a well-prepared cover story, it is proposed that future studies test lie detection and veracity assessment methods against partially truthful human sources. Such a test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for modern ones still under development.
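As a hedged illustration of the Analytic Hierarchy Process step mentioned above (not the study's actual comparison data), the sketch below derives criterion weights from a Saaty-style pairwise comparison matrix via its principal eigenvector; the comparison values are invented for the five criteria named in the research question.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparisons of the five criteria:
# accuracy, ease of use, time required, no special equipment, unobtrusiveness.
# A[i, j] says how much more important criterion i is than criterion j.
A = np.array([
    [1,   3,   5,   7,   5],
    [1/3, 1,   3,   5,   3],
    [1/5, 1/3, 1,   3,   1],
    [1/7, 1/5, 1/3, 1,   1/3],
    [1/5, 1/3, 1,   3,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)     # principal eigenvalue of the matrix
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                    # normalized criterion weights

# Consistency ratio, using Saaty's random index for n = 5 (RI = 1.12);
# CR below 0.10 is conventionally taken as acceptably consistent.
n = len(A)
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 1.12, 3))
```

In the study's setting, each lie detection method would then be scored against every criterion the same way, and the criterion weights combine those scores into the overall applicability ranking.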
Abstract:
The purpose of this thesis is to compare graphics tools for web pages. The thesis covers two web technologies related to 2D graphics: SVG and the HTML5 canvas element. First, both technologies are introduced; then the use of each is demonstrated through examples and images. Different ways of implementing animations are also presented. Finally, the technologies and their possible uses are compared. The thesis seeks to answer the following questions: What are SVG and the HTML5 canvas element? How are they used? How do they differ from each other? What use cases are they suited for? SVG is a scalable vector graphics format for presenting two-dimensional graphics. SVG is based on vectors and is described with an XML-style language. SVG is suitable for situations where a figure needs to be scaled up and down without loss of quality. The canvas element, new in HTML5, creates a drawing surface on a web page that can be drawn on with JavaScript through the Canvas API. Images drawn on a canvas are based on a bitmap, so the technology is best suited to pixel-level operations and, for example, to implementing web applications.
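To make SVG's XML-style description concrete, here is a minimal sketch assuming nothing beyond the SVG 1.1 specification: a Python script that writes a standalone SVG file containing one circle. The file name and dimensions are arbitrary illustration choices.

```python
# Minimal sketch: emit a standalone SVG document with one circle.
# SVG is plain XML text, so no graphics library is needed to produce it.
svg = """<svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">
  <!-- A circle described declaratively; the renderer redraws it at
       whatever scale the page requests, so it stays sharp. -->
  <circle cx="100" cy="100" r="80" fill="steelblue"/>
</svg>
"""

with open("circle.svg", "w", encoding="utf-8") as f:
    f.write(svg)
```

The equivalent canvas drawing (a JavaScript ctx.arc() followed by ctx.fill()) would instead rasterize the circle into a fixed bitmap, which is why canvas suits pixel-level operations while SVG scales without quality loss.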
Abstract:
This thesis mapped the job description of the craft subject teacher under the Finnish national core curriculum for basic education (2014), from the perspective of students graduating in the near future. The research question is: What kinds of views do craft subject teacher students have of the craft subject teacher's job description? The sub-questions are: What kinds of challenges and opportunities do craft subject teacher students see in their future work? The thesis was carried out as a qualitative phenomenographic study, in which the respondents' differing views are the object of interest. The survey was conducted as a Google Forms online questionnaire with open questions, sent to all master's-level students (58 students) majoring in craft education. Fifteen students responded. The responses were analyzed by theory-guided content analysis, using Blomberg's (2008) model of the teacher working as a member of a group (the work community), as a leader of a group (teaching situations), and as an actor in networks. According to the results, the craft subject teacher's work is weighted toward teaching situations, with growing emphasis on cooperation at both the work-community and network levels. The demands of the work are expected to grow as time becomes scarcer and content expands. The craft subject teacher's most important tasks are teaching basic everyday skills and producing enjoyment for pupils. Professional development was also considered important. The need for individual guidance, occupational safety, and working facilities are special characteristics of the craft subject teacher's work. The new core curriculum (2014) enables many ways of implementing the subject and many opportunities for cooperation. On the other hand, the students find interpreting the core curriculum challenging. Technological development and lack of time also pose their own challenges to the craft subject teacher's work. In conclusion, the students are ready to develop the craft subject in a multi-material direction and to integrate craft with other subjects. The students are also clearly familiar with the core curriculum (2014). As further research topics we would like to study: How do craft subject teachers interpret the core curriculum (2014)? The future of the craft subject: will its content change or stay the same? What will future elective subjects be like? Will co-teaching be realized in crafts in the future?
Abstract:
This thesis discusses methods of visitor tracking and puts them into practice. The operation of web analytics software is explored, focusing mainly on Google Analytics. The goal is to determine the usage volumes of Lappeenranta's tourist information kiosks and to break them down by device. A literature review of web analytics is carried out, and visitor tracking data from two different websites is analyzed and compared. In addition, the logs of the tourist kiosks' website are examined by data mining with a Python application developed for the purpose. Based on this work, it can be concluded that with the current implementation the usage volumes of the tourist kiosks cannot be separated by device. The number of sessions and events can, however, be tracked. Several problems are identified in the kiosks' visitor tracking, such as the distorting effect of the kiosks' automatic page refresh on the results, the partial Google Analytics integration, and, most importantly, the lack of a unique identifier for each kiosk. The thesis proposes solutions that enable effective use of visitor tracking and device-specific monitoring. The results highlight the importance of a planned, systematic approach to implementing visitor tracking.
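The thesis's own Python application is not reproduced here; the following is a minimal sketch of the kind of log mining it describes, assuming a combined-format access log and a 30-minute inactivity timeout, with the (IP, user agent) pair standing in for the per-kiosk identifier whose absence the thesis identifies as the key problem.

```python
import re
from collections import defaultdict
from datetime import datetime, timedelta

# Assumed combined log format:
# IP - - [timestamp] "request" status size "referer" "user-agent"
LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')
TIMEOUT = timedelta(minutes=30)

def count_sessions(path):
    """Count sessions per client, where a session ends after 30 min of inactivity."""
    last_hit, sessions = {}, defaultdict(int)
    with open(path) as f:
        for line in f:
            m = LINE.match(line)
            if not m:
                continue
            ip, ts, agent = m.groups()
            t = datetime.strptime(ts.split()[0], "%d/%b/%Y:%H:%M:%S")
            client = (ip, agent)  # rough proxy for a device identity
            if client not in last_hit or t - last_hit[client] > TIMEOUT:
                sessions[client] += 1
            last_hit[client] = t
    return sessions
```

With a genuine kiosk identifier (for example, a query parameter injected into each kiosk's start page), the grouping key would become unambiguous, which is essentially the device-specific tracking the thesis proposes.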
Abstract:
The value of online business has grown to over one trillion USD. This thesis is about search engine optimization, whose focus is to improve search engine rankings. Search engine optimization is an important branch of online marketing because the first page of search engine results generates the majority of search traffic. Current articles about search engine optimization and Google indicate that with the proper use of quality content, there is potential to improve search engine rankings. However, the existing search engine optimization literature does not address content at a sufficient level. To narrow that gap, a content-centered method for search engine optimization is constructed, and the role of content in search engine optimization is studied. This content-centered method consists of three search engine optimization tactics: 1) content, 2) keywords, and 3) links. Two propositions were used to test these tactics in a real business environment, and the results suggest that the content-centered method improves search engine rankings. Search engine optimization is constantly changing because Google adjusts its search algorithm regularly. Still, some long-term trends can be recognized. Google has stated that content will grow in importance as a ranking factor. The content-centered method takes advantage of this trend in search engine optimization so as to remain relevant for years to come.
Abstract:
This study examines the efficiency of search engine advertising strategies employed by firms. The research setting is the online retailing industry, which is characterized by extensive use of Web technologies and high competition for market share and profitability. For Internet retailers, search engines increasingly serve as an information gateway for many decision-making tasks. In particular, search engine advertising (SEA) has opened a new marketing channel for retailers to attract new customers and improve their performance. In addition to natural (organic) search marketing strategies, search engine advertisers compete through keyword auctions for top advertisement slots provided by search brokers such as Google and Yahoo!. The rationale is that greater visibility on a search engine during a keyword search will capture customers' interest in a business and its product or service offerings. Search engines account for most online activities today. Compared with the slow growth of traditional marketing channels, online search volumes continue to grow at a steady rate. According to the Search Engine Marketing Professional Organization, spending on search engine marketing by North American firms in 2008 was estimated at $13.5 billion. Despite the significant role SEA plays in Web retailing, scholarly research on the topic is limited. Prior studies in SEA have focused on search engine auction mechanism design. In contrast, research on the business value of SEA has been limited by the lack of empirical data on search advertising practices. Recent advances in search and retail technologies have created data-rich environments that enable new research opportunities at the interface of marketing and information technology. This research uses extensive data from Web retailing and Google-based search advertising and evaluates Web retailers' use of resources, search advertising techniques, and other relevant factors that contribute to business performance across different metrics. The methods used include Data Envelopment Analysis (DEA), data mining, and multivariate statistics. This research contributes to empirical research by analyzing several Web retail firms in different industry sectors and product categories. One of the key findings is that the dynamics of sponsored search advertising vary between multi-channel and Web-only retailers. While the key performance metrics for multi-channel retailers include measures such as online sales, conversion rate (CR), click-through rate (CTR), and impressions, the key performance metrics for Web-only retailers focus on organic and sponsored ad ranks. These results provide a useful contribution to our organizational-level understanding of search engine advertising strategies, both for multi-channel and Web-only retailers. They also contribute to current knowledge of technology-driven marketing strategies and give managers a better understanding of sponsored search advertising and its impact on various performance metrics in Web retailing.
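As a hedged illustration of the DEA technique named above (not the study's model or data), the sketch below computes input-oriented CCR efficiency scores with scipy.optimize.linprog; the retailer input/output figures are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = decision-making units (retailers),
# X = inputs (ad spend, staff), Y = outputs (online sales, conversions).
X = np.array([[20.0, 5], [30, 8], [25, 6], [40, 9]])
Y = np.array([[100.0, 3], [140, 4], [110, 2], [150, 5]])

def ccr_efficiency(k):
    """Input-oriented CCR score of unit k: minimize theta such that a
    composite of all units uses <= theta * inputs_k and produces >= outputs_k."""
    n, m = X.shape                      # n units, m inputs
    s = Y.shape[1]                      # s outputs
    c = np.r_[1.0, np.zeros(n)]         # variables: [theta, lambda_1..lambda_n]
    # inputs:  sum_j lambda_j * X[j,i] - theta * X[k,i] <= 0
    A_in = np.c_[-X[k], X.T]
    # outputs: -sum_j lambda_j * Y[j,r] <= -Y[k,r]
    A_out = np.c_[np.zeros(s), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (1 + n))
    return res.fun

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```

A score of 1.0 marks a retailer on the efficient frontier; lower scores indicate how much it could proportionally shrink its inputs while matching its outputs, which is the kind of comparison the study draws across multi-channel and Web-only retailers.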
Abstract:
This work consists of a theoretical part and an experimental one. The first part provides a simple treatment of the celebrated von Neumann minimax theorem as formulated by Nikaidô and Sion, and discusses its relationships with fundamental theorems of convex analysis. The second part is about externality in sponsored search auctions. It shows that in these auctions advertisers exert externality effects on each other that influence their bidding behavior. It presents Hal R. Varian's model and shows how adding externality to this model affects its properties. In order to better understand the interaction among advertisers in online auctions, it studies the structure of the Google advertisement network and shows that it is a small-world, scale-free network.
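For reference, here is a standard statement of the theorem treated in the first part, in Sion's general form together with von Neumann's original matrix-game form; this is quoted from the standard literature, not from the thesis itself.

```latex
% Sion's minimax theorem (generalizing von Neumann's): let X be a compact
% convex subset of a linear topological space and Y a convex subset of a
% linear topological space. If f(x, y) is quasi-concave and upper
% semicontinuous in x for each fixed y, and quasi-convex and lower
% semicontinuous in y for each fixed x, then
\[
  \max_{x \in X} \inf_{y \in Y} f(x, y)
  \;=\;
  \inf_{y \in Y} \max_{x \in X} f(x, y).
\]
% For finite matrix games with payoff matrix A and mixed strategies p, q
% ranging over the probability simplices, von Neumann's original form reads
\[
  \max_{p \in \Delta_m} \min_{q \in \Delta_n} p^{\top} A q
  \;=\;
  \min_{q \in \Delta_n} \max_{p \in \Delta_m} p^{\top} A q .
\]
```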
Abstract:
Complex networks have recently attracted a significant amount of research attention due to their ability to model real-world phenomena. One important problem often encountered is limiting diffusive processes spreading over the network, for example mitigating the spread of pandemic disease or computer viruses. A number of problem formulations have been proposed that aim to solve such problems based on desired network characteristics, such as maintaining the largest network component after node removal. The recently formulated critical node detection problem aims to remove a small subset of vertices from the network such that the residual network has minimum pairwise connectivity. Unfortunately, the problem is NP-hard, and the number of constraints is cubic in the number of vertices, making very large scale instances impossible to solve with traditional mathematical programming techniques. Many approximation strategies, such as dynamic programming and evolutionary algorithms, are likewise unusable for networks containing thousands to millions of vertices. A computationally efficient and simple approach is required in such circumstances, but none currently exists. In this thesis, such an algorithm is proposed. The methodology is based on a depth-first search traversal of the network and a specially designed ranking function that considers information local to each vertex. Due to the variety of network structures, a number of characteristics must be taken into consideration and combined into a single rank that measures the utility of removing each vertex. Since removing a vertex sequentially impacts the network structure, an efficient post-processing algorithm is also proposed to quickly re-rank vertices. Experiments are conducted on a range of common complex network models with varying numbers of vertices, in addition to real-world networks. The proposed algorithm, DFSH, is shown to be highly competitive and often outperforms existing strategies such as Google PageRank for minimizing pairwise connectivity.
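The DFSH ranking function itself is not reproduced here; as a generic sketch of the objective it optimizes, the following computes the residual pairwise connectivity of a graph after a vertex set is removed, with a simple degree-based ranking standing in for the thesis's DFS-based heuristic (the graph model and parameters are illustrative).

```python
import networkx as nx

def pairwise_connectivity(G, removed):
    """Sum of |C| * (|C| - 1) / 2 over connected components C of the
    residual graph: the number of still-connected vertex pairs that the
    critical node detection problem tries to minimize."""
    H = G.copy()
    H.remove_nodes_from(removed)
    return sum(len(c) * (len(c) - 1) // 2 for c in nx.connected_components(H))

# Illustrative use: remove the k highest-degree vertices, a naive local
# ranking that a method like DFSH aims to beat on this same objective.
G = nx.barabasi_albert_graph(1000, 2, seed=1)
k = 10
by_degree = sorted(G.nodes, key=G.degree, reverse=True)[:k]
print("residual pairwise connectivity:", pairwise_connectivity(G, by_degree))
```

Because each removal changes component structure, a sequential method must refresh its ranking between removals, which is exactly the role of the fast re-ranking post-processing step described in the abstract.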