310 results for surf oholak
Abstract:
Data archive containing raw data on burrowing times and the proportions of clams that burrowed successfully.
Abstract:
Cruise Mn-74-02 of the R/V MOANA WAVE was the second part of the field work of the NSF/IDOE Inter-University Ferromanganese Research Program in 1974, and we gratefully acknowledge the support of the Office for the International Decade of Ocean Exploration and the Office of Oceanographic Facilities and Support. This program was designed to investigate the origin, growth, and distribution of copper/nickel-rich manganese nodules in the Pacific Ocean. The field effort was designed to satisfy the sample requirements of the fifteen principal investigators while increasing general knowledge of the copper/nickel-rich nodule deposits of the equatorial Pacific. This report is the second of a series of cruise reports designed to assist sample requests for documented nodules, sediment, and water samples so that laboratory results can be realistically compared and related to the environment of nodule growth. Nodule samples and bathymetric and navigational data are archived at the Hawaii Institute of Geophysics, University of Hawaii. Bulk chemical analyses of nodules and reduction of survey data were carried out in Hawaii. Sediment cores were stored at the University of Hawaii and at Scripps Institution of Oceanography. The SIO analytical facility provided stratigraphic data on sediment chemistry.
Abstract:
Nanotechnology is a multidisciplinary science that is booming today, providing new products with attractive physicochemical properties for many applications. In the agri/feed/food sector, nanotechnology offers great opportunities for obtaining innovative products and applications for agriculture and livestock, water treatment, and the production, processing, storage and packaging of food. To this end, a wide variety of nanomaterials is applied, ranging from metals and inorganic metal oxides to organic nanomaterials carrying bioactive ingredients. This review presents an overview of current and future applications of nanotechnology in the food industry. Food additives and food-contact materials are currently the main applications, while future applications are expected in the fields of nano-encapsulation and nanocomposites, in products such as novel foods, additives, biocides, pesticides and food-contact materials.
Abstract:
Tsar Peter the Great ruled Russia between 1689 and 1725. His domains stretched from the Baltic Sea in the west to the Pacific Ocean in the east; from north to south, his empire extended from the Arctic Ocean to the borders with China and India. Tsar Peter I sought to extend the geographical knowledge of his government and of the rest of the world. He was also interested in the expansion of Russian trade and in the control of trade routes. Feodor Luzhin and Ivan Yevreinov explored the eastern border of the Russian Empire, travelling between 1719 and 1721, and reported to the Tsar. They had crossed the Kamchatka peninsula from west to east and had travelled from the west coast of Kamchatka to the Kuril Islands. The information they collected led to the first map of Kamchatka and the Kuril Islands. Tsar Peter ordered Bering to travel to the Russian Pacific coast, build ships and sail the seas north along the coast toward the regions of America. The second expedition encountered the same difficulties as the previous explorers. Two ships were eventually launched at Okhotsk in 1740. The explorers spent the winter of 1740-1741 stockpiling supplies and then sailed to Petropavlovsk. The two ships sailed eastward together until June 20, when they were separated by fog. After searching for Chirikov and his ship for several days, Bering ordered the St. Peter to continue to the northeast. There the Russian sailors first sighted Alaska. According to the log, "At 12:30 (p.m., July 17) came in sight of snow-capped mountains and among them a high volcano." The sighting came on St. Elias's Day, and the mountain was named accordingly.
Abstract:
This presentation summarizes experience with the automated speech recognition and translation approach realised in the context of the European project EMMA.
Abstract:
This presentation explains how RAGE develops reusable game technology components and provides examples of their application.
Abstract:
[EN] Sea turtles bury their eggs in the sand of the beach, where they incubate. After a period of approximately two months, hatchlings break the eggshell and remain inside the chamber for three to seven days (Hays & Speakman, 1993). Then they leave the nest and emerge to the surface of the beach, going quickly towards the surf to begin their pelagic and developmental stage (e.g., López-Jurado & Andreu, 1998). Hatchlings usually do not emerge from the nest as a single group; they emerge in groups at different moments, resulting in more than one emergence per nest over some days (Witherington et al., 1990; Hays et al., 1992; Peters et al., 1994).
Abstract:
In recent decades, work on infrared sensor applications has advanced considerably worldwide. A difficulty remains, however: objects are not clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing, non-destructive testing and related technologies. This thesis addresses infrared image enhancement techniques in two respects: the processing of a single infrared image in the hybrid space-frequency domain, and the fusion of infrared and visible images using the non-subsampled contourlet transform (NSCT). Image fusion can be seen as a continuation of the single infrared image enhancement model: it combines infrared and visible images into a single image that represents and enhances all the useful information and features of the source images, since a single image cannot contain all the relevant or available information because of the restrictions inherent in any single imaging sensor. We review the development of infrared image enhancement techniques, then focus on single infrared image enhancement and propose a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method, which yields higher image quality and improves human visual perception. The infrared and visible image fusion techniques are built on an accurate registration of the source images acquired by the different sensors. The SURF-RANSAC algorithm is applied for registration throughout this research, which leads to very accurately registered images and increased benefits for the fusion processing. For the fusion of infrared and visible images, a series of advanced and efficient approaches is proposed. A standard multi-channel NSCT-based fusion method is presented as a reference for the fusion approaches that follow. A joint fusion approach involving the Adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, which leads to fusion results that are better than those obtained with general non-adaptive methods. An NSCT-based fusion approach is proposed that employs compressed sensing (CS) and total variation (TV) on sparsely sampled coefficients and performs an accurate reconstruction of the fused coefficients; it obtains much better fusion results through pre-enhancement of the infrared image and by reducing the redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results more rapidly and efficiently.
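As a rough illustration of the registration step named in this abstract, the sketch below aligns an infrared image to a visible image using SURF keypoints and a RANSAC-estimated homography. It assumes an OpenCV build with the non-free xfeatures2d module (SURF is patented and absent from default builds); the file names, Hessian threshold and ratio-test constant are illustrative placeholders, not values from the thesis.

    # Sketch: SURF + RANSAC registration of an infrared image onto a visible one.
    import cv2
    import numpy as np

    visible = cv2.imread("visible.png", cv2.IMREAD_GRAYSCALE)    # placeholder file
    infrared = cv2.imread("infrared.png", cv2.IMREAD_GRAYSCALE)  # placeholder file

    # SURF lives in the non-free xfeatures2d module of opencv-contrib.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_ir, des_ir = surf.detectAndCompute(infrared, None)
    kp_vis, des_vis = surf.detectAndCompute(visible, None)

    # Match descriptors and keep candidates that pass Lowe's ratio test.
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_ir, des_vis, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    # RANSAC rejects outlier correspondences while estimating the homography.
    src = np.float32([kp_ir[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_vis[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp the infrared image into the visible frame before any fusion step.
    registered_ir = cv2.warpPerspective(infrared, H, (visible.shape[1], visible.shape[0]))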
Abstract:
Final Master's project submitted to obtain the degree of Master in Mechanical Engineering.
Abstract:
Detailed bathymetric data obtained on the shoreface adjacent to the Molhe Oeste (West Jetty), in the area known as the Três Marias bank, revealed the presence of a scour hole next to the tip of the jetty and a dome-shaped bank SW of the depression. The bank has its base and crest at depths of 8 m and 6 m, respectively. The feature is roughly oval, with its major axis semi-parallel to the coast (NNE-SSW) measuring approximately 1600 m. The minor axis, nearly transverse to the coast (E-W), is about 1200 m long. The bank represents a remnant of the terminal lobe of the lagoon's ebb-tidal delta (bar), formed when its inlet was fixed. The scour hole next to the jetty head has a steep slope, starting at the 8 m isobath and reaching the 17 m isobath, and was probably formed by the action of the SW-to-NE longshore current. Sediment samples were also collected. These data revealed at least three sub-environments reflecting different energy levels. From the most energetic to the least: the bank, where the sediment is coarser and better sorted; the shoreface, with slightly finer sediment; and the scour hole, where poorly sorted mud was sampled.
Abstract:
The studies have aimed to overcome the confusing variety of existing persistent identifier systems by: analysing the current national URN:NBN and other identifier initiatives; providing guidelines for an international harmonized persistent identifier framework that serves the long-term preservation needs of the research and cultural heritage communities; and advising these communities on a roadmap to gain the potential benefits. This roadmap also includes a blueprint for an organisation for the distribution and maintenance of the persistent identifier infrastructure. These studies are connected to the broader PersID project with DEFF, SURF, DANS, the national libraries of Germany, Finland and Sweden, and CNR and FDR from Italy. A number of organisations have been involved in the process: Europeana, the British Library, the Dutch Royal Library, the National Library of Norway and the Ministry of Education, Flanders, Belgium. PersID - III: Current State and State of the Art (IIIa) & User Requirements (IIIb) (Persistent Identifier: urn:nbn:nl:ui:13-9g4-i1s). PersID - IV: Prototype for a Meta Resolver System / Work on Standards (Persistent Identifier: urn:nbn:nl:ui:13-wt1-6n9). PersID - V: Sustainability (Persistent Identifier: urn:nbn:nl:ui:13-o4p-8py). Please note that there are also two broader reports on the project as a whole: PersID - I: Project Report and II: Communication. For further information please visit the website of the Persistent Identifier project: www.persid.org
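For context on the identifier syntax cited above: URN:NBN identifiers follow, roughly, the pattern urn:nbn:<country-code>[:<sub-namespace>]-<local-id> (RFC 8458). The following minimal parsing sketch in Python uses one of the report identifiers as input; the grouping of sub-namespaces is an assumption based on that pattern, not a full validator.

    # Sketch: decompose a URN:NBN into country code, sub-namespace and local id.
    # The regex is a simplification of RFC 8458, not a complete validator.
    import re

    NBN_RE = re.compile(r"^urn:nbn:([a-z]{2})((?::[0-9a-z]+)*)-(.+)$", re.IGNORECASE)

    def parse_nbn(urn: str) -> dict:
        m = NBN_RE.match(urn)
        if not m:
            raise ValueError(f"not a URN:NBN: {urn}")
        country, subns, local = m.groups()
        return {"country": country, "sub_namespace": subns.lstrip(":"), "local_id": local}

    print(parse_nbn("urn:nbn:nl:ui:13-9g4-i1s"))
    # -> {'country': 'nl', 'sub_namespace': 'ui:13', 'local_id': '9g4-i1s'}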
Abstract:
The workshop took place on 16-17 January in Utrecht, with seventy experts from eight European countries in attendance. The workshop was structured in six sessions: usage statistics, research paper metadata, exchanging information, author identification, the Open Archives Initiative, and eTheses. Following the workshop, the discussion groups were asked to continue their collaboration and to produce a report for circulation to all participants. The results can be downloaded below. The recommendations contained in the reports have been reviewed by the Knowledge Exchange partner organisations and formed the basis for new proposals and the next steps in Knowledge Exchange work with institutional repositories. Institutional Repository Workshop - next steps: during April and May 2007, Knowledge Exchange had expert reviewers from the partner organisations go through the workshop strand reports and make recommendations about the best way to move forward, set priorities, and find possibilities for furthering the institutional repository cause. The KE partner representatives reviewed the reviews and consulted with their partner organisation management to get an indication of support and funding for the latest ideas and proposals, as follows. Pragmatic interoperability: during a review meeting at the JISC offices in London on 31 May, the expert reviewers and the KE partner representatives agreed that 'pragmatic interoperability' is the primary area of interest. It was also agreed that the most relevant and beneficial choice for a Knowledge Exchange approach would be to aim for CRIS-OAR interoperability as a step towards integrated services. Within this context, interlinked joint projects could be undertaken by the partner organisations in the areas that most interested them. Interlinked projects: the proposed Knowledge Exchange activities involve interlinked joint projects on metadata, persistent author identifiers, and eTheses, which are intended to connect to and build on projects such as ISPI, Jisc NAMES and the Digital Author Identifier (DAI) developed by SURF. It is important to stress that the projects are not intended to overlap, but rather to supplement the DRIVER 2 (EU project) approaches. Focus on CRIS and OAR: it is believed that the focus on practical interoperability between Current Research Information Systems and Open Access Repository systems will be of genuine benefit to the research scientist, research administrator and librarian communities in the Knowledge Exchange countries, accommodating the specific needs of each group. Timing: June 2007, draft proposal written by KE Working Group members; July 2007, final proposal sent to the partner organisations by the KE Group; August 2007, decision by the Knowledge Exchange partner organisations.
Abstract:
Following the workshop on new developments in daily licensing practice in November 2011, fourteen representatives from national consortia (from Denmark, Germany, the Netherlands and the UK) and publishers (Elsevier, SAGE and Springer) met in Copenhagen on 9 March 2012 to discuss provisions in licences to accommodate new developments. The one-day workshop aimed to: present the background and ideas behind the provisions the KE Licensing Expert Group developed; introduce and explain the provisions the invited publishers currently use; reach agreement on the wording for long-term preservation, continuous access and course packs; give insight and more clarity about the use of open access provisions in licences; discuss a roadmap for inclusion of the provisions in the publishers' licences; and produce a report disseminating the outcome of the meeting. Participants of the workshop were: United Kingdom: Lorraine Estelle (Jisc Collections); Denmark: Lotte Eivor Jørgensen (DEFF), Lone Madsen (Southern University of Denmark), Anne Sandfær (DEFF/Knowledge Exchange); Germany: Hildegard Schaeffler (Bavarian State Library), Markus Brammer (TIB); The Netherlands: Wilma Mossink (SURF), Nol Verhagen (University of Amsterdam), Marc Dupuis (SURF/Knowledge Exchange); Publishers: Alicia Wise (Elsevier), Yvonne Campfens (Springer), Bettina Goerner (Springer), Leo Walford (Sage); Knowledge Exchange: Keith Russell. The main outcome of the workshop was that it would be valuable to have a standard set of clauses that could be used in negotiations; this would make concluding licences much easier and more efficient. The comments on the model provisions the Licensing Expert Group had drafted will be taken into account, and the provisions will be reformulated. Data and text mining is a new development, and demand for access to allow for it is growing. It would be easier if there were a simpler way to access materials so they could be more easily mined. However, there are still outstanding questions about how the authors of articles that have been mined can be properly attributed.
Abstract:
In June 2009 a study was completed that had been commissioned by Knowledge Exchange and written by Professor John Houghton, Victoria University, Australia. The report on the study was titled "Open Access – What are the economic benefits? A comparison of the United Kingdom, Netherlands and Denmark." It was based on the findings of studies in which John Houghton had modelled the costs and benefits of Open Access in three countries; these studies had been undertaken in the UK by JISC, in the Netherlands by SURF and in Denmark by DEFF. In the three national studies the costs and benefits of scholarly communication were compared on the basis of three different publication models. The modelling revealed that the greatest advantage would be offered by the Open Access model, in which the research institution or the party financing the research pays for publication and the article is then freely accessible. Adopting this model could lead to annual savings of around EUR 70 million in Denmark, EUR 133 million in the Netherlands and EUR 480 million in the UK. The report concludes that the advantages would not just be in the long term; in the transitional phase too, more open access to research results would have positive effects, and the benefits would also outweigh the costs.
Abstract:
Context: Mobile applications support a set of user-interaction features that are independent of the application logic. Rotating the device, scrolling, or zooming are examples of such features. Some bugs in mobile applications can be attributed to user-interaction features. Objective: This paper proposes and evaluates a bug analyzer based on user-interaction features that uses digital image processing to find bugs. Method: Our bug analyzer detects bugs by comparing the similarity between images taken before and after a user-interaction. SURF, an interest point detector and descriptor, is used to compare the images. To evaluate the bug analyzer, we conducted a case study with 15 randomly selected mobile applications. First, we identified user-interaction bugs by manually testing the applications. Images were captured before and after applying each user-interaction feature. Then, image pairs were processed with SURF to obtain interest points, from which a similarity percentage was computed to decide whether there was a bug. Results: We performed a total of 49 user-interaction feature tests. When manually testing the applications, 17 bugs were found, whereas when using image processing, 15 bugs were detected. Conclusions: 8 out of the 15 mobile applications tested had bugs associated with user-interaction features. Our bug analyzer based on image processing was able to detect 88% (15 out of 17) of the user-interaction bugs found with manual testing.
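A minimal sketch of the comparison step described in the Method above, assuming OpenCV's non-free SURF implementation; the similarity metric (share of 'before' keypoints with a good match in 'after') and the decision threshold are illustrative assumptions, not the study's actual values.

    # Sketch: similarity between before/after screenshots via SURF matching.
    import cv2

    def similarity(before_path: str, after_path: str) -> float:
        before = cv2.imread(before_path, cv2.IMREAD_GRAYSCALE)
        after = cv2.imread(after_path, cv2.IMREAD_GRAYSCALE)
        surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # non-free module
        kp_b, des_b = surf.detectAndCompute(before, None)
        kp_a, des_a = surf.detectAndCompute(after, None)
        matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_b, des_a, k=2)
        good = [m for m, n in matches if m.distance < 0.7 * n.distance]
        # Percentage of 'before' interest points that survive the interaction.
        return 100.0 * len(good) / max(len(kp_b), 1)

    # Hypothetical usage: flag a potential bug when too few features survive
    # a rotation; the file names and the 40% threshold are placeholders.
    if similarity("before_rotate.png", "after_rotate.png") < 40.0:
        print("possible user-interaction bug")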