767 results for sensor interfaces


Relevance:

20.00%

Publisher:

Abstract:

The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. The skiing movement of 10 junior to world-cup athletes was measured under four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of error) of the system were below 6 ms for cycle duration and ski thrust duration, and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes in conditions and was accurate enough to detect the significant differences reported in previous studies. Since the capture volume is not limited and the setup is simple, the system is well suited for outdoor measurements on snow.
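
The abstract does not detail the computation itself, but the named parameters follow directly from detected cycle events. Below is a minimal, hypothetical Python sketch (all function and variable names are assumptions, not taken from the paper) of how cycle duration, speed and length could be derived once cycle start times and the distance covered per cycle are available.

# Hypothetical sketch: derive spatio-temporal cycle parameters from
# detected cycle start times and per-cycle displacement. Names and the
# event-detection step are assumptions, not taken from the paper.

def cycle_parameters(cycle_start_times, cycle_displacements):
    """cycle_start_times: timestamps (s) of consecutive cycle starts.
    cycle_displacements: distance (m) covered during each cycle."""
    params = []
    for i in range(len(cycle_start_times) - 1):
        duration = cycle_start_times[i + 1] - cycle_start_times[i]  # s
        length = cycle_displacements[i]                             # m
        speed = length / duration                                   # m/s
        params.append({"duration": duration, "length": length, "speed": speed})
    return params

# Example: cycles detected at 0.0, 1.4, 2.8 and 4.2 s, each covering ~4 m
print(cycle_parameters([0.0, 1.4, 2.8, 4.2], [4.2, 4.3, 4.1]))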

Relevance:

20.00%

Publisher:

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web; hence, web users who rely only on search engines are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriad databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterizing the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web carried out so far are predominantly based on the study of deep web sites in English. One can then expect that findings from these surveys may be biased, especially owing to the steady increase in non-English web content. Surveying national segments of the deep Web is therefore of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches assume that search interfaces to the web databases of interest have already been discovered and are known to the query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. The I-Crawler is intentionally designed to be used in deep web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web proposed so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so because the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. The automation of querying and retrieving data behind search interfaces is therefore desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implemented a prototype system for querying web databases and describe its architecture and component design.
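
The thesis describes a data model for representing search interfaces, but the concrete model is not given in the abstract. The following is a minimal, hypothetical Python sketch of how such an interface description and a form query might be represented; all class and field names are illustrative assumptions.

# Hypothetical sketch of a search-interface data model and a form query.
# The actual model used in the thesis is not described in the abstract;
# all names here are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class FormField:
    name: str                      # HTML field name submitted to the server
    label: str                     # human-readable label extracted from the page
    kind: str = "text"             # e.g. "text", "select", "checkbox"
    options: list[str] = field(default_factory=list)  # for select fields

@dataclass
class SearchInterface:
    url: str                       # form action URL
    method: str                    # "GET" or "POST"
    fields: list[FormField]

@dataclass
class FormQuery:
    interface: SearchInterface
    values: dict[str, str]         # field name -> value to submit

# Example: a query against a hypothetical book-search form
iface = SearchInterface(
    url="http://example.org/search",
    method="GET",
    fields=[FormField("title", "Book title"),
            FormField("lang", "Language", "select", ["en", "ru", "fi"])],
)
query = FormQuery(iface, {"title": "deep web", "lang": "en"})
print(query)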

Relevance:

20.00%

Publisher:

Abstract:

This project falls within the field of computer vision, specifically the use of depth data obtained through an infrared light emitter and sensor. The main purpose of the project is to show how to adapt these technologies, which are within the reach of any individual, so that a user practising a specific sports activity receives continuous visual feedback on the incorrect movements and gestures being performed, based on previously established parameters. The goal is to carry out a constant, real-time reading of a person practising a selection of static sports activities using a Kinect sensor. Using the data obtained from the Kinect sensor and the skeleton tracking libraries provided by Microsoft, the postural data obtained for each type of sport must be interpreted, and the errors being committed must be indicated visually and intuitively in real time, so that it is clearly visible which part of the body is performing an incorrect movement and can be corrected quickly. The development environment used to build this application is Microsoft Visual Studio 2010, and the programming language used within it is C#.
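
The abstract does not specify how postural errors are detected; a common approach with skeleton-tracking data is to compare joint angles against per-sport thresholds. The following is a minimal, hypothetical Python sketch of that idea (the original application was written in C# with the Microsoft Kinect SDK; the joint names, thresholds and rule below are illustrative assumptions).

# Hypothetical posture check on skeleton-tracking data. Joint names,
# thresholds and the rule itself are illustrative; the original project
# used C# and the Microsoft Kinect skeleton-tracking libraries.

import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees), formed by points a-b-c, each (x, y, z)."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def check_squat(skeleton, min_knee_angle=80.0, max_knee_angle=100.0):
    """Flag the knee joint if its angle is outside the allowed range."""
    angle = joint_angle(skeleton["hip_right"],
                        skeleton["knee_right"],
                        skeleton["ankle_right"])
    ok = min_knee_angle <= angle <= max_knee_angle
    return {"joint": "knee_right", "angle": angle, "correct": ok}

# Example frame with made-up 3D joint coordinates (metres)
frame = {"hip_right": (0.2, 1.0, 2.0),
         "knee_right": (0.2, 0.6, 2.1),
         "ankle_right": (0.2, 0.1, 2.0)}
print(check_squat(frame))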

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, Wireless Sensor Networks (WSNs) are already a very important source of data about the environment. Thus, they are key to the creation of Cyber-Physical Systems (CPS). Given the popularity of P2P middlewares as a means to efficiently process information and distribute services, being able to integrate them with WSNs is an interesting proposal. JXTA is a widely used P2P middleware that allows peers to easily exchange information, relying heavily on its main architectural highlight: the capability to organize peers with common interests into peer groups. However, current approaches to integrating WSNs into a JXTA network seldom take advantage of peer groups. For this reason, in this paper we present jxSensor, an integration layer for sensor motes which facilitates the deployment of CPSs under this architecture. This integration has been done taking into account JXTA's idiosyncrasies and proposing novel ideas, such as the Virtual Peer, a group of sensors that acts as a single entity within the peer group context.
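
The Virtual Peer is described only at a conceptual level in the abstract. As a rough illustration of the idea — several motes presented to the peer group as one entity — here is a hypothetical Python sketch; the class and method names are assumptions and do not correspond to the actual jxSensor or JXTA APIs.

# Hypothetical illustration of the "Virtual Peer" idea: a set of sensor
# motes exposed to the peer group as a single entity. Names are
# assumptions and do not reflect the real jxSensor or JXTA APIs.

class Mote:
    def __init__(self, mote_id):
        self.mote_id = mote_id
        self.last_reading = None   # filled in by the radio/serial layer

class VirtualPeer:
    """Aggregates several motes and answers queries as one peer-group member."""
    def __init__(self, name, motes):
        self.name = name
        self.motes = motes

    def handle_temperature_query(self):
        # Reply on behalf of the whole group, here with the mean reading.
        values = [m.last_reading for m in self.motes if m.last_reading is not None]
        return sum(values) / len(values) if values else None

motes = [Mote("m1"), Mote("m2")]
motes[0].last_reading, motes[1].last_reading = 21.5, 22.1
vp = VirtualPeer("greenhouse-sensors", motes)
print(vp.handle_temperature_query())  # a single reply for the whole mote group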

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes an experiment to be performed in both instrumental analysis and experimental physical chemistry courses in order to open options for developing challenging basic research activities. The experimental procedures and the results obtained in the preparation of lead dioxide electrodeposited onto graphite, and its evaluation as a potentiometric sensor for H3O+ and Pb2+ ions, are presented. The data obtained in acid-base titrations were compared with those of a traditional combination glass electrode under the same conditions. Although a linear sub-Nernstian response to free hydrogen ions was observed for the electrodeposited PbO2 electrode, good agreement was obtained between the two electrodes. Working as a lead(II) sensing electrode, the PbO2 showed linear sub-Nernstian behavior at total Pb2+ concentrations ranging from 3.5 × 10⁻⁴ to 3.0 × 10⁻² mol/L in nitrate media. For the PbO2/Pb(II) redox couple, the operational slope converges to the theoretical one as the acidity of the working solution increases.
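
The theoretical slope referred to in the abstract follows from the Nernst equation for the PbO2/Pb(II) couple. The block below states the textbook relation at 25 °C (it is not reproduced from the paper itself) and shows why the slope with respect to log[Pb2+] approaches 29.6 mV per decade once the pH term is held constant.

% Nernst equation for the PbO2/Pb(II) couple at 25 °C (textbook form;
% not reproduced from the paper itself).
\begin{align*}
 \mathrm{PbO_2} + 4\,\mathrm{H^+} + 2\,e^- &\rightleftharpoons \mathrm{Pb^{2+}} + 2\,\mathrm{H_2O} \\
 E &= E^\circ + \frac{0.0592}{2}\log\frac{[\mathrm{H^+}]^4}{[\mathrm{Pb^{2+}}]}
    = E^\circ - 0.1184\,\mathrm{pH} - 0.0296\,\log[\mathrm{Pb^{2+}}]
\end{align*}
% At constant pH the theoretical slope with respect to log[Pb2+] is
% -29.6 mV per decade; a smaller experimental slope is the sub-Nernstian
% behaviour reported above.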

Relevance:

20.00%

Publisher:

Abstract:

The neuronal calcium sensor proteins GCAPs (guanylate cyclase activating proteins) switch between Ca2+-free and Ca2+-bound conformational states and confer calcium sensitivity to guanylate cyclase in retinal photoreceptor cells. They play a fundamental role in light adaptation by coupling the rate of cGMP synthesis to the intracellular concentration of calcium. Mutations in GCAPs lead to blindness. The importance of functional EF-hands in GCAP1 for photoreceptor cell integrity has been well established: mutations in GCAP1 that diminish its Ca2+ binding affinity cause cell damage through unabated cGMP synthesis and accumulation of toxic levels of free cGMP and Ca2+. Here we investigate the relevance of the functional EF-hands of GCAP2 for photoreceptor cell integrity. By characterizing transgenic mice expressing a mutant form of GCAP2 with all EF-hands inactivated (EF(-)GCAP2), we show that GCAP2 locked in its Ca2+-free conformation leads to a rapid retinal degeneration that is not due to unabated cGMP synthesis. We unveil that, when locked in its Ca2+-free conformation in vivo, GCAP2 is phosphorylated at Ser201, resulting in phospho-dependent binding to the chaperone 14-3-3 and retention at the inner segment and proximal cell compartments. Accumulation of phosphorylated EF(-)GCAP2 at the inner segment results in severe toxicity. We show that in wild-type mice under physiological conditions, 50% of GCAP2 is phosphorylated, correlating with the 50% of the protein that is retained at the inner segment. Raising mice under constant light exposure, however, drastically increases the retention of GCAP2 in its Ca2+-free form at the inner segment. This study identifies a new mechanism governing GCAP2 subcellular distribution in vivo that is closely related to disease. It also identifies a pathway by which a sustained reduction in intracellular free Ca2+ could result in photoreceptor damage, relevant for light damage and for those genetic disorders resulting in 'equivalent-light' scenarios.

Relevance:

20.00%

Publisher:

Abstract:

Simultaneous localization and mapping (SLAM) is a very important problem in mobile robotics. Many solutions have been proposed by different scientists during the last two decades; nevertheless, few studies have considered the use of multiple sensors simultaneously. The solution presented here is based on combining several data sources with the aid of an Extended Kalman Filter (EKF). Two approaches are proposed. The first is to run the ordinary EKF SLAM algorithm for each data source separately in parallel and then, at the end of each step, fuse the results into one solution. The second is to use multiple data sources simultaneously in a single filter. A comparison of the computational complexity of the two methods is also presented: the first method is almost four times faster than the second.
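
The abstract does not give the fusion step for the first, parallel-filter approach. A common way to combine two independent Gaussian estimates is covariance-weighted averaging, sketched below with NumPy; the exact fusion rule used in the thesis is not stated, so this is only an illustrative assumption.

# Covariance-weighted fusion of two state estimates, a common way to
# merge the outputs of parallel EKFs. Illustrative sketch; the thesis's
# exact fusion rule is not given in the abstract.

import numpy as np

def fuse_estimates(x1, P1, x2, P2):
    """Fuse two Gaussian estimates (mean, covariance) of the same state."""
    P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1_inv + P2_inv)           # fused covariance
    x = P @ (P1_inv @ x1 + P2_inv @ x2)          # fused mean
    return x, P

# Example: robot pose (x, y, heading) estimated by two sensor-specific EKFs
x_laser = np.array([1.00, 2.00, 0.10]); P_laser = np.diag([0.04, 0.04, 0.01])
x_sonar = np.array([1.10, 1.95, 0.12]); P_sonar = np.diag([0.09, 0.09, 0.02])
x_fused, P_fused = fuse_estimates(x_laser, P_laser, x_sonar, P_sonar)
print(x_fused)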

Relevance:

20.00%

Publisher:

Abstract:

In the thesis, the principle of operation of eddy current position sensors and the main precautions that must be taken into account during the sensor design process are explained. A method for the automated measurement of the electrical characteristics of eddy current position sensors is suggested. A prototype of the eddy current position sensor and its electrical characteristics are investigated, and the results obtained by means of the automated measuring system are explained.

Relevance:

20.00%

Publisher:

Abstract:

The use of the quartz crystal microbalance (QCM) technique, electrochemical impedance spectroscopy (EIS) and surface plasmon resonance (SPR) for characterizing thin films and monitoring interfaces is presented. The theoretical aspects of QCM, EIS and SPR are introduced and the main application areas are outlined. Future prospects of the combined application of QCM, EIS and SPR methods in studies of interfacial processes at surfaces are also discussed.
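
As an example of the theoretical background referred to, the Sauerbrey equation — the standard relation used in QCM work to convert a frequency shift into an areal mass change — is given below. This is a textbook expression and is not quoted from the paper.

% Sauerbrey equation (textbook form), relating the QCM frequency shift
% to the mass deposited per unit area; not quoted from the paper.
\[
 \Delta f = -\frac{2 f_0^{2}}{A\sqrt{\rho_q \mu_q}}\,\Delta m
\]
% f_0: fundamental resonance frequency; A: active crystal area;
% \rho_q, \mu_q: density and shear modulus of quartz; \Delta m: deposited mass.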

Relevance:

20.00%

Publisher:

Abstract:

Nanotechnology can be viewed as a powerful tool, capable of shaping the chemistry of atoms and molecules and converting them into exciting nanosized and nanostructured materials, devices and machines. However, pursuing this task requires an exceptional ability to deal with the complex inter- and multidisciplinary approaches imposed by the nanoscale. A new research organization framework, capable of promoting cooperative interactions among many complementary areas, including industry, is needed. In this sense, an interesting example is the nanotechnology networks and millennium institutes recently created in Brazil. The highlights and weaknesses of such cooperative research networks are discussed, in addition to relevant nanotechnology themes focusing on the special needs and resources of developing nations.

Relevance:

20.00%

Publisher:

Abstract:

Brazilian chemical industries face several problems regarding Research, Development and Innovation (RDI). The present paper shows that simple cooperation between chemical industries and university laboratories can be a way to overcome some of the present difficulties. The work carried out at LABOCAT has several industrial interfaces. It involves, among other areas of RDI, the development of anti-HIV-protease (and other virus-related protease) drugs, the establishment of new industrial chemical processes and the implementation of industrial (biodiesel and related) plants. A model based on the present so-called RHAE programme is proposed in which, in parallel with the fellowship awards of this programme, the financial participation of Brazilian agencies would cover process development.

Relevance:

20.00%

Publisher:

Abstract:

As chemistry is an experimental science, the chemical industry requires technically trained people at all staff levels, from directors and managers to operators. This education, grounded in chemistry and chemical engineering, is the foundation of the innovation process and of motivation. The paper discusses this and the role of public policies in improving R&D and innovation in the Brazilian chemical industry.

Relevance:

20.00%

Publisher:

Abstract:

Efficient problem solving in cellular networks is important when enhancing network performance and reliability. Analysis of calls and packet-switched sessions at the protocol level between network elements is an important part of this process, as it can provide very detailed information about error situations which would otherwise be difficult to recognise. In this thesis we seek solutions for monitoring GPRS/EDGE sessions on two specific interfaces simultaneously, in such a manner that all information important to the users is provided in an easily understandable form. The thesis focuses on the Abis and AGPRS interfaces of the GSM radio network and introduces a solution for managing the correlation between these interfaces by using signalling messages and common parameters as linking elements. Finally, the thesis presents an implementation of a GPRS/EDGE session monitoring application for the Abis and AGPRS interfaces and evaluates its benefits to the end users. The application is implemented as part of a Windows-based 3G/GSM network analyser.
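
The abstract describes correlating messages captured on two interfaces via common parameters. The sketch below illustrates the general idea of joining two message streams on a shared identifier within a time window; the field names and the choice of correlation key are assumptions for illustration, not details taken from the thesis.

# Illustrative sketch of correlating messages captured on two interfaces
# by a shared identifier and a time window. Field names and the choice
# of correlation key are assumptions, not details from the thesis.

def correlate(abis_msgs, agprs_msgs, key="session_id", max_skew=0.5):
    """Pair messages from the two captures that share `key` and are
    observed within `max_skew` seconds of each other."""
    pairs = []
    for a in abis_msgs:
        for b in agprs_msgs:
            if a[key] == b[key] and abs(a["timestamp"] - b["timestamp"]) <= max_skew:
                pairs.append((a, b))
    return pairs

abis = [{"session_id": 42, "timestamp": 10.00, "msg": "CHANNEL ACTIVATION"}]
agprs = [{"session_id": 42, "timestamp": 10.21, "msg": "UL UNITDATA"}]
print(correlate(abis, agprs))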

Relevance:

20.00%

Publisher:

Abstract:

Major challenges, ranging from basic neuroscience studies to the development of optimal peripherals, mental gamepads and more efficient brain-signal processing techniques, must be tackled before brain-computer interfaces can mature into an established communication medium for VR applications.

Relevance:

20.00%

Publisher:

Abstract:

We report the use of an optical fiber sensor to measure the soybean oil concentration in samples obtained from mixtures of pure biodiesel and commercial soybean oil. The operation of the device is based on the sensitivity of the long-period grating to the refractive index of the surrounding medium, which leads to measurable modifications in the grating transmission spectrum. The proposed analysis method results in oil concentration errors of 0.4% and 2.6% for pure biodiesel and commercial soybean oil, respectively. Total glycerol, dynamic viscosity, density and hydrogen nuclear magnetic resonance spectroscopy measurements were also employed to validate the proposed method.
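
The abstract does not detail the analysis method; one common approach with long-period gratings is to calibrate a spectral feature (e.g. the resonance wavelength shift) against known concentrations and invert the fit for unknown samples. The Python sketch below shows that generic calibration idea; the linear model and the numbers are purely illustrative assumptions, not data from the paper.

# Generic calibration sketch: fit a spectral feature (e.g. LPG resonance
# wavelength shift) against known oil concentrations, then invert the fit
# for an unknown sample. The linear model and the numbers are illustrative
# assumptions, not data from the paper.

import numpy as np

# Calibration samples: known soybean-oil concentration (%) vs. measured
# resonance wavelength shift (nm) -- made-up values.
concentration = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
wavelength_shift = np.array([0.00, 0.42, 0.81, 1.23, 1.60])

# Least-squares linear fit: shift = a * concentration + b
a, b = np.polyfit(concentration, wavelength_shift, 1)

def estimate_concentration(shift_nm):
    """Invert the calibration line to estimate oil concentration (%)."""
    return (shift_nm - b) / a

print(estimate_concentration(0.60))  # unknown sample with a 0.60 nm shift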