969 results for handheld computer


Relevance: 20.00%

Abstract:

Distributed and collaborative data stream mining in a mobile computing environment is referred to as Pocket Data Mining (PDM). The large number of data streams to which smartphones can subscribe, or which they can sense directly, coupled with the increasing computational power of handheld devices, motivates the development of PDM as a decision-making system. This emerging area of study was shown to be feasible in an earlier study using the technological enablers of mobile software agents and stream mining techniques [1]. A typical PDM process starts with mobile agents roaming the network to discover relevant data streams and resources. Other (mobile) agents encapsulating stream mining techniques then visit the relevant nodes in the network in order to build evolving data mining models. Finally, a third type of mobile agent roams the network consulting the mining agents for a final collaborative decision, when required by one or more users. In this paper, we propose the use of distributed Hoeffding trees and Naive Bayes classifiers in the PDM framework over vertically partitioned data streams. Mobile policing, health monitoring and stock market analysis are among the possible applications of PDM. An extensive experimental study is reported, showing the effectiveness of collaborative data mining with the two classifiers.
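
To make the collaborative step concrete, here is a minimal, self-contained sketch of classification over vertically partitioned data, with each "mining agent" holding a classifier over its own feature subset and the agents resolving a prediction by majority vote. It uses batch Gaussian Naive Bayes as a stand-in for the paper's streaming Hoeffding trees and Naive Bayes; all names, partitions and data below are illustrative, not taken from the paper.

```python
# Sketch of PDM-style collaborative classification over vertically
# partitioned data: each "mining agent" sees only its own subset of
# features, and the agents vote on the final label.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))               # full feature space
y = (X[:, 0] + X[:, 3] > 0).astype(int)     # synthetic labels

# Vertical partitioning: each agent subscribes to two feature streams.
partitions = [[0, 1], [2, 3], [4, 5]]
agents = [GaussianNB().fit(X[:, p], y) for p in partitions]

def collaborative_predict(x):
    """Each agent classifies from its own features; majority vote wins."""
    votes = [agent.predict(x[p].reshape(1, -1))[0]
             for agent, p in zip(agents, partitions)]
    return int(np.round(np.mean(votes)))

x_new = rng.normal(size=6)
print(collaborative_predict(x_new))
```

In the PDM setting the three classifiers would live on different handheld devices and be consulted by a roaming decision agent rather than a local function call, but the combination logic is the same.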

Relevance: 20.00%

Abstract:

How can organizations use digital infrastructure to realise physical outcomes? The design and construction of London Heathrow Terminal 5 is analysed to build new theoretical understanding of visualisation and materialisation practices in the transition from digital design to physical realisation. In the project studied, an integrated software solution is introduced as an infrastructure for delivery. The analyses articulate the work done to maintain this digital infrastructure and to move designs beyond the closed world of the computer to a physical reality. In changing medium, engineers use heterogeneous trials to interrogate and address the limitations of an integrated digital model. The paper explains why such trials, which involve the reconciliation of digital and physical data through parallel and iterative forms of work, provide a robust practice for realising goals that have physical outcomes. It argues that this practice is temporally different from, and at times in conflict with, building a comprehensive dataset within the digital medium. The paper concludes by discussing the implications for organizations that use digital infrastructures in seeking to accomplish goals in digital and physical media.

Relevance: 20.00%

Abstract:

Organizations introduce acceptable use policies to deter employee computer misuse. Despite the controlling, monitoring and other forms of intervention employed, some employees misuse organizational computers to carry out personal work such as sending emails, surfing the Internet, chatting and playing games. These activities not only waste employees' productive time but also put the organization at risk. A questionnaire was administered to a random sample of employees selected from large and medium-scale software development organizations, measuring levels of work computer misuse and the factors that influence such behavior. The presence of guidelines showed no evidence of a significant effect on the level of employee computer misuse. Not having access to the Internet/email away from work, and organizational settings, were identified as the most significant influences on work computer misuse.

Relevance: 20.00%

Abstract:

We are sympathetic with Bentley et al.'s attempt to encompass the wisdom of crowds in a generative model, but posit that success at using Big Data will require more sensitive measurements and more, and more varied, sources of information, as well as building on the indirect information available through technology, from ancillary technical features to data from brain-computer interfaces.

Relevance: 20.00%

Abstract:

Theorem-proving is a one-player game. The history of computer programs as the players goes back to 1956 and the ‘LT’ LOGIC THEORY MACHINE of Newell, Shaw and Simon. In game-playing terms, the ‘initial position’ is the core set of axioms chosen for the particular logic, and the ‘moves’ are the rules of inference. Now the Univalent Foundations Program at IAS Princeton and the resulting ‘HoTT’ book on Homotopy Type Theory have demonstrated the success of a new kind of experimental mathematics using computer theorem proving.
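
As a small illustration of the game analogy, a modern proof assistant makes the moves explicit: the goal is the position to be won and each tactic application is a move licensed by the rules of inference. The toy Lean 4 proof below is our own illustration, not an example from the HoTT book.

```lean
-- Toy example of theorem proving as a one-player game (Lean 4):
-- the goal p ∧ q → q ∧ p is the position; each tactic is a move.
theorem and_swap (p q : Prop) : p ∧ q → q ∧ p := by
  intro h            -- move 1: assume p ∧ q
  exact ⟨h.2, h.1⟩   -- move 2: reassemble the conjunction, swapped
```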

Relevance: 20.00%

Abstract:

Point-and-click interactions using a mouse are an integral part of computer use for current desktop systems. Compared with younger users, though, older adults experience greater difficulty performing cursor positioning tasks, and this can limit their ability to use a computer easily and effectively. Target expansion is a technique for improving pointing performance in which the target dynamically grows as the cursor approaches. This has the advantage that targets conserve screen real estate in their unexpanded state, yet still provide the benefits of a larger area to click on. This paper presents two studies of target expansion with older and younger participants, involving multidirectional point-select tasks with a computer mouse. Study 1 compares static versus expanding targets, and Study 2 compares static targets with three alternative techniques for expansion. Results show that expansion can improve times by up to 14% and reduce error rates by up to 50%. Additionally, expanding targets are beneficial even when the expansion happens late in the movement, i.e. after the cursor has reached the expanded target area or even after it has reached the original target area. Participants' subjective feedback on target expansion is generally favorable, lending further support to the technique.
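
As a rough illustration of how such a technique can be driven, the sketch below interpolates a target's clickable radius from the cursor's distance. The constants and the linear growth schedule are our own assumptions, not the expansion designs evaluated in the paper.

```python
# Sketch of dynamic target expansion (all values illustrative):
# the clickable radius grows as the cursor approaches, so the target
# stays small on screen yet offers a larger effective area to click.
import math

BASE_RADIUS = 16.0        # unexpanded target radius, px
EXPANDED_RADIUS = 32.0    # fully expanded radius, px
ACTIVATION_DIST = 200.0   # distance at which expansion begins, px

def target_radius(cursor, target):
    """Linearly interpolate the radius from the cursor-target distance."""
    d = math.dist(cursor, target)
    if d >= ACTIVATION_DIST:
        return BASE_RADIUS
    t = 1.0 - d / ACTIVATION_DIST      # 0 far away -> 1 on the target
    return BASE_RADIUS + t * (EXPANDED_RADIUS - BASE_RADIUS)

print(target_radius((100, 100), (120, 115)))  # near: expanded
print(target_radius((500, 500), (120, 115)))  # far: base size
```

The paper's late-expansion result suggests the schedule matters less than one might expect, since even expansion triggered after the cursor arrives still helps.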

Relevance: 20.00%

Abstract:

The 20th World Computer Chess Championship took place in Yokohama, Japan, in August 2013. It was narrowly won by JUNIOR ahead of JONNY, with HIARCS, PANDIX, SHREDDER and MERLIN occupying the remaining positions. References to the detailed chess biographies of the engines and engine authors are given in the Chessprogramming Wiki. The games, occasionally annotated, are available here.

Relevance: 20.00%

Abstract:

We present a method of simulating both the avalanche and surge components of pyroclastic flows generated by lava collapsing from a growing Pelean dome. This is used successfully to model the pyroclastic flows generated on 12 May 1996 by the Soufriere Hills volcano, Montserrat. In simulating the avalanche component we use a simple three-fold parameterisation of flow acceleration, for which we choose values using an inverse method. The surge component is simulated by a 1D hydraulic balance of clast sedimentation and air entrainment away from the avalanche source. We show how multiple simulations based on uncertainty in the starting conditions and parameters, specifically location and size (mass flux), could be used to map hazard zones.
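
The surge balance can be pictured with a toy forward integration in which suspended mass is lost to sedimentation while entrained air dilutes the current with distance from the source. The coefficients and the runout criterion below are invented for illustration and are not the paper's calibrated values.

```python
# Toy 1D surge balance: suspended mass is deposited while entrained
# air dilutes the current away from the avalanche source.
# All coefficients are assumed, not the paper's calibrated values.
import numpy as np

DX = 10.0        # spatial step, m
N = 200          # number of steps away from the source
K_SED = 0.004    # sedimentation loss per metre (assumed)
K_ENT = 0.002    # air entrainment dilution per metre (assumed)

mass = np.empty(N)      # suspended particle mass per unit width
density = np.empty(N)   # bulk density of the surge, kg/m^3
mass[0], density[0] = 1000.0, 10.0

for i in range(1, N):
    mass[i] = mass[i - 1] * (1.0 - K_SED * DX)        # deposit clasts
    density[i] = density[i - 1] * (1.0 - K_ENT * DX)  # dilute with air

# Crude runout proxy: where the surge density approaches that of air.
runout = np.argmax(density < 1.2) * DX
print(f"approximate runout: {runout:.0f} m")
```

Repeating such a run over sampled starting locations and mass fluxes is the Monte Carlo-style idea behind the hazard-zone mapping the abstract describes.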

Relevance: 20.00%

Abstract:

We present an intuitive geometric approach for analysing the structure and fragility of T1-weighted structural MRI scans of human brains. Apart from computing characteristics such as the surface area and volume of brain regions that consist of highly active voxels, we also employ network theory to test how close these regions are to breaking apart. This analysis is used in an attempt to automatically classify subjects into three categories: Alzheimer's disease, mild cognitive impairment and healthy controls, for the CADDementia Challenge.
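
The fragility idea can be sketched as a graph computation: treat highly active voxels as nodes, connect face-adjacent neighbours, and ask how close the resulting region is to fragmenting. The intensity threshold and the connected-component measure below are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch of graph-based "fragility" of an active-voxel region:
# nodes are highly active voxels, edges join 6-neighbours, and the
# share of voxels in the largest connected component is a simple
# proxy for how close the region is to breaking apart.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
volume = rng.random((20, 20, 20))   # stand-in for voxel intensities
active = volume > 0.7               # "highly active" voxels (assumed cut)

G = nx.Graph()
coords = list(zip(*np.nonzero(active)))
G.add_nodes_from(coords)
for (x, y, z) in coords:
    for dx, dy, dz in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]:
        nb = (x + dx, y + dy, z + dz)
        if nb in G:                 # only link voxels that are active
            G.add_edge((x, y, z), nb)

largest = max(nx.connected_components(G), key=len)
print(len(largest) / G.number_of_nodes())
```

A region whose largest component holds nearly all of its voxels is far from fragmenting; values well below 1 indicate the kind of structural fragility the classifier could use as a feature.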