922 results for Computer arithmetic and logic units


Relevance:

100.00%

Publisher:

Abstract:

The mental logic theory does not accept the disjunction introduction rule of standard propositional calculus as a natural schema of the human mind. The problem that I want to show in this paper, however, is that the theory does admit another, much more complex schema in which that rule must be applied as a previous step. I therefore argue that this is an important problem that the mental logic theory needs to solve, and claim that a rival theory, the mental models theory, does not face these difficulties.
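For reference, the disjunction introduction rule at issue is the standard rule of propositional calculus that licenses weakening a premise into a disjunction; a minimal natural-deduction statement is shown below (the more complex schema that the abstract says presupposes it is not specified there):

\[
  \frac{A}{A \lor B} \quad (\lor\text{-introduction})
\]

An example instance: from "2 is even" one may infer "2 is even or 2 is odd".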

Relevance:

100.00%

Publisher:

Abstract:

International audience

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, the new generation of computers provides the performance needed to build computationally expensive computer vision applications for mobile robotics. Building a map of the environment is a common task for a robot and is essential to allow robots to move through those environments. Traditionally, mobile robots have combined several sensors based on different technologies: lasers, sonars and contact sensors have typically been present in any mobile robotic architecture. Color cameras, however, are an important sensor because we want robots to use the same information that humans use to perceive and move through different environments. Color cameras are cheap and flexible, but a lot of work needs to be done to give robots enough visual understanding of the scenes. Computer vision algorithms are computationally complex, but nowadays robots have access to different and powerful architectures that can be used for mobile robotics purposes. The advent of low-cost RGB-D sensors such as the Microsoft Kinect, which provide 3D colored point clouds at high frame rates, has made computer vision even more relevant in the mobile robotics field. The combination of visual and 3D data allows systems to use both computer vision and 3D processing and therefore to be aware of more details of the surrounding environment.

The research described in this thesis was motivated by the need for scene mapping. Being aware of the surrounding environment is a key feature in many mobile robotics applications, from simple robotic navigation to complex surveillance applications. In addition, the acquisition of a 3D model of a scene is useful in many areas, such as video-game scene modeling, where well-known places are reconstructed and added to game systems, or advertising, where once the 3D model of a room is obtained the system can add furniture pieces using augmented reality techniques.

In this thesis we perform an experimental study of state-of-the-art registration methods to find which one best fits our scene mapping purposes. Different methods are tested and analyzed on scenes with different distributions of visual and geometric appearance. In addition, this thesis proposes two methods for 3D data compression and representation of 3D maps. Our 3D representation proposal is based on the Growing Neural Gas (GNG) method. This self-organizing map (SOM) has been successfully used for clustering, pattern recognition and topology representation of various kinds of data. Until now, self-organizing maps have been computed primarily offline, and their application to 3D data has mainly focused on noise-free models without considering time constraints. Self-organizing neural models have the ability to provide a good representation of the input space. In particular, the Growing Neural Gas (GNG) is a suitable model because of its flexibility, rapid adaptation and excellent quality of representation. However, this type of learning is time consuming, especially for high-dimensional input data. Since real applications often work under time constraints, it is necessary to adapt the learning process so that it completes within a predefined time. This thesis proposes a hardware implementation that leverages the computing power of modern GPUs, taking advantage of the paradigm known as General-Purpose Computing on Graphics Processing Units (GPGPU).

Our proposed geometric 3D compression method seeks to reduce the 3D information by using plane detection as the basic structure for compressing the data. This is because our target environments are man-made and therefore contain many points that belong to planar surfaces. The proposed method achieves good compression results in those man-made scenarios. The detected and compressed planes can also be used in other applications, such as surface reconstruction or plane-based registration algorithms. Finally, we have also demonstrated the capability of GPU technologies by obtaining a high-performance implementation of a common CAD/CAM technique called virtual digitizing.
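As a concrete illustration of the plane-detection idea behind the geometric compression, the following is a minimal sketch assuming a NumPy (N, 3) point cloud; the thesis's actual GPU/GNG implementation is not reproduced here. A dominant plane is found by RANSAC and its inliers are stored as plane parameters plus 2D in-plane coordinates.

import numpy as np

def fit_plane_ransac(points, n_iters=200, threshold=0.01, seed=None):
    """Fit a dominant plane to an (N, 3) point cloud with RANSAC.
    Returns (normal, d, inlier_mask) for the plane n . x + d = 0."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    n = len(points)
    for _ in range(n_iters):
        idx = rng.choice(n, size=3, replace=False)
        p0, p1, p2 = points[idx]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                      # degenerate (collinear) sample
            continue
        normal = normal / norm
        d = -normal.dot(p0)
        inliers = np.abs(points @ normal + d) < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    normal, d = best_model
    return normal, d, best_inliers

def compress_plane(points, normal, d, inlier_mask):
    """Represent the plane inliers by the plane model plus 2D in-plane
    coordinates, discarding the (noisy) off-plane residual."""
    u = np.cross(normal, [0.0, 0.0, 1.0])     # build an orthonormal in-plane basis
    if np.linalg.norm(u) < 1e-6:
        u = np.cross(normal, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    origin = -d * normal                      # a point lying on the plane
    local = (points[inlier_mask] - origin) @ np.stack([u, v], axis=1)
    return {"normal": normal, "d": d, "coords2d": local.astype(np.float32)}

# Usage: detect and compress the dominant plane of a synthetic cloud.
cloud = np.random.rand(5000, 3); cloud[:4000, 2] = 0.5   # 4000 points on z = 0.5
normal, d, mask = fit_plane_ransac(cloud, seed=0)
compact = compress_plane(cloud, normal, d, mask)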

Relevance:

100.00%

Publisher:

Abstract:

Physiological signals, which are controlled by the autonomic nervous system (ANS), could be used to detect the affective state of computer users and therefore find applications in medicine and engineering. The Pupil Diameter (PD) seems to provide a strong indication of the affective state, as found by previous research, but it has not yet been investigated fully. In this study, new approaches based on monitoring and processing the PD signal for off-line and on-line affective assessment ("relaxation" vs. "stress") are proposed. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw PD signal. Then three features (PDmean, PDmax and PDWalsh) are extracted from the preprocessed PD signal for affective state classification. In order to select more relevant and reliable physiological data for further analysis, two types of data selection methods are applied, based on the paired t-test and on subject self-evaluation, respectively. In addition, five different kinds of classifiers are implemented on the selected data, achieving average accuracies of up to 86.43% and 87.20%, respectively. Finally, the receiver operating characteristic (ROC) curve is utilized to investigate the discriminating potential of each individual feature by evaluating the area under the ROC curve, which reaches values above 0.90. For the on-line affective assessment, a hard threshold is implemented first in order to remove the eye blinks from the PD signal, and then a moving average window is utilized to obtain the representative value PDr for every one-second time interval of PD. There are three main steps in the on-line affective assessment algorithm: preparation, feature-based decision voting and affective determination. The final results show that the accuracies are 72.30% and 73.55% for the data subsets chosen using the two data selection methods (paired t-test and subject self-evaluation), respectively. In order to further analyze the efficiency of affective recognition through the PD signal, the Galvanic Skin Response (GSR) was also monitored and processed. The highest affective assessment classification rate obtained from GSR processing is only 63.57% (based on the off-line processing algorithm). The overall results confirm that the PD signal should be considered one of the most powerful physiological signals for future automated real-time affective recognition systems, especially for detecting the "relaxation" vs. "stress" states.
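A minimal sketch of the on-line preprocessing steps described above, under stated assumptions: the blink threshold and sampling rate are illustrative, the wavelet/Kalman preprocessing is not reproduced, and PDWalsh is omitted because its exact definition is not given in the abstract.

import numpy as np

def remove_blinks(pd_signal, min_valid=2.0):
    """Hard-threshold blink removal: samples below min_valid (mm, illustrative)
    are treated as blinks and replaced by the previous valid sample."""
    cleaned = pd_signal.copy()
    for i in range(1, len(cleaned)):
        if cleaned[i] < min_valid:
            cleaned[i] = cleaned[i - 1]
    return cleaned

def representative_per_second(pd_signal, fs, window_s=1.0):
    """Moving-average representative value PDr for every one-second interval."""
    win = int(fs * window_s)
    n_windows = len(pd_signal) // win
    return np.array([pd_signal[i * win:(i + 1) * win].mean() for i in range(n_windows)])

def segment_features(pd_segment):
    """Two of the features named in the abstract; PDWalsh is not reproduced here."""
    return {"PDmean": float(pd_segment.mean()), "PDmax": float(pd_segment.max())}

# Usage on a synthetic 30 s signal sampled at 60 Hz.
raw = 3.5 + 0.2 * np.random.randn(1800); raw[300:310] = 0.0   # simulated blink
pdr = representative_per_second(remove_blinks(raw), fs=60)
print(segment_features(pdr))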

Relevance:

100.00%

Publisher:

Abstract:

The application of modern ICT technologies is radically changing many fields, pushing toward more open and dynamic value chains and fostering the cooperation and integration of many connected partners, sensors, and devices. As a valuable example, the emerging Smart Tourism field derives from the application of ICT to Tourism so as to create richer and more integrated experiences, making them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of those sources exacerbate the complexity of developing integration solutions, with consequently high effort and costs for the partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a serverless platform allowing fast prototyping of the business logic, lowering the barrier of entry and development costs for newcomers, (zero) fine-grained scaling of resources servicing end-users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communication between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments. In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; iv) a decentralized approach for the verification of access rights to resources.
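As an illustration of the function-composition idea in point i), here is a minimal, hypothetical sketch in Python; the function names and the compose helper are assumptions for illustration only and do not reflect the actual APERTO FaaS API.

import asyncio

# Simple, ready-to-use functions (illustrative stand-ins for deployed FaaS functions).
async def fetch_weather(city: str) -> dict:
    await asyncio.sleep(0.01)                 # stands in for an asynchronous service call
    return {"city": city, "forecast": "sunny"}

async def fetch_events(city: str) -> dict:
    await asyncio.sleep(0.01)
    return {"city": city, "events": ["food market", "museum night"]}

def compose(*funcs):
    """Compose independent functions into one workload: run them concurrently
    and merge their results into a single response."""
    async def workflow(city: str) -> dict:
        results = await asyncio.gather(*(f(city) for f in funcs))
        merged = {}
        for r in results:
            merged.update(r)
        return merged
    return workflow

# Usage: a "smart tourism" briefing built from two simple functions.
city_briefing = compose(fetch_weather, fetch_events)
print(asyncio.run(city_briefing("Bologna")))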

Relevance:

100.00%

Publisher:

Abstract:

This dissertation investigates the relations between logic and theoretical computer science (TCS) in the probabilistic setting. It is motivated by two main considerations. On the one hand, since their appearance in the 1960s-1970s, probabilistic models have become increasingly pervasive in several fast-growing areas of CS. On the other, the study and development of (deterministic) computational models has considerably benefitted from the mutual interchange between logic and CS. Nevertheless, probabilistic computation has only been marginally touched by such fruitful interactions. The goal of this thesis is precisely to (start) bridging this gap, by developing logical systems corresponding to specific aspects of randomized computation and, therefore, by generalizing standard achievements to the probabilistic realm. To do so, our key ingredient is the introduction of new, measure-sensitive quantifiers associated with quantitative interpretations. The dissertation is tripartite. In the first part, we focus on the relation between logic and counting complexity classes. We show that, by means of our classical counting propositional logic, it is possible to generalize to counting classes the standard results by Cook and by Meyer and Stockmeyer linking propositional logic and the polynomial hierarchy. Indeed, we show that the validity problem for counting-quantified formulae captures the corresponding level in Wagner's hierarchy. In the second part, we consider programming language theory. Type systems for randomized λ-calculi, also guaranteeing various forms of termination properties, were introduced in recent decades, but these are not "logically oriented" and no Curry-Howard correspondence is known for them. Following intuitions coming from counting logics, we define the first probabilistic version of the correspondence. Finally, we consider the relationship between arithmetic and computation. We present a quantitative extension of the language of arithmetic able to formalize basic results from probability theory. This language is also our starting point for defining randomized bounded theories and, thus, for generalizing canonical results by Buss.
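As a schematic illustration of a measure-sensitive quantifier over propositional valuations (an assumed reading for illustration only; the dissertation's exact syntax and semantics are not given in the abstract):

\[
  \models C^{q} A
  \;\iff\;
  \frac{\bigl|\{\, v : \mathrm{Var}(A) \to \{0,1\} \mid v \models A \,\}\bigr|}{2^{|\mathrm{Var}(A)|}} \;\ge\; q ,
\]

where $C^{q}$ asserts that at least a fraction $q$ of the valuations of $A$'s variables satisfy $A$; for instance, $C^{1/2}(p \lor q)$ holds because three of the four valuations of $p, q$ satisfy $p \lor q$.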

Relevance:

100.00%

Publisher:

Abstract:

Although benign epilepsy with centrotemporal spikes (BECTS) is an idiopathic, age-related epilepsy syndrome with favorable outcome, recent studies have shown impairment in specific neuropsychological tests. The objective of this study was to analyze the comorbidity between dyslexia and BECTS. Thirty-one patients with clinical and electroencephalographic diagnosis of BECTS (group A) and 31 paired children (group B) underwent a language and neuropsychological assessment performed with several standardized protocols. Our findings were categorized as: a) dyslexia; b) other difficulties; c) without difficulties. Our results were compared and statistically analyzed. Our data showed that dyslexia occurred in 19.4% and other difficulties in 74.2% of our patients. This was highly significant when compared with the control group (p<0.001). Phonological awareness, writing, reading, arithmetic, and memory tests showed a statistically significant difference when comparing both groups. Our findings show significant evidence of the occurrence of dyslexia in patients with BECTS.

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of buildings, transportation, machinery, businesses, and life-maintaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of having good adaptation capability, these programs work just like vaccines against diseases and are not able to prevent new infections based on the network state. Here, an attempt to model computer virus propagation dynamics relates it to other notable events occurring in the network, permitting preventive policies to be established for network management. Data from three different viruses are collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, are applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
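A minimal sketch of the autoregressive identification step, under assumptions: the lag order, the least-squares fit and the synthetic daily infection counts are illustrative; the paper's actual identification procedure and data are not reproduced.

import numpy as np

def fit_ar(series, order=3):
    """Fit x[t] = a1*x[t-1] + ... + ap*x[t-p] + c by ordinary least squares."""
    rows = [series[t - order:t][::-1] for t in range(order, len(series))]
    X = np.column_stack([np.array(rows), np.ones(len(rows))])   # lags + intercept
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_ar(series, coef, steps=7):
    """Roll the fitted model forward, feeding predictions back in as new lags."""
    order = len(coef) - 1
    history = list(series[-order:])
    preds = []
    for _ in range(steps):
        lags = history[-order:][::-1]                 # [x[t-1], ..., x[t-order]]
        nxt = float(np.dot(coef[:-1], lags) + coef[-1])
        preds.append(nxt)
        history.append(nxt)
    return preds

# Usage with synthetic daily infection counts.
counts = np.array([5, 8, 13, 21, 30, 41, 50, 62, 70, 81, 90, 97], dtype=float)
print(forecast_ar(counts, fit_ar(counts, order=3), steps=5))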

Relevance:

100.00%

Publisher:

Abstract:

A multi-pumping flow system exploiting prior assay is proposed for sequential turbidimetric determination of sulphate and chloride in natural waters. Both methods are implemented in the same manifold, which provides facilities for: in-line sample clean-up with a Bio-Rex 70 mini-column with fluidized beads; addition of low amounts of sulphate or chloride ions to the reaction medium for improving supersaturation; analyte precipitation with Ba(2+) or Ag(+); and real-time decision on the need for the next assay. The sample is initially run for chloride determination, and the analytical signal is compared with a preset value. If higher, the sample is run again, now for sulphate determination. The strategy may lead to an increased sample throughput. The proposed system is computer-controlled and presents enhanced figures of merit. About 10 samples are run per hour (about 60 measurements), and results are reproducible and unaffected by the presence of potentially interfering ions at concentration levels usually found in natural waters. Accuracy was assessed against ion chromatography. (C) 2008 Elsevier B.V. All rights reserved.
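The real-time prior-assay decision described above can be summarized by a short control sketch (hypothetical function names; the actual instrument control software is not shown in the abstract):

def run_prior_assay(sample, measure_chloride, measure_sulphate, chloride_preset):
    """Prior-assay control logic (illustrative): chloride is determined first, and
    the sulphate assay is triggered only if the chloride signal exceeds the preset value."""
    results = {"chloride": measure_chloride(sample)}
    if results["chloride"] > chloride_preset:
        results["sulphate"] = measure_sulphate(sample)
    return results

# Usage with stand-in measurement callbacks returning turbidimetric signals.
report = run_prior_assay("sample_01", lambda s: 0.42, lambda s: 0.31, chloride_preset=0.30)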

Relevance:

100.00%

Publisher:

Abstract:

Ceriporiopsis subvermispora is a white-rot fungus used in biopulping processes and seems to use the fatty acid peroxidation reactions initiated by manganese peroxidase (MnP) to start lignin degradation. The present work shows that C. subvermispora was able to peroxidize unsaturated fatty acids during wood biotreatment under biopulping conditions. In vitro assays showed that the extent of linoleic acid peroxidation was positively correlated with the level of MnP recovered from the biotreated wood chips. Milled wood was treated in vitro with partially purified MnP and linoleic acid. UV spectroscopy and size exclusion chromatography (SEC) showed that soluble compounds similar to lignin were released from the milled wood. SEC data showed a broad elution profile compatible with low molar mass lignin fractions. MnP-treated milled wood was analyzed by thioacidolysis. The yield of thioacidolysis monomers recovered from guaiacyl and syringyl units decreased by 33% and 20%, respectively, in MnP-treated milled wood. This suggests that lignin depolymerization reactions occurred during the MnP/linoleic acid treatment. (C) 2009 Elsevier Inc. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

The design, construction, and characterization of a portable opto-coupled potentiostat are presented. The potentiostat is battery-powered and managed by a microcontroller, which implements cyclic voltammetry (CV) using suitable sensor electrodes. Its opto-coupling permits a wide range of current measurements, from mA down to nA. Two software interfaces were developed to perform the CV measurement: a virtual instrument for a personal computer (PC) and a C-based interface for a personal digital assistant (PDA). The potentiostat has been evaluated by detection of potassium ferrocyanide in KCl medium, with both macro- and microelectrodes. There was good agreement between the instrumental results and those from commercial equipment.
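A minimal sketch of the triangular potential sweep that a cyclic voltammetry controller generates (parameter values are illustrative; the firmware of the described potentiostat is not reproduced):

import numpy as np

def cv_waveform(e_start, e_vertex, scan_rate, sample_rate):
    """One CV cycle: ramp the applied potential from e_start to e_vertex and back
    at the given scan rate (V/s), sampled at sample_rate (Hz)."""
    duration = abs(e_vertex - e_start) / scan_rate     # seconds per half-cycle
    n = int(duration * sample_rate)
    forward = np.linspace(e_start, e_vertex, n)
    backward = np.linspace(e_vertex, e_start, n)
    return np.concatenate([forward, backward])

# Example: 0.0 V -> 0.6 V -> 0.0 V at 50 mV/s, sampled at 100 Hz.
potentials = cv_waveform(0.0, 0.6, 0.05, 100)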

Relevance:

100.00%

Publisher:

Abstract:

Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple systems identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model. (c) 2008 Elsevier Ltd. All rights reserved.
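As a schematic illustration only, the following Euler-integration sketch uses four compartments named as in SAIC; the transition terms are assumptions chosen for illustration and are not the paper's actual SAIC equations.

def simulate_saic(s0, a0, i0, c0, beta=0.3, gamma=0.05, delta=0.1, dt=0.1, steps=1000):
    """Euler integration of a four-compartment (Susceptible, Antidotal, Infectious,
    Contaminated) model. The flow terms below are illustrative assumptions."""
    S, A, I, C = s0, a0, i0, c0
    history = []
    for _ in range(steps):
        n = S + A + I + C
        new_infections = beta * S * I / n    # susceptible machines become infectious
        cleaned = gamma * I                  # infectious machines get antidotal protection
        dormant = delta * I                  # infectious machines become contaminated carriers
        S += dt * (-new_infections)
        A += dt * cleaned
        I += dt * (new_infections - cleaned - dormant)
        C += dt * dormant
        history.append((S, A, I, C))
    return history

# Usage: 10,000 machines, one initial infection.
trajectory = simulate_saic(9999.0, 0.0, 1.0, 0.0)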

Relevance:

100.00%

Publisher:

Abstract:

This pilot project at Cotton Tree, Maroochydore, occupies two adjacent, linear parcels of land, one privately owned and the other owned by the public housing authority. Both owners commissioned Lindsay and Kerry Clare to design housing for their separate needs, which enabled the two projects to be governed by a single planning and design strategy. This entailed the realignment of the dividing boundary to form two approximately square blocks, which made possible the retention of an important stand of mature paperbark trees and gave each block a more useful street frontage. The scheme provides seven two-bedroom units and one single-bedroom unit as the private component, with six single-bedroom units, three two-bedroom units and two three-bedroom units forming the public housing. The dwellings are deployed as an interlaced mat of freestanding blocks, car courts, courtyard gardens, patios and decks. The key distinction between the public and private parts of the scheme is the pooling of the car parking spaces in the public housing to create a shared courtyard. The housing climbs to three storeys on its southern edge and falls to a single storey on the north-western corner. This enables all units and the principal private outdoor spaces to have a northern orientation. The interiors of both the public and private units are skilfully arranged to take full advantage of views, light and breeze.
