978 results for number processing


Relevance:

30.00%

Publisher:

Abstract:

This paper presents a multilayered architecture that enhances the capabilities of current QA systems and allows different types of complex questions or queries to be processed. The answers to these questions need to be gathered from factual information scattered throughout different documents. Specifically, we designed a specialized layer to process the different types of temporal questions. Complex temporal questions are first decomposed into simple questions, according to the temporal relations expressed in the original question. In the same way, the answers to the resulting simple questions are recomposed, fulfilling the temporal restrictions of the original complex question. A novel aspect of this approach resides in the decomposition, which uses a minimal quantity of resources, with the final aim of obtaining a portable platform that is easily extensible to other languages. In this paper we also present a methodology for evaluating both the decomposition of the questions and the ability of the implemented temporal layer to perform at a multilingual level. The temporal layer was first implemented for English, then evaluated and compared with a) a general-purpose QA system (F-measure 65.47% for QA plus the English temporal layer vs. 38.01% for the general QA system) and b) a well-known QA system; the multilayered system obtained much better results for temporal questions. The system was therefore extended to Spanish, and again very good results were obtained in the evaluation (F-measure 40.36% for QA plus the Spanish temporal layer vs. 22.94% for the general QA system).
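As a toy illustration of the decomposition step described above, a complex temporal question can be split at a temporal signal word into simple sub-questions plus the relation linking them. The signal-word list and splitting rule below are invented for illustration; the paper's actual decomposition grammar is not reproduced here.

```python
import re

def decompose(question):
    """Split a complex temporal question at the first temporal signal word,
    returning the temporal relation plus the simple sub-questions."""
    signals = ["before", "after", "while", "during"]  # illustrative signal set
    for sig in signals:
        parts = re.split(rf"\b{sig}\b", question, maxsplit=1)
        if len(parts) == 2:
            return {"relation": sig,
                    "sub_questions": [p.strip().rstrip("?") for p in parts]}
    # No temporal signal found: treat as a simple question.
    return {"relation": None, "sub_questions": [question]}

q = "Where was Napoleon exiled after he lost the battle of Waterloo?"
print(decompose(q))  # relation "after", two sub-questions
```

The answers to the two sub-questions would then be recomposed under the "after" constraint, mirroring the recomposition step in the abstract.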

The Gaia-ESO Survey is a large public spectroscopic survey that aims to derive radial velocities and fundamental parameters of about 10^5 Milky Way stars in the field and in clusters. Observations are carried out with the multi-object optical spectrograph FLAMES, using simultaneously the medium-resolution (R ~ 20 000) GIRAFFE spectrograph and the high-resolution (R ~ 47 000) UVES spectrograph. In this paper we describe the methods and the software used for the data reduction, the derivation of the radial velocities, and the quality control of the FLAMES-UVES spectra. Data reduction has been performed using a workflow specifically developed for this project. This workflow runs the ESO public pipeline, optimizing the data reduction for the Gaia-ESO Survey, automatically performs sky subtraction, barycentric correction and normalisation, and calculates radial velocities and a first guess of the rotational velocities. The quality control is performed using the output parameters from the ESO pipeline, by a visual inspection of the spectra and by the analysis of the signal-to-noise ratio of the spectra. Using the observations of the first 18 months, specifically targets observed multiple times at different epochs, stars observed with both GIRAFFE and UVES, and observations of radial velocity standards, we estimated the precision and the accuracy of the radial velocities. The statistical error on the radial velocities is σ ~ 0.4 km s^-1 and is mainly due to uncertainties in the zero point of the wavelength calibration. However, we found a systematic bias with respect to the GIRAFFE spectra (~0.9 km s^-1) and to the radial velocities of the standard stars (~0.5 km s^-1) retrieved from the literature. This bias will be corrected in future data releases, when a common zero point for all the set-ups and instruments used for the survey is established.
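The radial-velocity derivation rests on the Doppler shift of spectral lines; a minimal sketch (with made-up wavelengths and correction values, not survey data) of the non-relativistic estimate plus a barycentric correction term:

```python
C_KM_S = 299_792.458  # speed of light, km/s

def radial_velocity(lambda_obs, lambda_rest, v_barycentric=0.0):
    """Non-relativistic estimate v = c * (lambda_obs - lambda_rest) / lambda_rest,
    shifted to the solar-system barycentre by adding v_barycentric (km/s)."""
    return C_KM_S * (lambda_obs - lambda_rest) / lambda_rest + v_barycentric

# H-alpha rest wavelength 6562.8 A; a line observed at 6563.3 A with an
# illustrative barycentric correction of -0.25 km/s:
print(round(radial_velocity(6563.3, 6562.8, v_barycentric=-0.25), 2))  # 22.59
```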

Paper submitted to Euromicro Symposium on Digital Systems Design (DSD), Belek-Antalya, Turkey, 2003.

Thesis (Ph.D.)--University of Washington, 2016-06

A specialised reconfigurable architecture is targeted at wireless base-band processing. It is built to cater for multiple wireless standards, has lower power consumption than processor-based solutions, and can be scaled to run in parallel for processing multiple channels. Test resources and testing strategies are embedded in the architecture. The architecture is functionally partitioned according to the common operations found in wireless standards, such as CRC error correction, convolution and interleaving. These modules are linked via Virtual Wire hardware modules and route-through switch matrices. Data can be processed in any order through this interconnect structure. Virtual Wire ensures the same flexibility as normal interconnects, but reduces the area occupied and the number of switches needed. The testing algorithm exhaustively scans all possible paths within the interconnection network and searches for faults in the processing modules. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by making loops from the shared memory to each of the switches. The local switch matrix is tested in the same way. Next the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. This paper compares various base-band processing solutions, describes the proposed platform and its implementation, outlines the test resources and algorithm, and concludes with the mapping of Bluetooth and GSM base-band processing onto the platform.
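The loop-back testing idea above can be sketched abstractly: drive a known pattern through each switch and compare what comes back. The switch model below is purely illustrative, not the architecture's actual interface.

```python
def loopback_test(switches, pattern=0xA5):
    """Drive a test pattern through each switch in turn and read it back;
    any switch that fails to return the pattern is reported as faulty."""
    return [i for i, route in enumerate(switches) if route(pattern) != pattern]

# Healthy switches echo the pattern back; index 2 is modelled as stuck-at-zero.
switches = [lambda p: p, lambda p: p, lambda p: 0, lambda p: p]
print(loopback_test(switches))  # [2]
```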

Caffeine is known to increase arousal, attention, and information processing, all factors implicated in facilitating persuasion. In a standard attitude-change paradigm, participants consumed an orange-juice drink that either contained caffeine (3.5 mg/kg body weight) or did not (placebo) prior to reading a counterattitudinal communication (anti-voluntary euthanasia). Participants then completed a thought-listing task and a number of attitude scales. The first experiment showed that those who consumed caffeine showed greater agreement with the communication (direct attitude: voluntary euthanasia) and on an issue related to, but not contained in, the communication (indirect attitude: abortion). The order in which direct and indirect attitudes were measured did not affect the results. A second experiment manipulated the quality of the arguments in the message (strong vs. weak) to determine whether systematic processing had occurred. There was evidence that systematic processing occurred in both drink conditions, but was greater for those who had consumed caffeine. In both experiments, the amount of message-congruent thinking mediated persuasion. These results show that caffeine can increase the extent to which people systematically process and are influenced by a persuasive communication.

Quantile computation has many applications, including data mining and financial data analysis. It has been shown that an ε-approximate summary can be maintained so that, given a quantile query (φ, ε), the data item at rank ⌈φN⌉ may be approximately obtained within rank error precision εN over all N data items in a data stream or in a sliding window. However, scalable online processing of massive continuous quantile queries with different φ and ε poses a new challenge because the summary is continuously updated with new arrivals of data items. In this paper, first we aim to dramatically reduce the number of distinct query results by grouping a set of different queries into a cluster so that they can be processed virtually as a single query while the precision requirements from users are retained. Second, we aim to minimize the total query processing costs. Efficient algorithms are developed to minimize the total number of times clusters are reprocessed and to produce the minimum number of clusters, respectively. The techniques are extended to maintain near-optimal clustering when queries are registered and removed in an arbitrary fashion against whole data streams or sliding windows. In addition to theoretical analysis, our performance study indicates that the proposed techniques are indeed scalable with respect to the number of input queries as well as the number of items and the item arrival rate in a data stream.
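A minimal sketch of the (φ, ε) query semantics and of the clustering criterion implied above, under the assumption that two queries can share a cluster exactly when their allowed rank-error bands intersect (the paper's actual clustering algorithms are more involved, and a real summary stores far fewer than N items):

```python
import math

def quantile_answer(sorted_items, phi):
    """Exact answer to a quantile query: the item at rank ceil(phi*N), 1-based.
    An epsilon-approximate summary may return any item within eps*N ranks."""
    rank = max(1, math.ceil(phi * len(sorted_items)))
    return sorted_items[rank - 1]

def bands_overlap(q1, q2, n):
    """Two (phi, eps) queries can be clustered and answered as one when their
    allowed rank bands [ceil(phi*n) - eps*n, ceil(phi*n) + eps*n] intersect."""
    (p1, e1), (p2, e2) = q1, q2
    lo1, hi1 = math.ceil(p1 * n) - e1 * n, math.ceil(p1 * n) + e1 * n
    lo2, hi2 = math.ceil(p2 * n) - e2 * n, math.ceil(p2 * n) + e2 * n
    return lo1 <= hi2 and lo2 <= hi1

data = list(range(1, 101))                             # N = 100 items
print(quantile_answer(data, 0.5))                      # 50
print(bands_overlap((0.50, 0.02), (0.51, 0.02), 100))  # True: one cluster
print(bands_overlap((0.10, 0.01), (0.90, 0.01), 100))  # False
```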

The biological underpinnings of human intelligence remain enigmatic. Considerable confusion and controversy remain regarding the mechanisms that enable humans to conceptualize, plan, and prioritize, and why humans are set apart from other animals in their cognitive abilities. Here we demonstrate that the basic neuronal building block of the cerebral cortex, the pyramidal cell, is characterized by marked differences in structure among primate species. Moreover, comparison of the complexity of neuron structure with the size of the cortical area/region in which the cells are located revealed that trends in the granular prefrontal cortex (gPFC) were dramatically different from those in visual cortex. More specifically, pyramidal cells in the gPFC of humans had a disproportionately high number of spines. As neuron structure determines both its biophysical properties and connectivity, the differences in dendritic complexity observed here endow neurons with different computational abilities. Furthermore, cortical circuits composed of neurons with distinguishable morphologies will likely be characterized by different functional capabilities. We propose that (1) circuitry in V1, V2, and gPFC within any given species differs in its functional capabilities and (2) there are dramatic differences in the functional capabilities of gPFC circuitry in different species, which are central to the different cognitive styles of primates. In particular, the highly branched, spinous neurons in the human gPFC may be a key component of human intelligence. (C) 2005 Wiley-Liss, Inc.

To investigate the stability of trace reactivation in healthy older adults, 22 older volunteers with no significant neurological history participated in a cross-modal priming task. Whilst both object-relative centre-embedded (ORC) and object-relative right-branching (ORR) sentences were employed, working memory load was reduced by limiting the number of words separating the antecedent from the gap for both sentence types. Analysis of the results did not reveal any significant trace reactivation for the ORC or ORR sentences. The results did reveal, however, a positive correlation between age and semantic priming at the pre-gap position and a negative correlation between age and semantic priming at the gap position for ORC sentences. In contrast, there was no correlation between age and priming effects for the ORR sentences. These results indicate that trace reactivation may be sensitive to a variety of age-related factors, including lexical activation and working memory. The implications of these results for sentence processing in the older population are discussed.
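The age/priming analyses above are correlational; for illustration, a plain Pearson correlation on invented toy numbers (not the study's data):

```python
def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ages = [61, 65, 70, 74, 78, 82]        # invented participant ages
priming = [12, 15, 19, 22, 27, 30]     # invented pre-gap priming effects (ms)
print(round(pearson_r(ages, priming), 3))  # close to +1: priming grows with age
```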

A k-NN query finds the k nearest neighbors of a given point from a point database. When it is sufficient to measure object distance using the Euclidean distance, the key to efficient k-NN query processing is to fetch and check the distances of a minimum number of points from the database. For many applications, such as vehicle movement along road networks or rover and animal movement along terrain surfaces, the distance is only meaningful when it is along a valid movement path. For this type of k-NN queries, the focus of efficient query processing is to minimize the cost of computing distances using the environment data (such as the road network data and the terrain data), which can be several orders of magnitude larger than that of the point data. Efficient processing of k-NN queries based on the Euclidean distance or the road network distance has been investigated extensively in the past. In this paper, we investigate the problem of surface k-NN query processing, where the distance is calculated from the shortest path along a terrain surface. This problem is very challenging, as the terrain data can be very large and the computational cost of finding shortest paths is very high. We propose an efficient solution based on multiresolution terrain models. Our approach eliminates the costly process of finding shortest paths by ranking objects using estimated lower and upper bounds of distance on multiresolution terrain models.
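The bound-based ranking idea can be sketched for the k = 1 case: an object whose upper distance bound lies below every other candidate's lower bound is certainly the nearest neighbour, so its exact surface distance never needs to be computed. The candidate names and bounds below are made up for illustration.

```python
def certain_nn(candidates):
    """candidates: name -> (lower, upper) bound on surface distance.
    If one candidate's upper bound is below every other candidate's lower
    bound, it is certainly the 1-NN; otherwise return None (exact, expensive
    shortest-path distances would then be needed to break the tie)."""
    for name, (lb, ub) in candidates.items():
        if all(ub < other_lb
               for other, (other_lb, _) in candidates.items() if other != name):
            return name
    return None

bounds = {"a": (1.0, 1.5), "b": (2.0, 3.0), "c": (1.8, 2.5)}
print(certain_nn(bounds))  # a: its upper bound 1.5 beats lower bounds 2.0 and 1.8
```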

A specialised reconfigurable architecture for telecommunication base-band processing is augmented with testing resources. The routing network is linked via virtual wire hardware modules to reduce the area occupied by connecting buses. The number of switches within the routing matrices is also minimised, which increases throughput without sacrificing flexibility. The testing algorithm was developed to systematically search for faults in the processing modules and the flexible high-speed routing network within the architecture. The testing algorithm starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by making loops from the shared memory to each of the switches. The local switch matrix is also tested in the same way. Next the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. This algorithm scans all possible paths within the interconnection network exhaustively and reports all faults. Strategies can be inserted to bypass minor faults.

The physical implementation of quantum information processing is one of the major challenges of current research. In the last few years, several theoretical proposals and experimental demonstrations on a small number of qubits have been carried out, but a quantum computing architecture that is straightforwardly scalable, universal, and realizable with state-of-the-art technology is still lacking. In particular, a major ultimate objective is the construction of quantum simulators, yielding massively increased computational power in simulating quantum systems. Here we investigate promising routes towards the actual realization of a quantum computer, based on spin systems. The first one employs molecular nanomagnets with a doublet ground state to encode each qubit and exploits the wide chemical tunability of these systems to obtain the proper topology of inter-qubit interactions. Indeed, recent advances in coordination chemistry allow us to arrange these qubits in chains, with tailored interactions mediated by magnetic linkers. These act as switches of the effective qubit-qubit coupling, thus enabling the implementation of one- and two-qubit gates. Molecular qubits can be controlled either by uniform magnetic pulses or by local electric fields. We introduce here two different schemes for quantum information processing with either global or local control of the inter-qubit interaction and demonstrate the high performance of these platforms by simulating the system time evolution with state-of-the-art parameters. The second architecture we propose is based on a hybrid spin-photon qubit encoding, which combines the best characteristics of photons, whose mobility is exploited to efficiently establish long-range entanglement, and of spin systems, which ensure long coherence times. The setup consists of spin ensembles coherently coupled to single photons within superconducting coplanar waveguide resonators. The tunability of the resonators' frequency is exploited as the only manipulation tool to implement a universal set of quantum gates, by bringing the photons into and out of resonance with the spin transition. The time evolution of the system subject to the pulse sequence used to implement complex quantum algorithms has been simulated by numerically integrating the master equation for the system density matrix, thus including the harmful effects of decoherence. Finally, a scheme to overcome the leakage of information due to inhomogeneous broadening of the spin ensemble is pointed out. Both of the proposed setups are based on state-of-the-art technological achievements. By extensive numerical experiments we show that their performance is remarkably good, even for the implementation of long sequences of gates used to simulate interesting physical models. Therefore, the systems examined here are really promising building blocks of future scalable architectures and can be used for proof-of-principle experiments of quantum information processing and quantum simulation.
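As a minimal illustration of numerically integrating a master equation (far simpler than the spin-ensemble simulations described above), consider a single qubit coherence rho01 obeying d(rho01)/dt = -(i*omega + gamma/2) * rho01, i.e. free precession at frequency omega with dephasing rate gamma; a forward-Euler integration tracks the analytic solution:

```python
import cmath

def evolve_coherence(rho01, omega, gamma, t, steps=100_000):
    """Forward-Euler integration of d(rho01)/dt = -(1j*omega + gamma/2) * rho01."""
    dt = t / steps
    for _ in range(steps):
        rho01 += dt * (-(1j * omega + gamma / 2) * rho01)
    return rho01

rho = evolve_coherence(0.5 + 0j, omega=2.0, gamma=0.1, t=1.0)
exact = 0.5 * cmath.exp(-(1j * 2.0 + 0.05) * 1.0)  # analytic solution
print(abs(rho - exact))  # small discretisation error
```

Realistic simulations replace this scalar ODE with the full density-matrix Lindblad equation and far richer pulse sequences, but the integration principle is the same.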


One of the major problems associated with communication via a loudspeaking telephone (LST) is that, using analogue processing, duplex transmission is limited to low-loss lines and produces a low acoustic output. An architecture for an instrument has been developed and tested which uses digital signal processing to provide duplex transmission between a LST and a telephone handset over most of the B.T. network. Digital adaptive filters are used in the duplex LST to cancel coupling between the loudspeaker and microphone, and across the transmit-to-receive paths of the 2-to-4-wire converter. Normal movement of a person in the acoustic path causes a loss of stability by increasing the level of coupling from the loudspeaker to the microphone, since there is a lag associated with the adaptive filters learning about a non-stationary path. Control of the loop stability and of the level of sidetone heard by the handset user is by a microprocessor, which continually monitors the system and regulates the gain. The result is a system which offers the best compromise available based on a set of measured parameters. A theory has been developed which gives the loop stability requirements based on the error between the parameters of the filter and those of the unknown path. The programme to develop a low-cost adaptive filter for the LST produced a unique architecture which has a number of features not available in any similar system. These include automatic compensation for the rate of adaptation over a 36 dB range of output level, 4 rates of adaptation (with a maximum of 465 dB/s), plus the ability to cascade up to 4 filters without loss of performance. A theory has also been developed to determine the adaptation which can be achieved using finite-precision arithmetic. This enabled the development of an architecture which distributed the normalisation required to achieve the optimum rate of adaptation over the useful input range. Comparison of theory and measurement for the adaptive filter shows very close agreement. A single experimental LST was built and tested on connections to handset telephones over the BT network. The LST demonstrated that duplex transmission was feasible using signal processing and produced a more comfortable means of communication between people than methods employing deep voice-switching to regulate the local-loop gain. However, with the current level of processing power, it is not a panacea, and attention must be directed toward the physical acoustic isolation between loudspeaker and microphone.
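The echo-cancellation core can be illustrated with a standard normalised LMS (NLMS) adaptive filter: adapt FIR taps so the filter output cancels the loudspeaker-to-microphone echo from the microphone signal. This is a generic textbook sketch with synthetic signals, not the thesis's distributed-normalisation architecture.

```python
import random

def nlms_cancel(far_end, mic, taps=8, mu=0.5, eps=1e-8):
    """Adapt an FIR filter to the echo path; return the residual signal
    (microphone minus the filter's echo estimate)."""
    w = [0.0] * taps
    residual = []
    for n in range(len(mic)):
        x = [far_end[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))        # echo estimate
        e = mic[n] - y                                  # residual after cancellation
        norm = sum(xi * xi for xi in x) + eps
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, x)]  # NLMS update
        residual.append(e)
    return residual

random.seed(0)
far = [random.uniform(-1.0, 1.0) for _ in range(2000)]  # far-end (loudspeaker) signal
echo_path = [0.6, -0.3, 0.1]                            # toy loudspeaker-to-mic path
mic = [sum(h * (far[n - k] if n - k >= 0 else 0.0)
           for k, h in enumerate(echo_path)) for n in range(len(far))]
residual = nlms_cancel(far, mic)
print(abs(residual[0]), abs(residual[-1]))  # residual shrinks as the filter adapts
```

The normalisation by the input power (`norm`) is what keeps the adaptation rate stable across input levels, which is the property the thesis's finite-precision analysis is concerned with.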

This thesis describes the design and engineering of a pressurised biomass gasification test facility. A detailed examination of the major elements within the plant has been undertaken in relation to specification of equipment, evaluation of options and final construction. The retrospective project assessment was developed from consideration of relevant literature and theoretical principles. The literature review includes a discussion on legislation and applicable design codes. From this analysis, each of the necessary equipment units was reviewed and important design decisions and procedures highlighted and explored. Particular emphasis was placed on examination of the stringent demands of the ASME VIII design codes. The inter-relationship of functional units was investigated and areas of deficiency, such as biomass feeders and gas cleaning, have been commented upon. Finally, plant costing was summarized in relation to the plant design and proposed experimental programme. The main conclusion drawn from the study is that pressurised gasification of biomass is far more difficult and expensive to support than atmospheric gasification. A number of recommendations have been made regarding future work in this area.