888 results for Parallel processing (Electronic computers) - Research
Abstract:
Very large spatially referenced datasets, for example those derived from satellite-based sensors that sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over short time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms respond within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets presents a number of problems, particularly where maximum likelihood inference is used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics. By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the dual-core processors found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic datasets and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake dataset.
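To make the complexity argument concrete, a minimal sketch follows, assuming an exponential covariance model, a simple coordinate ordering and a neighbour count m. It contrasts the exact Gaussian log-likelihood, whose covariance matrix needs O(n^2) memory and whose Cholesky factorisation needs O(n^3) time, with a Vecchia [1988]-style composite likelihood whose independent terms can be mapped across processor cores. The function names and parameter values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def exp_cov(D, sill=1.0, rng=10.0, nugget=1e-6):
    """Exponential covariance; any valid covariance model could be substituted."""
    return sill * np.exp(-D / rng) + nugget * (D == 0)

def exact_loglik(X, y, sill=1.0, rng=10.0):
    """Exact Gaussian log-likelihood: the covariance matrix costs O(n^2)
    memory and its Cholesky factorisation O(n^3) time."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    L = np.linalg.cholesky(exp_cov(D, sill, rng))
    a = np.linalg.solve(L, y)
    return -0.5 * (a @ a) - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)

def vecchia_term(i, X, y, sill, rng, m):
    """Log density of y[i] conditioned on its m nearest 'previous' points."""
    var0 = exp_cov(np.array(0.0), sill, rng)
    if i == 0:
        return -0.5 * (np.log(2 * np.pi * var0) + y[0] ** 2 / var0)
    past = np.arange(i)
    nb = past[np.argsort(np.linalg.norm(X[past] - X[i], axis=-1))[:m]]
    K_nn = exp_cov(np.linalg.norm(X[nb][:, None, :] - X[nb][None, :, :], axis=-1), sill, rng)
    k_in = exp_cov(np.linalg.norm(X[nb] - X[i], axis=-1), sill, rng)
    w = np.linalg.solve(K_nn, k_in)
    mu, var = w @ y[nb], var0 - w @ k_in
    return -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)

def vecchia_loglik(X, y, sill=1.0, rng=10.0, m=20):
    """Composite-likelihood approximation: the terms are mutually independent,
    so they can be mapped across cores (e.g. with multiprocessing.Pool)."""
    order = np.argsort(X[:, 0])            # simple spatial ordering (assumed)
    Xo, yo = X[order], y[order]
    return sum(vecchia_term(i, Xo, yo, sill, rng, m) for i in range(len(yo)))
```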
Abstract:
This thesis discusses the need for nondestructive testing and highlights some of the limitations of present-day techniques. Special attention is given to ultrasonic examination techniques and the problems encountered when they are applied to thick welded plates, and some suggestions are made using signal processing methods. Chapter 2 treats the need for nondestructive testing in the light of economy and safety; a short review of present-day techniques in nondestructive testing is also given. The particular problems of using ultrasonic techniques on welded structures are discussed in Chapter 3, with some examples of elastic wave propagation in welded steel. The limitations in applying sophisticated signal processing techniques to ultrasonic NDT are mainly found in the transducers generating or receiving the ultrasound; Chapter 4 deals with the different transducers used. One of the difficulties with ultrasonic testing is the interpretation of the signals encountered. Similar problems arise in SONAR/RADAR, and Chapter 5 draws some analogies between SONAR/RADAR and ultrasonic nondestructive testing; this chapter also includes a discussion of some of the techniques used in signal processing in general. A signal processing technique found to be especially useful is cross-correlation detection, which is treated in Chapter 6. Electronic digital computers have made signal processing techniques easier to implement; Chapter 7 discusses the use of digital computers in ultrasonic NDT. The experimental equipment used to test cross-correlation detection of ultrasonic signals is described in Chapter 8. Chapter 9 summarises the conclusions drawn during this investigation.
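As a brief illustration of the cross-correlation detection treated in Chapter 6, the sketch below correlates a known transmitted pulse against a noisy received trace and takes the correlation peak as the echo arrival time. The pulse shape, sampling rate and noise level are assumed values, not taken from the thesis.

```python
import numpy as np

fs = 50e6                                    # sampling rate in Hz (assumed)
t = np.arange(0, 4e-6, 1 / fs)
pulse = np.sin(2 * np.pi * 5e6 * t) * np.hanning(t.size)   # 5 MHz tone burst (assumed)

rx = np.random.normal(0.0, 0.5, 4096)        # received trace: noise...
true_delay = 1500                            # ...plus a weak buried echo
rx[true_delay:true_delay + pulse.size] += 0.3 * pulse

corr = np.correlate(rx, pulse, mode="valid") # matched-filter (cross-correlation) output
est_delay = int(np.argmax(np.abs(corr)))     # peak location estimates the echo delay
print(f"estimated echo delay: {est_delay} samples "
      f"({est_delay / fs * 1e6:.2f} us), true delay: {true_delay}")
```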
Abstract:
E-business adoption rates in the agri-food sector are rather low, despite the fact that technical barriers have mostly been overcome in recent years and a large number of sophisticated offerings are available. However, concerns about trust seem to impede the development of electronic relationships in agri-food chains, as trust is of particular importance in any exchange of agri-food products along the value chain. Drawing on existing research, characteristics and dimensions of trust are first identified in both traditional and electronic B2B relationships, and a typology of trust is proposed. The aim of the paper is to provide an overview of the implementation and use of trust elements in e-commerce offerings dedicated to the agri-food sector. This assessment shows the current situation and discusses gaps for further improvement, with the objective of facilitating the uptake of e-commerce in agri-food chains. © 2009 Elsevier B.V. All rights reserved.
Abstract:
We report the impact of cascaded reconfigurable optical add-drop multiplexer (ROADM) induced penalties on coherently detected 28 Gbaud polarization-multiplexed m-ary quadrature amplitude modulation (PM m-ary QAM) WDM channels. We investigate the interplay between different higher-order modulation channels and the effect of the filter shapes and bandwidths of the (de)multiplexers on transmission performance, in a segment of a pan-European optical network with a maximum optical path of 4,560 km (80 km × 57 spans). We verify that if the link capacities are assigned assuming that digital back-propagation (DBP) is available, 25% of the network connections fail when electronic dispersion compensation alone is used. However, the majority of such links can be restored by employing single-channel digital back-propagation with fewer than 15 steps for the whole link, facilitating practical application of DBP. We find that higher-order channels are the most sensitive to nonlinear fiber impairments and filtering effects; however, these formats are less prone to ROADM-induced penalties due to the reduced maximum number of hops. Furthermore, it is demonstrated that a minimum Gaussian filter order of 3 and a bandwidth of 35 GHz enable negligible excess penalty for any modulation order.
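For readers unfamiliar with single-channel digital back-propagation, the sketch below shows the usual split-step idea of propagating the received field through a virtual fibre whose dispersion and Kerr nonlinearity are negated with respect to the transmission fibre. The fibre parameters, step count, sign convention and neglect of loss/gain are simplifying assumptions and do not reproduce the paper's implementation.

```python
import numpy as np

def dbp_single_channel(rx, fs, beta2=-21.7e-27, gamma=1.3e-3,
                       link_km=4560.0, n_steps=15):
    """Split-step back-propagation of a sampled complex baseband field through
    a virtual fibre with negated dispersion (beta2) and nonlinearity (gamma);
    fibre loss and amplifier gain are ignored for brevity (assumption)."""
    field = np.asarray(rx, dtype=complex).copy()
    dz = link_km * 1e3 / n_steps                           # step length in metres
    w = 2 * np.pi * np.fft.fftfreq(field.size, d=1 / fs)   # angular frequency grid
    disp_inv = np.exp(1j * (beta2 / 2) * w**2 * dz)        # negated dispersion step
    for _ in range(n_steps):
        field = np.fft.ifft(np.fft.fft(field) * disp_inv)        # linear (dispersion) step
        field *= np.exp(-1j * gamma * np.abs(field)**2 * dz)     # negated Kerr phase step
    return field
```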
Abstract:
Electronic channel affiliates are important online intermediaries between customers and host retailers; however, no work has studied how online retailers control these intermediaries. By conducting an exploratory content analysis of 85 online contracts between online retailers and their online intermediaries, and by categorizing the governing mechanisms used, this study presents insights into the unique aspects of the control of online intermediaries, including findings regarding incentives, monitoring, and enforcement. Additionally, testable research propositions are presented to guide further theory development, drawing on contract theory, resource dependence theory, and agency theory. Managerial implications are discussed. © 2012 Elsevier Inc.
Abstract:
A technology for recording, storing, and processing texts, based on the creation of integer index cycles, is discussed. Algorithms for exact-match search and for similarity search driven by natural-language queries are considered. The software implementing the proposed approaches is described, and examples of electronic archives with intelligent search capabilities are presented.
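The abstract does not detail the integer index cycle structure, so the sketch below is only a generic illustration of the two search modes mentioned: an integer-keyed inverted index supporting exact-match lookup and a simple Jaccard-based similarity search. The class and method names are hypothetical and are not the authors' data structure.

```python
from collections import defaultdict

class TextIndex:
    """Toy integer-keyed inverted index with exact and similarity search."""

    def __init__(self):
        self.term_ids = {}                   # term -> integer id
        self.postings = defaultdict(set)     # term id -> set of document ids
        self.docs = {}                       # document id -> set of term ids

    def _tid(self, term):
        return self.term_ids.setdefault(term.lower(), len(self.term_ids))

    def add(self, doc_id, text):
        ids = {self._tid(t) for t in text.split()}
        self.docs[doc_id] = ids
        for t in ids:
            self.postings[t].add(doc_id)

    def exact(self, query):
        """Documents containing every query term."""
        sets = [self.postings.get(self.term_ids.get(t.lower()), set())
                for t in query.split()]
        return set.intersection(*sets) if sets else set()

    def similar(self, query, k=3):
        """Top-k documents ranked by Jaccard similarity to the query terms."""
        q = {self.term_ids.get(t.lower()) for t in query.split()} - {None}
        scored = [(len(q & d) / len(q | d), doc)
                  for doc, d in self.docs.items() if q | d]
        return sorted(scored, reverse=True)[:k]
```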
Abstract:
The paper describes the educational complex "Multi-agent Technologies for Parallel and Distributed Information Processing in Telecommunication Networks".
Abstract:
All-optical signal processing is a powerful tool for the processing of communication signals, and optical network applications have been routinely considered since the inception of optical communication. Many optical devices are successfully deployed in today's communication networks, including optical amplification, dispersion compensation, optical cross-connects and reconfigurable add-drop multiplexers. However, despite record-breaking performance, all-optical signal processing devices have struggled to find a viable market niche. This has been mainly due to competition from electro-optic alternatives, whether on the basis of detailed performance analysis or, more usually, because of the limited market opportunity for a mid-link device. For example, a wavelength converter would compete with a reconfigured transponder, which has an additional market as an actual transponder, enabling significantly more economical development. Nevertheless, the potential performance of all-optical devices is enticing. Motivated by their prospects of eventual deployment, in this chapter we analyse the performance and energy consumption of digital coherent transponders, linear coherent repeaters and modulator-based pulse shaping/frequency conversion, setting a benchmark for the proposed all-optical implementations.
Abstract:
The paper describes the history of manuscript digitization at the National Library of the Czech Republic, as well as other issues concerning the processing of manuscripts. The main consequence of massive digitization and of record and/or full-text processing is a paradigm shift leading towards digital history.
Abstract:
The purpose of this paper is to investigate the technological development of electronic inventory solutions from the perspective of patent analysis. We first applied the International Patent Classification to identify the top categories of data processing technologies and their corresponding top patenting countries. We then identified the core technologies by calculating a patent citation strength and applying a standard-deviation criterion to each patent. To eliminate core innovations having no reference relationships with the other core patents, relevance strengths between core technologies were also evaluated. Our findings provide market intelligence not only for the research and development community, but also for decision making on advanced inventory solutions.
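The abstract does not define citation strength or the standard-deviation criterion precisely; one plausible reading, sketched below with toy data, treats citation strength as a patent's forward-citation count and selects core patents whose strength exceeds the mean by one standard deviation. Both readings and the data are assumptions, not the paper's exact definitions.

```python
from statistics import mean, stdev

# Toy citation data (assumed): patent -> patents it cites.
citations = {
    "P1": ["P2", "P3"], "P2": ["P3"], "P3": [],
    "P4": ["P3", "P2"], "P5": ["P3"], "P6": ["P1", "P3"],
}

# Citation strength read here as the forward-citation count of each patent.
strength = {p: 0 for p in citations}
for cited_list in citations.values():
    for cited in cited_list:
        strength[cited] = strength.get(cited, 0) + 1

# Standard-deviation criterion: keep patents whose strength exceeds mean + 1 std.
mu, sigma = mean(strength.values()), stdev(strength.values())
core = [p for p, s in strength.items() if s > mu + sigma]
print("citation strengths:", strength)
print(f"threshold mean+std = {mu + sigma:.2f}; core patents: {core}")
```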
Abstract:
ACM Computing Classification System (1998): D.2.11, D.1.3, D.3.1, J.3, C.2.4.
Abstract:
This exploratory study of a classroom with mentoring and neutral e-mail was conducted in a public commuter state university in South Florida between January 1996 and April 1996. Sixteen males and 83 females from four graduate-level educational research classes participated in the study. Two main hypotheses were tested. Hypothesis One was that those students receiving mentoring e-mail messages would score significantly higher on an instrument measuring attitude toward educational research (ATERS) than those not receiving mentoring e-mail messages. Hypothesis Two was that those students receiving mentoring e-mail would score significantly higher on objective exams covering the educational research material than those not receiving mentoring e-mail. Results of factorial analyses of variance showed no significant differences between the treatment groups in achievement or in attitudes toward educational research. Introverts had lower attitudes and lower final exam grades in both groups, although introverts in the mentored group scored higher than introverts in the neutral group. A t test of the means of total response to e-mail from the researcher showed a significant difference between the mentored and neutral e-mail groups. Introverts responded more often than extraverts in both groups. Teacher effect was significant in determining class response to e-mail messages. Responses were most frequent in the researcher's classes. Qualitative analyses of the e-mail and course evaluation survey and of the content of e-mail messages received by the researcher were then grouped into basic themes and discussed. A qualitative analysis of an e-mail and course evaluation survey revealed that students from both the neutral and mentoring e-mail groups appreciated teacher feedback. A qualitative analysis of the mentoring and neutral e-mail replies divided the responses into those pertaining to the class, such as test and research paper questions, and more personal items, such as problems in the class and personal happenings. At this point in time, e-mail is not a standard way of communicating in classes in the college of education at this university. As this technology tool of communication becomes more popular, it is anticipated that replications of this study will be warranted.
Abstract:
The primary purpose of this investigation is to study the motives of community college faculty who decide not to use computers in teaching. In spite of the fact that many of the environmental blocks that would otherwise inhibit the use of computers have been eliminated at many institutions, many faculty do not use a computer beyond its word-processing function. For the purpose of the study, non-adoption of computers in teaching is defined as not using computers for more than word processing. The issues in the literature focus on resistance and assume a pro-innovation and pro-adoption bias. Previous research on these questions consists primarily of surveys with narrowly focused assumptions. This qualitative research directly asks the participants about their feelings, beliefs, attitudes, experiences, and behaviors in regard to computers in teaching; through the interview process a number of other correlated issues emerged. The investigation was conducted at Miami-Dade Community College, a large urban multicampus institution in Miami-Dade County, Florida, through a series of in-depth phenomenological interviews. There were nine interviewees: eight fit the profile (two of these served as pilots) and one was an extreme opposite of the profile. Each participant was interviewed three times for about 45 minutes. The results indicate that the computer conflicts with the participants' values in regard to their teaching and their beliefs about the nature of knowledge, learning, and the relationship that they wish to maintain with students. Computers require significant changes in these values, beliefs, and consequent behaviors, changes that the participants are not willing to make without overwhelming evidence that they are worth the sacrifice. For the participants, this worth is definable only as it positively improves learning, and even for the experts the evidence is not there. Unlike the innovator, the high-end computer user, these participants are not willing to adopt the computer on faith.