882 results for parallel processing systems


Relevance: 80.00%

Abstract:

General-purpose parallel processing for solving day-to-day industrial problems has been slow to develop, partly because of the lack of suitable hardware from well-established, mainstream computer manufacturers and of suitably parallelized application software. This paper describes the parallelization of a CFD (computational fluid dynamics) flow solution code known as ESAUNA. This code is part of SAUNA, a large CFD suite aimed at computing the flow around very complex aircraft configurations, up to and including complete aircraft. A novel feature of the SAUNA suite is that it is designed to use block-structured hexahedral grids, unstructured tetrahedral grids, or a hybrid combination of both grid types. ESAUNA is designed to solve the Euler equations or the Navier-Stokes equations, the latter in conjunction with various turbulence models. Two fundamental parallelization concepts are used, namely grid partitioning and encapsulation of communications. Grid partitioning is applied to both the block-structured and the unstructured grid modules. ESAUNA can also be coupled with other simulation codes for multidisciplinary computations, such as flow simulations around an aircraft coupled with flutter prediction for transient flight simulations.
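
The abstract names two concepts, grid partitioning and encapsulation of communications, without implementation detail. The following is a minimal illustrative sketch (not the ESAUNA code) of a one-dimensional block decomposition with ghost cells, where the halo exchange stands in for the encapsulated message passing; all names are hypothetical.

```python
import numpy as np

def partition(grid, nparts):
    """Split a 1-D grid into contiguous blocks, each padded with one
    ghost cell per side to hold copies of neighbouring boundary values."""
    blocks = np.array_split(grid, nparts)
    return [np.pad(b, 1, mode="edge") for b in blocks]

def exchange_halos(blocks):
    """Encapsulated communication step: refresh every ghost cell from the
    owning neighbour before the next local update (stands in for MPI)."""
    for i, b in enumerate(blocks):
        if i > 0:
            b[0] = blocks[i - 1][-2]   # left ghost <- neighbour's right edge
        if i < len(blocks) - 1:
            b[-1] = blocks[i + 1][1]   # right ghost <- neighbour's left edge

def smooth_step(blocks):
    """One Jacobi-style relaxation sweep, done independently per block."""
    for i, b in enumerate(blocks):
        blocks[i][1:-1] = 0.5 * (b[:-2] + b[2:])

grid = np.linspace(0.0, 1.0, 16)
blocks = partition(grid, nparts=4)
for _ in range(10):
    exchange_halos(blocks)
    smooth_step(blocks)
print(np.concatenate([b[1:-1] for b in blocks]))
```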

Relevance: 80.00%

Abstract:

Multilevel algorithms are a successful class of optimisation techniques for the mesh partitioning problem. They usually combine a graph contraction algorithm with a local optimisation method which refines the partition at each graph level. To date these algorithms have been used almost exclusively to minimise the cut-edge weight; however, it has been shown that for certain classes of solution algorithm the convergence of the solver is strongly influenced by the subdomain aspect ratio. In this paper, therefore, we modify the multilevel algorithms in order to optimise a cost function based on aspect ratio. Several variants of the algorithms are tested and shown to provide excellent results.
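
Purely as an illustration of replacing the cut-edge objective with a shape-based one (this is not the authors' algorithm, and it sketches only the local-refinement half of the multilevel scheme): each subdomain is scored by the aspect ratio of its bounding box, and single elements are moved greedily whenever the move lowers the total cost.

```python
import numpy as np

def box_aspect(pts):
    """Aspect ratio of a subdomain's bounding box (>= 1, lower is better)."""
    if len(pts) == 0:
        return np.inf                       # forbid emptying a subdomain
    w, h = np.ptp(pts, axis=0) + 1e-9
    return max(w, h) / min(w, h)

def cost(pts, part, k):
    """Aspect-ratio-based cost: sum of subdomain bounding-box ratios."""
    return sum(box_aspect(pts[part == p]) for p in range(k))

def refine(pts, part, k, sweeps=5):
    """Greedy local refinement driven by the aspect-ratio cost."""
    for _ in range(sweeps):
        for i in range(len(pts)):
            best, cur = part[i], cost(pts, part, k)
            for p in range(k):
                part[i] = p
                c = cost(pts, part, k)
                if c < cur:
                    best, cur = p, c
            part[i] = best
    return part

rng = np.random.default_rng(0)
pts = rng.random((60, 2))                   # element centres in the plane
part = rng.integers(0, 2, 60)               # random initial 2-way partition
part = refine(pts, part, k=2)
print("final cost:", cost(pts, part, 2))
```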

Relevance: 80.00%

Abstract:

This paper deals with the measure of Aspect Ratio for mesh partitioning and gives hints as to why, for certain solvers, the Aspect Ratio of partitions plays an important role. We define and evaluate different kinds of Aspect Ratio, present a new center-based partitioning method which optimizes this measure implicitly, and rate several existing partitioning methods and tools under the criterion of Aspect Ratio.
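
The abstract does not reproduce the paper's definitions, so as a concrete stand-in, here are two common candidate measures of aspect ratio for a polygonal subdomain: a bounding-box ratio and an isoperimetric-style ratio that equals 1 for a circle. Both are assumptions for illustration, not the paper's exact formulas.

```python
import numpy as np

def ar_bbox(poly):
    """Bounding-box definition: longest side over shortest side."""
    w, h = np.ptp(poly, axis=0)
    return max(w, h) / min(w, h)

def ar_perimeter(poly):
    """Isoperimetric-style definition: perimeter^2 / (4*pi*area),
    equal to 1 for a circle and growing for stretched shapes."""
    closed = np.vstack([poly, poly[:1]])
    per = np.sum(np.linalg.norm(np.diff(closed, axis=0), axis=1))
    x, y = poly[:, 0], poly[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    return per**2 / (4 * np.pi * area)

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
strip  = np.array([[0, 0], [4, 0], [4, .25], [0, .25]], float)
for name, p in [("square", square), ("strip", strip)]:
    print(name, round(ar_bbox(p), 2), round(ar_perimeter(p), 2))
```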

Relevance: 80.00%

Abstract:

We propose a novel analysis alternative based on two Fourier transforms for emotion recognition from speech. Fourier analysis allows different signals to be displayed and synthesized in terms of power spectral density distributions. A spectrogram of the voice signal is obtained by performing a short-time Fourier transform with Gaussian windows; this spectrogram portrays frequency-related features such as vocal tract resonances and quasi-periodic excitations during voiced sounds. Emotions induce such characteristics in speech, which become apparent in the spectrogram's time-frequency distribution. The signal's time-frequency representation from the spectrogram is then treated as an image and processed through a two-dimensional Fourier transform in order to perform a spatial Fourier analysis of it. Finally, features related to emotions in voiced speech are extracted and presented.
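
A minimal sketch of the described two-transform pipeline, using SciPy and a synthetic tone in place of a real emotional-speech recording: a short-time Fourier transform with a Gaussian window yields the spectrogram, which is then treated as an image and passed through a two-dimensional FFT.

```python
import numpy as np
from scipy import signal

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
# Stand-in for a voiced utterance: a 120 Hz "glottal" tone plus a
# formant-like component; a real system would load labelled speech.
x = np.sin(2 * np.pi * 120 * t) + 0.4 * np.sin(2 * np.pi * 720 * t)

# First Fourier transform: short-time FT with a Gaussian window.
f, tt, S = signal.stft(x, fs=fs, window=("gaussian", 64), nperseg=512)
spectrogram = np.abs(S)

# Second Fourier transform: treat the spectrogram as an image and take
# its 2-D FFT, giving the spatial-frequency representation described above.
F2 = np.fft.fftshift(np.fft.fft2(spectrogram))
features = np.log1p(np.abs(F2))        # magnitude on a log scale

print(spectrogram.shape, features.shape)
```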

Relevance: 80.00%

Abstract:

Computers employing some degree of data flow organisation are now well established as providing a possible vehicle for concurrent computation. Although data-driven computation frees the architecture from the constraints of the single program counter, processor and global memory inherent in the classic von Neumann computer, there can still be problems with the unconstrained generation of fresh result tokens if a pure data flow approach is adopted. The advantages of allowing serial processing for those parts of a program which are inherently serial, and of permitting a demand-driven as well as data-driven mode of operation, are identified and described. The MUSE machine described here is a structured architecture supporting both serial and parallel processing, which allows the abstract structure of a program to be mapped onto the machine in a logical way.
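
The eager/lazy distinction the abstract draws can be made concrete in a couple of lines. As a loose Python analogy (not MUSE's actual token mechanism), a data-driven node produces results as soon as operands exist, while a demand-driven node computes only when asked:

```python
# Data-driven: every token is produced eagerly, whether or not consumed.
eager = [x * x for x in range(10**6)]      # all result tokens exist at once

# Demand-driven: a token is produced only when a consumer demands it, which
# bounds the "unconstrained generation of fresh result tokens" noted above.
lazy = (x * x for x in range(10**6))       # nothing computed yet
print(next(lazy), next(lazy))              # two demands -> two computations
```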

Relevance: 80.00%

Abstract:

Adobe's Acrobat software, released in June 1993, is based around a new Portable Document Format (PDF) which offers the possibility of viewing and exchanging electronic documents, independent of the originating software, across a wide variety of supported hardware platforms (PC, Macintosh, Sun UNIX, etc.). The fact that Acrobat's imageable objects are rendered with full use of Level 2 PostScript means that the most demanding requirements can be met in terms of high-quality typography and device-independent colour. These qualities will be very desirable components in future multimedia and hypermedia systems. The current capabilities of Acrobat and PDF are described, in particular the presence of hypertext links, bookmarks, and yellow sticker annotations in release 1.0, together with article threads and multimedia plugins in version 2.0. This article also describes the CAJUN project (CD-ROM Acrobat Journals Using Networks), which has been investigating the automated placement of PDF hypertextual features from various front-end text-processing systems. CAJUN has also been experimenting with the dissemination of PDF over e-mail, via the World Wide Web and on CD-ROM.
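
Acrobat 1.0's features were authored with Adobe's own tools; purely as a present-day illustration of the kind of automated placement of navigation features that CAJUN investigated, here is a sketch using the modern pypdf library, which is an assumed stand-in rather than the project's toolchain:

```python
from pypdf import PdfWriter

writer = PdfWriter()
writer.add_blank_page(width=612, height=792)   # stand-in for a typeset page
writer.add_blank_page(width=612, height=792)

# Programmatically placed navigation features, in the spirit of CAJUN's
# automated insertion of hypertextual features into PDF output.
writer.add_outline_item("Article start", page_number=0)   # a bookmark
writer.add_outline_item("Section 2", page_number=1)

with open("demo.pdf", "wb") as fh:
    writer.write(fh)
```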

Relevance: 80.00%

Abstract:

For various reasons, many Algol 68 compilers do not directly implement the parallel processing operations defined in the Revised Algol 68 Report. It is still possible, however, to perform parallel processing, multitasking and simulation, provided that the implementation permits the creation of a master routine for the coordination and initiation of processes under its control. The package described here is intended for real-time applications and runs in conjunction with the Algol 68R system; it extends and develops the original Algol 68RT package, which was designed for use with multiplexers at the Royal Radar Establishment, Malvern. The facilities provided include, in addition to the synchronising operations, an interface to an ICL Communications Processor enabling the abstract processes to be realised as the interaction of several teletypes or visual display units with a real-time program providing a useful service.
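
The Algol 68RT facilities themselves are not shown here; as a loose modern analogue of a master routine that initiates processes and services their synchronised requests (the terminal sessions below are simulated), in Python:

```python
import threading, queue

requests = queue.Queue()

def terminal(name, n):
    """A simulated teletype session posting work to the master."""
    for i in range(n):
        done = threading.Event()                 # synchronising operation
        requests.put((name, i, done))
        done.wait()                              # block until serviced

def master():
    """Master routine: coordinates and initiates servicing of processes."""
    for _ in range(6):                           # 2 terminals x 3 requests
        name, i, done = requests.get()
        print(f"servicing request {i} from {name}")
        done.set()

threads = [threading.Thread(target=terminal, args=(t, 3))
           for t in ("tty1", "tty2")]
coordinator = threading.Thread(target=master)
for th in threads + [coordinator]:
    th.start()
for th in threads + [coordinator]:
    th.join()
```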

Relevance: 80.00%

Abstract:

Starting in December 1982, the University of Nottingham decided to phototypeset almost all of its examination papers `in house' using the troff, tbl and eqn programs running under UNIX. This tutorial lecture highlights the features of the three programs, with particular reference to their strengths and weaknesses in a production environment. The following issues are particularly addressed:

- Standards: all three software packages require the embedding of commands and the invocation of pre-written macros, rather than `what you see is what you get'. This can help to enforce standards in the absence of traditional compositor skills.
- Hardware and software: the requirements are analysed for an inexpensive preview facility and a low-level interface to the phototypesetter.
- Mathematical and technical papers: the fine-tuning of eqn to impose a standard house style.
- Staff skills and training: systems of this kind do not require the operators to have had previous experience of phototypesetting; of much greater importance is willingness and flexibility in learning how to use computer systems.

Relevance: 80.00%

Abstract:

The presence of non-linear loads at a point in the distribution system may deform the voltage waveform owing to the consumption of non-sinusoidal currents. The use of active power filters allows a significant reduction of the harmonic content in the supply current. However, the digital control structures for these filters may require high-performance hardware, particularly for the calculation of reference currents. This work describes the development of hardware structures with high processing capability for application in active power filters. To this end, it considers an architecture that allows parallel processing using programmable logic devices. The developed structure is a hybrid combining a DSP and an FPGA: the DSP is used for the acquisition of current and voltage signals, the calculation of the fundamental-current controllers and PWM generation, while the FPGA is used for intensive signal processing, such as the harmonic compensators. The experimental analysis shows significant reductions in processing time when compared to traditional approaches using only a DSP. The experimental results validate the designed structure and are compared with those of architectures reported in the literature.
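
The paper's controller designs are not given in the abstract. As a minimal numerical sketch of the FPGA's task, isolating the harmonic content the filter must inject, one illustrative approach (an assumption, not the paper's method) removes the fundamental from the measured load current via an FFT mask:

```python
import numpy as np

fs, f0 = 10000, 50                       # sample rate and grid frequency (Hz)
t = np.arange(0, 0.1, 1 / fs)
# Non-linear load current: fundamental plus 5th and 7th harmonics.
i_load = (10 * np.sin(2 * np.pi * f0 * t)
          + 2 * np.sin(2 * np.pi * 5 * f0 * t)
          + 1 * np.sin(2 * np.pi * 7 * f0 * t))

# Isolate the fundamental with an FFT mask, then subtract it: what remains
# is the harmonic content the active filter should inject in opposition.
spec = np.fft.rfft(i_load)
freqs = np.fft.rfftfreq(len(i_load), 1 / fs)
fund = spec * (np.abs(freqs - f0) < 5)   # keep only the 50 Hz bin
i_ref = i_load - np.fft.irfft(fund, n=len(i_load))

print("reference current RMS:", round(np.sqrt(np.mean(i_ref**2)), 3))
```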

Relevance: 80.00%

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.

Relevance: 80.00%

Abstract:

This project examines the available work on the explicit and implicit parallelization of the R scripting language and reports experimental findings for a model that predicts, from input data size and function complexity, when automatic parallelization becomes effective. After finding or creating a series of custom benchmarks, an interval based on data size and time complexity in which replacement becomes viable was found, specifically between O(N) and O(N³) exclusive. As data size increases, the benefits of parallel processing become more apparent, and a point is reached where those benefits outweigh the cost in memory-transfer time. Based on our observations, this point can be predicted with a fair amount of accuracy by regression on a sample of approximately ten data sizes spread evenly between a system-determined minimum and maximum size.
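
As an illustration of the prediction step with hypothetical timing data (not the project's benchmarks): sample serial and parallel runtimes at about ten sizes, regress the gap on size, and solve for the break-even point.

```python
import numpy as np

# Hypothetical timing model: serial cost grows with the work, parallel pays
# a fixed memory-transfer overhead plus a quarter of the per-element cost.
sizes = np.linspace(1e4, 1e6, 10)            # ~10 sizes, evenly spread
t_serial = 2e-6 * sizes
t_parallel = 0.4 + 0.5e-6 * sizes

# Regress the gap (serial - parallel) on size; its root is the predicted
# point past which automatic parallelization pays off.
slope, intercept = np.polyfit(sizes, t_serial - t_parallel, 1)
crossover = -intercept / slope
print(f"predicted break-even size: {crossover:,.0f} elements")
```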

Relevance: 80.00%

Abstract:

The development cost of any civil infrastructure is very high, and over its life span a civil structure is subjected to many physical loads and environmental effects which damage it. Failing to identify this damage at an early stage may result in severe property loss and may become a potential threat to people and the environment. There is therefore a need for effective damage detection techniques to ensure the safety and integrity of structures. One Structural Health Monitoring approach evaluates a structure using statistical analysis. In this study, a civil structure 8 feet in length and 3 feet in diameter, embedded with thermocouple sensors at four different levels, was analyzed under controlled and variable conditions. With the help of statistical analysis, possible damage to the structure was assessed, and the analysis was able to detect structural defects at the various levels of the structure.
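
The abstract does not name the statistical method used. One simple possibility consistent with its description, shown here with synthetic readings, is a z-score comparison of each sensor level's new measurements against a controlled-condition baseline:

```python
import numpy as np

rng = np.random.default_rng(1)
# Baseline: thermocouple readings per level under controlled conditions.
baseline = {lvl: rng.normal(20.0, 0.5, 200) for lvl in range(1, 5)}
# New readings; level 3 is given a shifted mean to mimic damage-induced change.
new = {lvl: rng.normal(20.0 + (1.8 if lvl == 3 else 0.0), 0.5, 50)
       for lvl in range(1, 5)}

for lvl in range(1, 5):
    mu, sd = baseline[lvl].mean(), baseline[lvl].std()
    z = (new[lvl].mean() - mu) / (sd / np.sqrt(len(new[lvl])))
    flag = "possible damage" if abs(z) > 3 else "ok"
    print(f"level {lvl}: z = {z:+.1f} -> {flag}")
```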

Relevance: 80.00%

Abstract:

Ensemble stream modeling and data-cleaning are sensor-information processing systems with different training and testing methods by which their goals are cross-validated. This research examines a mechanism which seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and to choose the most likely model without overfitting, thus obtaining higher model confidence. Higher-quality streams can be realized by combining many short streams into an ensemble of the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as a bush or natural forest fire, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since non-descriptive features do not yield good results, we resort to temporal features. By doing so we carefully eliminate averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction, cross-validating attributes against single or multiple target variables to minimize training error. We use the F-measure, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features, and a sensitive variance measure such as the F-test is performed at each node's split to select the best attribute. The ensemble stream modeling approach proved to improve when complicated features were combined with a simpler tree classifier. The ensemble framework for data-cleaning, together with enhancements to quantify the quality of fit of the sensors (30% spatial, 10% temporal, and 90% mobility reduction), led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast volume of real-time mobile streams generated today.
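
To make the evaluation metric concrete (with made-up alarm counts, not the study's data): the F-measure is the harmonic mean of precision and recall over detected fire events.

```python
def f_measure(tp, fp, fn, beta=1.0):
    """Harmonic mean of precision and recall; beta weights recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta**2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical fire-event alarm counts: 40 true alarms, 10 false alarms,
# 5 missed events.
print(round(f_measure(tp=40, fp=10, fn=5), 3))   # ~0.842
```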

Relevance: 80.00%

Abstract:

Kenya lies in the equatorial tropics of East Africa and is known as a worldwide hot spot for aflatoxin contamination, particularly in maize. These toxic and carcinogenic compounds are metabolic products of fungi and thus depend in particular on water activity, which influences both the drying and the storability of foodstuffs and is therefore an important factor in developing energy-efficient, quality-oriented processing. The present work set out to investigate the change in water activity during the convective drying of maize. Using optimization software (MS Excel Solver), the gravimetric moisture loss of maize cobs at 37°C, 43°C and 53°C was predicted from sensor-recorded thermo-hygrometric data; this range represents the transition between low- and high-temperature drying. The results show clear differences in the behaviour of the kernels and the cob. Drying in the range of 35°C to 45°C, combined with high air velocities (> 1.5 m/s), favoured drying of the kernels over the cob and can therefore be recommended for energy-efficient drying of cobs with high initial moisture content. Further investigations addressed the behaviour of different bulk configurations in the batch drying commonly used for maize. Dehusked and threshed maize led to increased airflow resistance in the bulk, and to both higher energy demand and more uneven drying, which could only be remedied with additional technical effort such as mixing devices or airflow reversal. Owing to the lower effort required for aeration and control, drying whole cobs in undisturbed bulks can therefore be recommended in particular for small farms in Kenya. The work also investigated dehumidification by means of a desiccant (silica gel) combined with a heat source and an enclosed air volume, and compared it with conventional drying. The results showed comparable dehumidification rates during the first 5 hours of drying. The air state when using silica gel was influenced in particular by the enclosed air volume and the temperature. Granulated desiccants are advantageous for maize drying from a hygienic point of view and can be regenerated, for example, with simple ovens, so that the quality impairments associated with high-temperature or open-air drying can be avoided. High-quality maize drying technology is very capital-intensive; however, this work shows that simple improvements such as sensor-assisted aeration of batch dryers, the use of desiccants and an adapted bulk depth can be practicable solutions for smallholders in Kenya. Further research is needed here, possibly also on the use of renewable energy.
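
The thesis performed this fit with MS Excel Solver; an equivalent least-squares fit in Python is sketched below, using the widely known Page thin-layer model MR = exp(-k*t^n) as an assumed drying equation and synthetic data in place of the sensor-recorded measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def page(t, k, n):
    """Page thin-layer drying model: moisture ratio as a function of time."""
    return np.exp(-k * t**n)

# Synthetic drying run (hours vs. moisture ratio), standing in for the
# thesis's sensor-recorded thermo-hygrometric data.
t = np.array([0.5, 1, 2, 4, 6, 8, 12, 16, 24], float)
mr = np.array([0.93, 0.86, 0.74, 0.55, 0.42, 0.33, 0.20, 0.13, 0.05])

(k, n), _ = curve_fit(page, t, mr, p0=(0.1, 1.0))
print(f"k = {k:.3f}, n = {n:.3f}")
print("predicted MR at 10 h:", round(page(10.0, k, n), 3))
```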