854 results for General-purpose computing
Abstract:
This paper describes the design, implementation and testing of a high-speed controlled stereo “head/eye” platform which facilitates the rapid redirection of gaze in response to visual input. It details the mechanical device, which is based around geared DC motors, and describes hardware aspects of the controller and vision system, which are implemented on a reconfigurable network of general-purpose parallel processors. The servo-controller is described in detail and higher-level gaze and vision constructs are outlined. The paper gives performance figures gained both from mechanical tests on the platform alone and from closed-loop tests on the entire system using visual feedback from a feature detector.
Abstract:
During April and May 2010 the ash cloud from the eruption of the Icelandic volcano Eyjafjallajökull caused widespread disruption to aviation over northern Europe. The location and impact of the eruption meant that a wealth of observations of the ash cloud was obtained, which can be used to assess modelling of the long-range transport of ash in the troposphere. The UK FAAM (Facility for Airborne Atmospheric Measurements) BAe-146-301 research aircraft overflew the ash cloud on a number of days during May. The aircraft carries a downward-looking lidar which detected the ash layer through the backscatter of the laser light. In this study ash concentrations derived from the lidar are compared with simulations of the ash cloud made with NAME (Numerical Atmospheric-dispersion Modelling Environment), a general-purpose atmospheric transport and dispersion model. The simulated ash clouds are compared to the lidar data to determine how well NAME simulates the horizontal and vertical structure of the ash clouds. Comparison between the ash concentrations derived from the lidar and those from NAME is used to estimate the fraction of ash emitted in the eruption that is transported over long distances, relative to the total emission of tephra. In making these comparisons, possible position errors in the simulated ash clouds are identified and accounted for. The ash layers seen by the lidar considered in this study were thin, with typical depths of 550–750 m. The vertical structure of the ash cloud simulated by NAME was generally consistent with the observed ash layers, although the layers in the simulated ash clouds that are identified with observed ash layers are about twice the depth of the observed layers. The structure of the simulated ash clouds was sensitive to the profile of ash emissions that was assumed.
In terms of horizontal and vertical structure the best results were obtained by assuming that the emission occurred at the top of the eruption plume, consistent with the observed structure of eruption plumes. However, early in the period, when the intensity of the eruption was low, assuming that the emission of ash was uniform with height gives better guidance on the horizontal and vertical structure of the ash cloud. Comparison of the lidar concentrations with those from NAME shows that 2–5% of the total mass erupted by the volcano remained in the ash cloud over the United Kingdom.
Abstract:
IEEE 754 floating-point arithmetic is widely used in modern, general-purpose computers. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. Modifying the IEEE arithmetic so that it uses transreal arithmetic has a number of advantages. It removes one redundant binade from IEEE floating-point objects, doubling the numerical precision of the arithmetic. It removes eight redundant, relational, floating-point operations and removes the redundant total-order operation. It replaces the non-reflexive, floating-point equality operator with a reflexive equality operator, and it indicates that some of the exceptions may be removed as redundant, subject to issues of backward compatibility and transient future compatibility as programmers migrate to the transreal paradigm.
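The non-reflexive equality and signed zero that the abstract identifies as redundancies can be observed directly in any IEEE 754 implementation; a minimal Python sketch:

```python
import math

# IEEE 754 equality is not reflexive: a NaN compares unequal to itself.
nan = float("nan")
nan_reflexive = (nan == nan)              # False

# IEEE 754 keeps a signed zero: -0.0 equals +0.0 but carries a sign bit.
neg_zero = -0.0
zero_sign = math.copysign(1.0, neg_zero)  # -1.0, revealing the hidden sign

# Transreal arithmetic keeps both infinities; the negative zero and the
# many NaN states are among the redundancies it removes.
inf = float("inf")
```

A reflexive transreal equality would instead return true when nullity is compared with itself, which is the behavioural change the paper advocates.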
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, with interpolation between results as necessary.
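As an illustration of this methodology, a toy version of such a benchmark-driven model might interpolate measured runtimes between benchmarked problem sizes and sum the two kinds of work; the table names and numbers below are hypothetical, not taken from the paper:

```python
from bisect import bisect_left

# Hypothetical benchmark results: runtime in seconds of the loop-based array
# updates and of the halo exchange, measured for a few grid sizes under one
# fixed deployment scenario (illustrative values only).
compute_bench = {1000: 0.8, 2000: 3.1, 4000: 12.5}   # grid edge -> seconds
halo_bench    = {1000: 0.05, 2000: 0.09, 4000: 0.18}

def interpolate(bench, size):
    """Linearly interpolate a benchmark table for an unmeasured problem size."""
    sizes = sorted(bench)
    if size <= sizes[0]:
        return bench[sizes[0]]
    if size >= sizes[-1]:
        return bench[sizes[-1]]
    i = bisect_left(sizes, size)
    lo, hi = sizes[i - 1], sizes[i]
    frac = (size - lo) / (hi - lo)
    return bench[lo] + frac * (bench[hi] - bench[lo])

def predicted_runtime(size):
    # Model the two distinct types of work separately and sum the parts,
    # mirroring the paper's separate compute and halo-exchange models.
    return interpolate(compute_bench, size) + interpolate(halo_bench, size)
```

In the real model each deployment scenario (decomposition, core mapping, node population) would have its own benchmark table, and predictions would select among them.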
Abstract:
Identifying the correct sense of a word in context is crucial for many tasks in natural language processing (machine translation is an example). State-of-the-art methods for Word Sense Disambiguation (WSD) build models using hand-crafted features that usually capture shallow linguistic information. Complex background knowledge, such as semantic relationships, is typically either not used or used in a specialised manner, owing to the limitations of the feature-based modelling techniques employed. On the other hand, empirical results from the use of Inductive Logic Programming (ILP) systems have repeatedly shown that they can use diverse sources of background knowledge when constructing models. In this paper, we investigate whether this ability of ILP systems can be used to improve the predictive accuracy of models for WSD. Specifically, we examine the use of a general-purpose ILP system as a method to construct a set of features using semantic, syntactic and lexical information. This feature set is then used by a common modelling technique in the field (a support vector machine) to construct a classifier for predicting the sense of a word. In our investigation we examine one-shot and incremental approaches to feature-set construction applied to monolingual and bilingual WSD tasks. The monolingual tasks use 32 verbs and 85 verbs and nouns (in English) from the SENSEVAL-3 and SemEval-2007 benchmarks, while the bilingual WSD task consists of 7 highly ambiguous verbs in translating from English to Portuguese. The results are encouraging: the ILP-assisted models show substantial improvements over those that simply use shallow features. In addition, incremental feature-set construction appears to identify smaller and better sets of features. Taken together, the results suggest that the use of ILP with diverse sources of background knowledge provides a way of making substantial progress in the field of WSD.
Abstract:
The triple- and quadruple-escape peaks of 6.128 MeV photons from the ¹⁹F(p,αγ)¹⁶O nuclear reaction were observed in an HPGe detector. The experimental peak areas, measured in spectra projected with a restriction function that allows quantitative comparison of data from different multiplicities, are in reasonably good agreement with those predicted by Monte Carlo simulations done with the general-purpose radiation-transport code PENELOPE. The behaviour of the escape intensities was simulated for several gamma-ray energies and detector dimensions; the results obtained can be extended to other energies using an empirical function and statistical properties related to the phenomenon.
Abstract:
A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system. It ensures that the algorithm yields a secure codification even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, the other parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which is not assured by traditional algorithms) and consequently its authenticity. Numerical experiments are presented and discussed, and show the behaviour of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, a cipher in widespread commercial use today, but it is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up, which enables readers to test the algorithm and also to try to break the cipher.
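The security argument of such ciphers rests on the Lorenz system's sensitive dependence on initial conditions: a tiny change to the key state yields a completely different trajectory. A short sketch (forward-Euler integration with the classical parameters, not the paper's cipher):

```python
def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system with classical parameters."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def trajectory(x0, steps=5000):
    """Integrate from (x0, 1, 1); x0 plays the role of a key-derived state."""
    x, y, z = x0, 1.0, 1.0
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

a = trajectory(1.0)
b = trajectory(1.0 + 1e-8)        # a change of one part in 10^8 to the "key"
separation = abs(a[0] - b[0])     # grows by many orders of magnitude
```

The trajectories stay bounded on the attractor, yet the initial perturbation of 1e-8 is amplified enormously, which is the property that makes the keystream unpredictable without the exact key.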
Abstract:
This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where to next focus our theoretical and computational efforts.
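The inner/outer structure the abstract describes can be seen on a toy problem. The sketch below uses plain gradient descent as the inner solver (not the paper's conjugate gradient method) and a first-order multiplier update; the problem and constants are illustrative:

```python
# Toy problem:  minimize f(x) = x^2  subject to  c(x) = x - 1 = 0.
# The KKT solution is x* = 1 with multiplier lambda* = -2.

def solve(mu=10.0, outer_iters=10, inner_iters=300, eta=0.05):
    x, lam = 0.0, 0.0
    for _ in range(outer_iters):
        # Inner loop: approximately minimize the augmented Lagrangian
        #   L(x) = f(x) + lam * c(x) + (mu / 2) * c(x)^2
        for _ in range(inner_iters):
            grad = 2 * x + lam + mu * (x - 1.0)
            x -= eta * grad
        # Outer loop: first-order multiplier update lam <- lam + mu * c(x).
        lam += mu * (x - 1.0)
    return x, lam

x, lam = solve()
```

The real optimizer replaces the exact inner minimization with a conjugate gradient solve terminated by a relative error criterion, which is what makes the outer-loop convergence theory non-trivial.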
Abstract:
Several agent platforms that implement the belief-desire-intention (BDI) architecture have been proposed. Even though most of them are implemented on top of existing general-purpose programming languages, e.g. Java, agents are programmed either in a new programming language or in a domain-specific language expressed in XML. As a consequence, this prevents the use of advanced features of the underlying programming language and the integration with existing libraries and frameworks, which are essential for the development of enterprise applications. To address these limitations of BDI agent platforms, we have implemented BDI4JADE, which is presented in this paper. It is implemented as a BDI layer on top of JADE, a well-accepted agent platform.
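For readers unfamiliar with the architecture, a deliberately tiny BDI reasoning cycle can be sketched in a general-purpose language; everything below (agent, beliefs, plan names) is a hypothetical illustration, not BDI4JADE's API, which is a Java layer on top of JADE:

```python
class Agent:
    """A toy BDI cycle: beliefs trigger goal adoption; intentions hold plans."""

    def __init__(self):
        self.beliefs = {"battery_low": True, "at_dock": False}
        self.intentions = []          # list of (goal, remaining plan steps)

    def options(self):
        # Goal adoption conditions over current beliefs (hypothetical rule).
        if self.beliefs["battery_low"] and not self.beliefs["at_dock"]:
            yield ("recharge", [self.go_to_dock, self.charge])

    def go_to_dock(self):
        self.beliefs["at_dock"] = True

    def charge(self):
        self.beliefs["battery_low"] = False

    def step(self):
        # Deliberate: adopt new intentions from the current options.
        for goal, plan in self.options():
            self.intentions.append((goal, list(plan)))
        # Means-ends reasoning: execute one step of the first intention.
        if self.intentions:
            goal, plan = self.intentions[0]
            plan.pop(0)()
            if not plan:
                self.intentions.pop(0)

agent = Agent()
agent.step()   # adopts "recharge" and moves to the dock
agent.step()   # charges; the intention is discharged
```

The point the abstract makes is that expressing this cycle in the host language itself (as BDI4JADE does in Java) keeps plans as ordinary methods, so existing libraries and frameworks remain directly usable.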
Abstract:
The nurses' constant search for quality nursing care makes Nursing Assistance Systematization a current topic of discussion throughout the country, not only as a means of complying with the legal requirements of nursing practice, but especially because of the expected benefits of its application. In this context, this research took a qualitative approach, developed as action research. Its general purpose was to analyse changes in nursing practices in a pediatric teaching hospital, based on the construction and implementation of Nursing Assistance Systematization by the nursing team. The results were subjected to thematic analysis in the manner of Paulo Freire and are presented in the form of reports. The work began with preliminary steps: a review of the institution's charts and an approach to the managers. A situational diagnosis of the non-systematized nursing practices followed, through a questionnaire applied to the nursing team and a focus group with the nurses. These steps supported the implementation stage of Nursing Assistance Systematization, which comprised associated actions such as a focus group with the nurses on the nursing history, training of the nursing team in Nursing Assistance Systematization, the development, application and revision of printed forms, and discussions in small groups. The changes following these actions were evaluated through individual interviews with the nurses. The chart review confirmed deficits in the records made by nurses, which reinforced the need to implement Nursing Assistance Systematization, an argument used in the meeting with the managers, who promptly agreed to the study.
The questionnaire and the focus group with the nurses revealed a nursing work process without systematization, with gaps in practice, but also expectations of improvements in the quality of care through Nursing Assistance Systematization, furnishing data for the subsequent stages. The printed forms were gradually used and modified as the team came to understand Nursing Assistance Systematization and its purposes through the training course. The final evaluation pointed to partial implementation of the stages of Nursing Assistance Systematization: the nursing history and the nursing progress records had been institutionalized, while difficulties remained with nursing diagnosis and prescription, which represent a greater paradigm shift. The study helped change the nursing team's view of Nursing Assistance Systematization at the institution, as revealed by the introduction of new practices into the nursing work process, such as physical examination of the patient, the admission interview, and daily monitoring through the nursing progress records. The purposes of the study were therefore achieved, since the changes in nursing practices brought about by systematization were analysed. The action research accomplished its proposal of involving the nursing team in changing its practices, contributed to the implementation of Nursing Assistance Systematization in a pediatric teaching hospital, showed that it is possible to solve problems when the objective belongs to the group, and opened the way for further research on this theme.
Abstract:
A qualitative study on the meaning of a child's birth to the father. Its general purpose was to understand the significance a man attaches to the birth of his child; its specific objectives were to identify the man's feelings regarding his child's birth and to describe his attitude towards it. The study was grounded in the theoretical literature on men in the pregnancy-puerperal cycle and on the humanization of care. Data were obtained through semi-structured interviews with men who had accompanied the birth of their children and whose wives were in the immediate puerperium. This stage took place in two maternity hospitals in Natal-RN, both of which adopt the principle of safe motherhood in the care of women in labour. The material drawn from the statements was treated using the content analysis method, in the thematic-analysis mode according to Bardin. Three thematic categories emerged from this process: the father's attitude towards his child's birth, the father's feelings about his child's birth, and the information received by the father during his child's birth. The content of the speeches was analysed according to the principles of symbolic interactionism as described by Blumer. The results showed that the husbands interact with their wives and respond with attitudes of care, help, support and encouragement, within the principles of humanization, intermingled with feelings of happiness, restlessness and suffering that lead them to appreciate and exalt their partners. We also verified that the father's attitudes and feelings in the delivery room, in the light of symbolic interactionism, tend to be influenced by the interaction between him and the attending professionals.
Abstract:
Fuzzy intelligent systems are present in a variety of equipment, ranging from household appliances to small devices such as digital cameras and cell phones, and are used primarily for dealing with uncertainty in the modelling of real systems. However, commercial implementations of fuzzy systems are not general purpose and are not portable across hardware platforms. With these issues in mind, this work presents an open-source development environment consisting of a desktop system capable of graphically generating a general-purpose fuzzy controller and exporting its parameters to an embedded system with a fuzzy controller written for the Java Platform, Micro Edition (J2ME), whose modular design makes it portable to any mobile device that supports J2ME. The proposed development platform can generate all the parameters of a fuzzy controller and export them to an XML file; the code responsible for the control logic, embedded in the mobile device, reads this file and starts the controller. All the parameters of the fuzzy controller are configurable through the desktop system, from the membership functions and rule base to the universe of discourse of the linguistic terms of the output variables. The system generates fuzzy controllers based on the Takagi-Sugeno interpolation model. To validate and test the proposed solution, the fuzzy controller was embedded on a Sun SPOT® mobile device and used to control a Quanser® level plant; to compare the generated fuzzy controller with other types of controllers, a PID controller was also implemented and embedded on the Sun SPOT to control the same level plant.
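The Takagi-Sugeno interpolation the environment generates can be illustrated with a minimal zero-order sketch; the rule base, membership functions and values below are hypothetical, not the controller produced by the tool:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Rule base for a toy level controller: each rule pairs a membership function
# over the level error with a crisp (zero-order Takagi-Sugeno) consequent.
rules = [
    (lambda e: tri(e, -2.0, -1.0, 0.0), -5.0),   # error negative -> close valve
    (lambda e: tri(e, -1.0,  0.0, 1.0),  0.0),   # error near zero -> hold
    (lambda e: tri(e,  0.0,  1.0, 2.0),  5.0),   # error positive -> open valve
]

def ts_output(error):
    """Takagi-Sugeno inference: firing-strength-weighted average of consequents."""
    weights = [mu(error) for mu, _ in rules]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * c for w, (_, c) in zip(weights, rules)) / total
```

In the described system these same ingredients (membership functions, rule base, universes of discourse) are edited graphically and serialized to XML, which the embedded J2ME controller parses at start-up.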
Abstract:
Official documents point to a curriculum organization that promotes dialogue among different areas of knowledge. Among the proposed strategies are "school projects". This research arose from a need evidenced in the researcher's practice in recent years as a pedagogical advisor in high school. Observations made in daily work on the kinds of projects developed in the school, and on how they were developed, generated concerns. They aroused interest in furthering the discussion, aiming to reflect with teachers on the implementation of a pedagogical action concerning the use of educational projects in the classroom as a didactic strategy that promotes student learning. To this end, studies and discussions were developed through the application of questionnaires and the holding of a workshop with teachers in the area of Natural Sciences and Mathematics in private high schools in Natal, surveying their opinions on the preparation and development of school projects. The general purpose is to contribute elements for teachers' reflection on the use of this teaching strategy. To that end, we propose: surveying teachers' ideas and opinions on the planning, development and evaluation of projects, both disciplinary and interdisciplinary, identifying these teachers' main difficulties in working with projects at school; and reviewing projects developed at the school after a meeting with the teachers, incorporating the aspects identified as weak points. In the research methodology, questionnaires with open and closed questions were used to survey the teachers' preliminary ideas, in order to inform the planning of a meeting later held at the school itself on the subject in question. Ten teachers took part in the first stage and 17 in the second (the meeting).
In the third stage, individual interviews were carried out, together with an analysis of projects already developed. The main difficulty observed for the development of projects in school was the time factor in team planning, followed by the teachers' excessive working hours, since they generally also work in other schools. Some teachers say they do not develop projects because they do not know how to develop school projects, whether disciplinary or interdisciplinary.
Abstract:
Non-audio signals have been recorded in the flash ROM memory of a portable MP3 player, in WAV format file, to examine the possibility of using these cheap and small instruments as general-purpose portable data loggers. A 1200-Hz FM carrier modulated by the non-audio signal has replaced the microphone signal, while using the REC operating mode of the MP3 player, which triggers the voice recording function. The signal recovery was carried out by a PLL-based FM demodulator whose input is the FM signal captured in the coil leads of the MP3 player's earphone. Sinusoidal and electrocardiogram signals have been used in the system evaluation. Although the quality of low frequency signals needs improvement, overall the results indicate the viability of the proposal. Suggestions are made for improvements and extensions of the work.
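The modulation scheme the abstract describes can be sketched numerically: a slow signal frequency-modulates the 1200 Hz carrier, and the signal is recovered from the carrier's instantaneous frequency. The sketch below uses zero-crossing timing for demodulation rather than the PLL of the actual system, and all parameters are hypothetical:

```python
import math

fs = 48000          # sample rate (Hz)
fc = 1200.0         # carrier frequency, as in the described system (Hz)
dev = 400.0         # assumed frequency deviation per unit amplitude (Hz)

# Message: a slow 2 Hz sine standing in for a non-audio signal such as an ECG.
n = fs              # one second of samples
message = [math.sin(2 * math.pi * 2.0 * i / fs) for i in range(n)]

# FM modulation by phase accumulation: instantaneous frequency fc + dev * m.
signal, phase = [], 0.0
for m in message:
    phase += 2 * math.pi * (fc + dev * m) / fs
    signal.append(math.sin(phase))

# Demodulation: instantaneous frequency from positive-going zero crossings,
# with linear interpolation of the exact crossing time.
crossings = []
for i in range(1, n):
    if signal[i - 1] < 0.0 <= signal[i]:
        frac = -signal[i - 1] / (signal[i] - signal[i - 1])
        crossings.append((i - 1 + frac) / fs)
freqs = [1.0 / (b - a) for a, b in zip(crossings, crossings[1:])]

# The recovered frequency sweeps between roughly fc - dev and fc + dev,
# tracking the message; rescaling by (f - fc) / dev yields the signal.
```

This also illustrates the low-frequency limitation the abstract notes: the demodulator produces one estimate per carrier cycle, so recovery quality degrades as the message bandwidth approaches the carrier frequency.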
Abstract:
This work presents the development of an IEEE 1451.2 protocol controller based on a low-cost FPGA that is directly connected to the parallel port of a conventional personal computer. In this manner it is possible to implement a Network Capable Application Processor (NCAP) based on a personal computer, without modifications to the parallel port. This approach supports the ten signal lines of the 10-wire IEEE 1451.2 Transducer Independent Interface (TII), which connects the network processor to the Smart Transducer Interface Module (STIM), also defined in the IEEE 1451.2 standard. The protocol controller is connected to the STIM through the TII's physical interface, enabling portability of the application at the transducer and network-processor level. The protocol controller architecture was fully developed in the VHDL language, and a prototype was implemented on a general-purpose programmable logic device. We have implemented two versions of the protocol controller, based on the IEEE 1451 standard, and we have obtained results from both simulation and experimental tests.