985 results for Robot soccer test bed
Abstract:
The use of wireless local area networks (WLANs), as well as of multimedia applications, has grown rapidly in recent years. Several factors affect the quality of service (QoS) perceived by the user, and interference is one of them. This work presents strategies for planning and performance evaluation based on an empirical study of the QoS parameters of a voice over Internet Protocol (VoIP) application in a network subject to interference, and on their relevance when designing wireless networks to determine the coverage area of an access point, taking into account parameters such as power, jitter, packet loss, delay, and PMOS. A further strategy is based on a hybrid approach that combines measurement and Bayesian inference applied to wireless networks, taking QoS parameters into consideration. The models adopt a cross-layer view of the network, correlating aspects of the physical environment and signal propagation (power or distance) with aspects of the VoIP application (e.g., jitter and packet loss). Case studies were carried out in two indoor and two outdoor environments, one of the latter displaying the main characteristics of the Amazon region (e.g., densely arboreous environments). This last test bed was deployed on a real system, since the Government of the State of Pará runs a digital inclusion program called NAVEGAPARÁ.
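The hybrid measurement-plus-Bayesian-inference strategy can be illustrated with a minimal sketch. This is not the authors' model: the Beta prior, the probe counts, and the 5% QoS bound below are all hypothetical.

```python
def posterior_loss_rate(lost, sent, alpha=1.0, beta=1.0):
    """Posterior mean of the packet-loss probability under a Beta(alpha, beta)
    prior with binomial packet-loss observations (conjugate update)."""
    return (alpha + lost) / (alpha + beta + sent)

# Hypothetical probe measurements at increasing distance from the access point:
probes = [(2, 1000), (14, 1000), (83, 1000)]          # (packets lost, packets sent)
rates = [posterior_loss_rate(lost, sent) for lost, sent in probes]

# A planning rule might accept positions whose posterior loss stays below 5%:
acceptable = [r <= 0.05 for r in rates]
```

As more probes are collected at a position, the posterior mean moves from the prior toward the measured loss ratio, which is the appeal of combining measurement with inference when samples are scarce.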
Abstract:
This paper presents a multi-agent system for the real-time operation of a simulated microgrid using the Smart-Grid Test Bed at Washington State University. The multi-agent system (MAS) was developed in JADE (Java Agent DEvelopment Framework), a Foundation for Intelligent Physical Agents (FIPA) compliant open-source multi-agent platform. The proposed operational strategy focuses on appropriate energy management and control strategies to improve the operation of an islanded microgrid formed by photovoltaic (PV) solar energy, batteries, and resistive and rotating-machine loads. The focus is on resource management and on shielding loads from abrupt variations or interruptions that change the operating conditions. The management and control of the PV system is performed in JADE, while the microgrid model is simulated in RSCAD/RTDS (Real-Time Digital Simulator). The outcome of the simulation studies demonstrates the feasibility of the proposed multi-agent approach for real-time operation of a microgrid.
Abstract:
The strain image contrast of some in vivo breast lesions changes with increasing applied load. This change is attributed to differences in the nonlinear elastic properties of the constituent tissues, suggesting some potential to help classify breast diseases by their nonlinear elastic properties. A phantom with inclusions and long-term stability is desired to serve as a test bed for developing and testing nonlinear elasticity imaging methods. This study reports a phantom designed to investigate nonlinear elastic properties with ultrasound elastographic techniques. The phantom contains four spherical inclusions and was manufactured from a mixture of gelatin, agar and oil. The phantom background and each of the inclusions have distinct Young's moduli and nonlinear mechanical behavior. The phantom was subjected to large deformations (up to 20%) while being scanned with ultrasound, and the changes in strain image contrast and contrast-to-noise ratio between inclusion and background were investigated as a function of applied deformation. The changes in contrast over a large deformation range predicted by finite element analysis (FEA) were consistent with those observed experimentally. The paper therefore reports a procedure for making phantoms with predictable nonlinear behavior, based on independent measurements of the constituent materials, and shows that the resulting strain images (e.g., strain contrast) agree with those predicted by nonlinear FEA.
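The strain contrast and contrast-to-noise ratio tracked in such a study have standard elastographic definitions, which a short sketch can make concrete. The strain values and variances below are hypothetical, not measurements from the phantom.

```python
import math

def strain_contrast_db(s_background, s_inclusion):
    """Strain contrast (dB) between mean background and inclusion strains."""
    return 20.0 * math.log10(s_background / s_inclusion)

def cnr(mean_bg, mean_inc, var_bg, var_inc):
    """Elastographic contrast-to-noise ratio between two image regions."""
    return 2.0 * (mean_bg - mean_inc) ** 2 / (var_bg + var_inc)

# Hypothetical region statistics: the soft background strains twice as much
# as a stiff inclusion under the same load.
contrast_db = strain_contrast_db(0.02, 0.01)      # about 6 dB
cnr_value = cnr(0.02, 0.01, 1e-6, 1e-6)           # about 100
```

Tracking how these two numbers evolve as the applied deformation grows is precisely what reveals the nonlinear behavior: for a linear material both would stay constant.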
Abstract:
An accurate estimation of the number of people entering or leaving a controlled area is an interesting capability for automatic surveillance systems. Potential applications of this technology include security, safety, energy saving, and fraud control. In this paper we present a novel configuration of a multi-sensor system combining both visual and range data, especially suited for troublesome scenarios such as public transportation. The approach applies probabilistic estimation filters on raw sensor data to create intermediate-level hypotheses that are later fused using a certainty-based integration stage. Promising results have been obtained in several tests performed on a realistic test-bed scenario under variable lighting conditions.
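The certainty-based integration stage can be illustrated with a minimal sketch. The sensors and certainty weights below are hypothetical; this is not the paper's fusion rule, only the general idea of weighting each sensor's hypothesis by its confidence.

```python
def fuse_counts(hypotheses):
    """Certainty-weighted average of per-sensor count hypotheses.
    hypotheses: list of (count, certainty) with certainty in (0, 1]."""
    total = sum(w for _, w in hypotheses)
    return sum(c * w for c, w in hypotheses) / total

# Hypothetical intermediate-level hypotheses from a camera and a range sensor:
fused = fuse_counts([(3, 0.9), (4, 0.3)])   # 3.25
people = round(fused)                       # 3
```

Under variable lighting the camera's certainty would drop, shifting the fused estimate toward the range sensor, which is the practical benefit of combining the two modalities.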
Abstract:
The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys over large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS – Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and the data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry. ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide the software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of standard single-dish output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed on the Medicina antenna while waiting for the SRT to be completed: an 18–26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver currently available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work.
Tests were carried out to verify the system stability and its capabilities, down to sensitivity levels that had never been reached at Medicina with the previous observing techniques and hardware. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to cover wide areas of the sky quickly (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project aims at a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, of which I am part, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data-reduction pipeline and assess the overall scientific capabilities of the system. The K-band observations, carried out in several sessions over the December 2008 – March 2010 period, were accompanied by a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived to check the new analogue backend separately from the multi-feed receiver, and simultaneously to produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar-cap survey intended to complete PMN-GB6 and provide all-sky coverage at 5 GHz).
Abstract:
The two Mars Exploration Rovers (MER), Spirit and Opportunity, landed on the Martian surface in January 2004 and have since collected a wealth of information about their landing sites. As part of their payload, the miniaturised Mössbauer spectrometer MIMOS II contributes to the success of the mission by identifying iron-bearing minerals and by determining the iron oxidation states in them. The basis of this work is the data set obtained at Opportunity's landing site at Meridiani Planum. A portion of this data set is evaluated with different methods, with the aim to thoroughly characterize the lithologic components at Meridiani Planum and possible relations between them.

MIMOS II is able to measure Mössbauer spectra at different energies simultaneously, bearing information from different sampling depths of the investigated target. The ability of depth-selective Mössbauer spectroscopy to characterize weathered surface layers is illustrated through its application to two suitable rock targets investigated on Mars. In both cases, an enhanced concentration of iron oxides at the rock surface was detected, pointing to a low degree of aqueous alteration.

The mineral hematite (α-Fe2O3) is present in the matrix of outcrop rocks and in spherules weathering out of the outcrop. Simultaneous fitting of Mössbauer spectra was applied to data sets obtained on both target types to characterize the hematite component in detail. This approach reveals that two hematite populations are present, both in the outcrop matrix and in the spherules. The hematite component with a comparably high degree of crystallinity and/or chemical purity is present in the outcrop matrix. The investigation of hematite at Meridiani Planum has shown that simultaneous fitting is a suitable and useful method to evaluate a large, correlated set of Mössbauer spectra.

Opportunity encountered loose, cm-sized rocks along its traverse.
Based on their composition and texture, these "cobbles" can be divided into three groups. Outcrop fragments are impact-derived ejecta from local outcrop rocks. Cobbles of meteoritic origin contain the minerals kamacite (Fe,Ni) and troilite (FeS) and exhibit high Ni contents. Melt-bearing impact breccias bear similarities to local outcrop rocks and basaltic soil, with a phase composition and texture consistent with a formation scenario involving partial melting and the inclusion of small, bright outcrop clasts.

Iron meteorites on the Martian surface, owing to their metallic nature, experience weathering in the presence of even trace amounts of water. Opportunity encountered and investigated four iron meteorites, which exhibit evidence of physical and chemical weathering. Discontinuous coatings contain iron oxides, pointing to the influence of limited amounts of water.

A terrestrial analogue site for Meridiani Planum is the Rio Tinto basin in south-west Spain. With its deposits of sulfate- and iron-oxide-bearing minerals, the region provides an adequate test bed for instrumentation for future Mars missions. In-situ investigations at Rio Tinto were carried out with a special focus on the combined use of Mössbauer spectroscopy with MIMOS II and Raman spectroscopy with a field-portable instrument. The results demonstrate that the two instruments provide complementary information about the investigated samples.
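The core idea of simultaneous fitting — several spectra constrained to share a parameter — can be sketched in a few lines. The example uses synthetic Lorentzian lines and a plain grid search over one shared parameter; it is not the actual MIMOS II analysis, and all numbers are invented.

```python
def lorentzian(x, center, width, depth):
    """Absorption-style line: baseline 1.0 with a dip of the given depth."""
    return 1.0 - depth / (1.0 + ((x - center) / width) ** 2)

# Two synthetic spectra that share the same (unknown) line position,
# standing in for a mineral component common to two correlated data sets:
xs = [i * 0.1 - 5.0 for i in range(101)]
spec_a = [lorentzian(x, 0.3, 0.5, 0.4) for x in xs]
spec_b = [lorentzian(x, 0.3, 0.8, 0.2) for x in xs]

def joint_sse(center):
    """Sum of squared residuals over BOTH spectra for a shared center."""
    sse = 0.0
    for x, ya, yb in zip(xs, spec_a, spec_b):
        sse += (ya - lorentzian(x, center, 0.5, 0.4)) ** 2
        sse += (yb - lorentzian(x, center, 0.8, 0.2)) ** 2
    return sse

# One-parameter grid search over the shared line position:
candidates = [c * 0.01 - 1.0 for c in range(201)]
best_center = min(candidates, key=joint_sse)
```

Constraining the shared parameter across both spectra is what stabilises the fit when each individual spectrum is noisy — the benefit the thesis exploits on a much larger correlated data set.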
Abstract:
In this report a new automated optical test for the next generation of photonic integrated circuits (PICs) is presented through the design and assessment of a test bed. After a brief analysis of the critical problems of current optical tests, the main test features are defined: automation and flexibility, a relaxed alignment procedure, speed-up of the entire test, and data reliability. After studying various solutions, the test-bed components are chosen to be a lens array, a photo-detector array, and a software controller. Each device is studied and calibrated, and the spatial resolution and robustness against interference at the photo-detector array are characterised. The software is programmed to manage both the PIC input and the photo-detector array output, as well as the data analysis. The test is validated by analysing a state-of-the-art 16-port PIC: the waveguide locations, current versus power, and the time-spatial power distribution are measured, as well as the optical continuity of an entire path through the PIC. Complexity, alignment tolerance and measurement time are also discussed.
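One way a relaxed alignment procedure can locate a waveguide from the photo-detector array readout is a power-weighted centroid, sketched below. The 250 µm pitch and the readout values are hypothetical, not the test-bed's actual parameters.

```python
def waveguide_position_um(powers, pitch_um=250.0):
    """Power-weighted centroid across the photo-detector array, in microns.
    pitch_um is a hypothetical detector pitch, not the test-bed's value."""
    total = sum(powers)
    return pitch_um * sum(i * p for i, p in enumerate(powers)) / total

# Hypothetical readout from a 16-element array with light centred on channel 5:
powers = [0.0] * 16
powers[4], powers[5], powers[6] = 0.2, 1.0, 0.2
position = waveguide_position_um(powers)    # channel 5 of the array
```

Because the centroid interpolates between detectors, the array does not need to be aligned to better than a fraction of its pitch — the motivation for pairing a lens array with a detector array rather than a single aligned fibre.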
Abstract:
Mr. Kubon's project was inspired by the growing need for an automatic syntactic analyser (parser) of Czech which could be used in the syntactic processing of large amounts of text. Mr. Kubon notes that such a tool would be very useful, especially in the field of corpus linguistics, where creating a large-scale "tree bank" (a collection of syntactic representations of natural language sentences) is a very important step towards the investigation of the properties of a given language. The work involved in syntactically parsing a whole corpus in order to get a representative set of syntactic structures would be almost inconceivable without the help of some kind of robust (semi)automatic parser. The need for the parser to be robust increases with the size of the linguistic data in the corpus or in any other kind of text which is going to be parsed. Practical experience shows that, apart from syntactically correct sentences, there are many sentences which contain a "real" grammatical error. These sentences may be corrected in small-scale texts, but not generally in a whole corpus. In order to complete the overall project, it was necessary to address a number of smaller problems. These were: 1. the adaptation of a suitable formalism able to describe the formal grammar of the system; 2. the definition of the structure of the system's dictionary containing all relevant lexico-syntactic information, and the development of a formal grammar able to robustly parse Czech sentences from the test suite; 3. filling the syntactic dictionary with sample data allowing the system to be tested and debugged during its development (about 1000 words); 4. the development of a set of sample sentences containing a reasonable amount of grammatical and ungrammatical phenomena, covering some of the most typical syntactic constructions used in Czech. Building the formal grammar (task 2) was the main task of the project.
The grammar is of course far from complete (Mr. Kubon notes that it is debatable whether any formal grammar describing a natural language can ever be complete), but it covers the most frequent syntactic phenomena, allowing for the representation of the syntactic structure of simple clauses and also of certain types of complex sentences. The stress was not so much on building a wide-coverage grammar as on the description and demonstration of a method. This method uses an approach similar to that of grammar-based grammar checking. The problem of reconstructing the "correct" form of the syntactic representation of a sentence is closely related to the problem of localisation and identification of syntactic errors: without precise knowledge of the nature and location of syntactic errors it is not possible to build a reliable estimate of the "correct" syntactic tree. The incremental way of building the grammar used in this project is also an important methodological issue. Experience from previous projects showed that building a grammar by creating one huge block of metarules is more complicated than the incremental method, which begins with metarules covering the most common syntactic phenomena and adds less important ones later; this is an advantage especially from the point of view of testing and debugging the grammar. The sample syntactic dictionary containing lexico-syntactic information (task 3) now has slightly more than 1000 lexical items representing all classes of words. During the creation of the dictionary it turned out that assigning complete and correct lexico-syntactic information to verbs is a very complicated and time-consuming process which would itself be worth a separate project. The final task undertaken in this project was the development of a method allowing effective testing and debugging of the grammar during its development.
The problem of the consistency of new and modified rules of the formal grammar with the rules already existing is one of the crucial problems of every project aiming at the development of a large-scale formal grammar of a natural language. This method allows for the detection of any discrepancy or inconsistency of the grammar with respect to a test-bed of sentences containing all syntactic phenomena covered by the grammar. This is not only the first robust parser of Czech, but also one of the first robust parsers of a Slavic language. Since Slavic languages display a wide range of common features, it is reasonable to claim that this system may serve as a pattern for similar systems in other languages. To transfer the system into any other language it is only necessary to revise the grammar and to change the data contained in the dictionary (but not necessarily the structure of primary lexico-syntactic information). The formalism and methods used in this project can be used in other Slavic languages without substantial changes.
Abstract:
Drug-induced respiratory depression is a common side effect of the agents used in anesthesia practice to provide analgesia and sedation. Depression of the ventilatory drive in the spontaneously breathing patient can lead to severe cardiorespiratory events and it is considered a primary cause of morbidity. Reliable predictions of respiratory inhibition in the clinical setting would therefore provide a valuable means to improve the safety of drug delivery. Although multiple studies investigated the regulation of breathing in man both in the presence and absence of ventilatory depressant drugs, a unified description of respiratory pharmacodynamics is not available. This study proposes a mathematical model of human metabolism and cardiorespiratory regulation integrating several isolated physiological and pharmacological aspects of acute drug-induced ventilatory depression into a single theoretical framework. The description of respiratory regulation has a parsimonious yet comprehensive structure with substantial predictive capability. Simulations relative to the synergistic interaction of the hypercarbic and hypoxic respiratory drive and the global effect of drugs on the control of breathing are in good agreement with published experimental data. Besides providing clinically relevant predictions of respiratory depression, the model can also serve as a test bed to investigate issues of drug tolerability and dose finding/control under non-steady-state conditions.
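A deliberately reduced sketch of the kind of prediction such a model enables: a single linear CO2 controller whose gain is scaled down by a drug-effect factor. All parameters are hypothetical and the structure is far simpler than the unified model the study proposes.

```python
def simulate_ventilation(drug_effect, minutes=60.0, dt=0.01):
    """Euler integration of a toy CO2 loop: ventilation VE is driven linearly
    by PaCO2 above a threshold, and the drug scales the controller gain down.
    All parameters are hypothetical, in arbitrary but consistent units."""
    paco2 = 40.0                        # arterial CO2 tension (mmHg)
    gain, threshold = 2.0, 35.0         # hypercarbic drive: VE = gain*(PaCO2-threshold)
    production, clearance = 12.0, 0.06  # metabolic CO2 production / washout factor
    t = 0.0
    ve = 0.0
    while t < minutes:
        ve = max(0.0, (1.0 - drug_effect) * gain * (paco2 - threshold))
        paco2 += dt * (production - clearance * ve * paco2)
        t += dt
    return ve, paco2

ve_baseline, paco2_baseline = simulate_ventilation(0.0)
ve_drug, paco2_drug = simulate_ventilation(0.5)
# Under the depressant, steady-state ventilation falls while PaCO2 rises.
```

Even this toy loop reproduces the qualitative clinical picture — a depressed controller settles at a lower ventilation and a higher CO2 tension — which is the kind of non-steady-state behaviour the full model is built to predict quantitatively.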
Abstract:
With energy demands and costs growing every day, the need for improving energy efficiency in electrical devices has become very important. Research into various methods of improving efficiency for all electrical components will be key to meeting future energy needs. This report documents the design, construction, and testing of a research-quality electric machine dynamometer and test bed. This test cell system can be used for research in several areas including: electric drive systems, electric vehicle propulsion systems, power electronic converters, and load/source elements in an AC microgrid, as well as many others. The test cell design criteria and decisions will be discussed with reference to user functionality and flexibility. The individual power components will be discussed in detail as to how they relate to the project, highlighting any features used in operation of the test cell. A project timeline will be discussed, clearly stating the work done by the different individuals involved in the project. In addition, the system will be parameterized and benchmark data will be used to demonstrate the functional operation of the system.
Abstract:
This dissertation serves as a call to geoscientists to share responsibility with K-12 educators for increasing Earth science literacy. When partnerships are created between K-12 educators and geoscientists, the resulting synergy can promote Earth science literacy in students, teachers, and the broader community. The research described here resulted in the development of tools that can support effective professional development for teachers. One tool is used during the planning stages to structure a professional development program, another set of tools supports measurement of the effectiveness of a development program, and a third tool supports the sustainability of professional development programs. The Michigan Teacher Excellence Program (MiTEP), a Math/Science Partnership project funded by the National Science Foundation, served as the test bed for developing and testing these tools. The first tool, the planning tool, is the Earth Science Literacy Principles (ESLP). The ESLP served as a planning tool for the two-week summer field courses that are part of the MiTEP program. The ESLP, published in 2009, clearly describe what an Earth-science-literate person should know; they consist of nine big ideas and their supporting fundamental concepts. Using the ESLP to plan a professional development program helped both instructors and teacher-participants focus on important concepts throughout the professional development activity. The measurement tools were developed to measure changes in teachers' Earth science content-area knowledge and in perceptions related to teaching and learning that result from participating in a professional development program. The first measurement tool, the Earth System Concept Inventory (ESCI), directly measures content-area knowledge through a succession of multiple-choice questions aligned with the content of the professional development experience.
The second measurement tool, an exit survey, collects qualitative data from teachers regarding their impression of the professional development. Both the ESCI and the exit survey were tested for validity and reliability. Lesson study is discussed here as a strategy for sustaining professional development in a school or a district after the end of a professional development activity. Lesson study, as described here, was offered as a formal course. Teachers engaged in lesson study worked collaboratively to design and test lessons that improve the teachers' classroom practices. Data regarding the impact of the lesson study activity were acquired through surveys, written documents, and group interviews. The data are interpreted to indicate that the lesson study process improved teacher quality and classroom practices. In the case described here, the lesson study process was adopted by the teachers' district and currently serves as part of the district's work in Professional Learning Communities, resulting in ongoing professional development throughout the district.
Abstract:
Wireless Mesh Networks (WMNs) have proven to be a key technology for increasing the network coverage of Internet infrastructures. The development process for new protocols and architectures in the area of WMNs is typically split into evaluation by network simulation and testing of a prototype in a test-bed. Testing a prototype in a real test-bed is time-consuming and expensive. Uncontrollable external interference can occur, which makes debugging difficult. Moreover, the test-bed usually supports only a limited number of test topologies. Finally, mobility tests are impractical. Therefore, we propose VirtualMesh, a new testing architecture which can be used before going to a real test-bed. It provides instruments to test the real communication software, including the network stack, inside a controlled environment. VirtualMesh is implemented by capturing real traffic through a virtual interface at the mesh nodes. The traffic is then redirected to the network simulator OMNeT++. In our experiments, VirtualMesh proved scalable and introduced moderate delays. Therefore, it is suitable for predeployment testing of communication software for WMNs.
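The redirection idea at the core of this approach can be sketched in a few lines: a virtual interface hands outgoing frames to a simulator hook instead of a physical radio, so the unmodified software above it can be exercised. The in-process "simulator" below is a stand-in for illustration, not the actual OMNeT++ bridge.

```python
class VirtualInterface:
    """Hands outgoing frames to a simulator hook instead of a physical radio,
    so the unmodified network stack above it can be exercised."""
    def __init__(self, node_id, simulator_send):
        self.node_id = node_id
        self._send = simulator_send   # stand-in for the bridge into the simulator
        self.inbox = []

    def transmit(self, frame):
        self._send(self.node_id, frame)   # captured, not radiated

    def deliver(self, frame):
        self.inbox.append(frame)          # injected back by the simulator

# A trivial in-process "simulator" that links two nodes directly:
links = {}
def sim_send(src, frame):
    links[src].deliver(frame)

node_a = VirtualInterface("A", sim_send)
node_b = VirtualInterface("B", sim_send)
links["A"], links["B"] = node_b, node_a
node_a.transmit(b"hello")
```

Because the stack only ever talks to the interface object, swapping the direct link for a full channel model changes nothing in the software under test — the property that makes predeployment testing in a controlled environment possible.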
Abstract:
ABSTRACT: Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.

Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside of iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a test-bed proof-of-concept application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-based (KB) tool was developed and implemented in FiPER.
This tool reasons about the modeling knowledge to intelligently shift between the beam and the shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical-report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge-sharing method, which allocates permissions to portions of knowledge in order to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD, and help reduce the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
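The model-switching idea — use the cheap beam model whenever its estimated error is tolerable, otherwise fall back to the expensive shell model — can be sketched as follows. The error estimators, costs, and design fields are hypothetical; this is not the FiPER tool's logic.

```python
def select_model(design, models, tolerance):
    """models: (name, relative_cost, error_estimator) tuples sorted by cost.
    Returns the cheapest model whose estimated error meets the tolerance."""
    for name, _cost, error_of in models:
        if error_of(design) <= tolerance:
            return name
    return models[-1][0]    # fall back to the highest-fidelity model

# Hypothetical error estimators: beam theory degrades for short, deep beams.
def beam_error(d):
    return 0.002 * d["depth"] / d["length"]

def shell_error(d):
    return 0.0001

models = [("beam", 1.0, beam_error), ("shell", 50.0, shell_error)]
slender = {"length": 2.0, "depth": 0.1}
stubby = {"length": 0.4, "depth": 0.3}
choice_slender = select_model(slender, models, 1e-3)   # cheap beam model suffices
choice_stubby = select_model(stubby, models, 1e-3)     # falls back to shell model
```

Running such a selector at every optimization iteration is what lets most evaluations use the cheap model while reserving the high-fidelity one for design states where beam theory is unreliable.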
Abstract:
There is a growing number of proxy-based reconstructions detailing the climatic changes that occurred during the last interglacial period (LIG). This period is of special interest because large parts of the globe were characterized by a warmer-than-present-day climate, making it an interesting test bed for climate models in light of projected global warming. However, mainly because synchronizing the different palaeoclimatic records is difficult, there is no consensus on a global picture of LIG temperature changes. Here we present the first model inter-comparison of transient simulations covering the LIG period. By comparing the different simulations, we aim to identify the common signal in the LIG temperature evolution, to investigate the main driving forces behind it, and to list the climate feedbacks which cause the most apparent inter-model differences. The model inter-comparison shows a robust Northern Hemisphere July temperature evolution characterized by a maximum between 130 and 125 ka BP, with temperatures 0.3 to 5.3 K above present day. A Southern Hemisphere July temperature maximum, −1.3 to 2.5 K at around 128 ka BP, is only found when changes in the greenhouse gas concentrations are included. The robustness of simulated January temperatures is high in the Southern Hemisphere and the mid-latitudes of the Northern Hemisphere. For these regions, maximum January temperature anomalies of −1 to 1.2 K and −0.8 to 2.1 K, respectively, are simulated for the period after 121 ka BP. In both hemispheres these temperature maxima are in line with the maximum in local summer insolation. In a number of specific regions, a common temperature evolution is not found amongst the models. We show that this is related to feedbacks within the climate system which largely determine the simulated LIG temperature evolution in these regions. Firstly, in the Arctic region, changes in the summer sea-ice cover control the evolution of LIG winter temperatures.
Secondly, for the Atlantic region, the Southern Ocean and the North Pacific, possible changes in the characteristics of the Atlantic meridional overturning circulation are crucial. Thirdly, the presence of remnant continental ice from the preceding glacial period has been shown to be important for the timing of maximum LIG warmth in the Northern Hemisphere. Finally, the results reveal that changes in the monsoon regime exert a strong control on the evolution of LIG temperatures over parts of Africa and India. By listing these inter-model differences, we provide a starting point for future proxy-data studies and for the sensitivity experiments needed to constrain the climate simulations and to further enhance our understanding of the temperature evolution of the LIG period.
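Extracting the timing and inter-model spread of maximum warmth from an ensemble — the kind of diagnostic quoted above — can be sketched as follows. The three-model ensemble and the anomaly values are invented for illustration, not results from the inter-comparison.

```python
ages_ka_bp = [130, 128, 126, 124, 122]
ensemble = {   # July temperature anomalies (K), invented for illustration
    "model_a": [0.9, 1.2, 0.8, 0.3, 0.1],
    "model_b": [2.1, 2.6, 2.2, 1.5, 0.9],
    "model_c": [0.2, 0.5, 0.7, 0.4, 0.2],
}

def peak_timing(anomalies):
    """Return (age of maximum warmth, peak anomaly) for one model."""
    i = max(range(len(anomalies)), key=anomalies.__getitem__)
    return ages_ka_bp[i], anomalies[i]

peaks = {name: peak_timing(series) for name, series in ensemble.items()}
anomaly_lo = min(v for _, v in peaks.values())
anomaly_hi = max(v for _, v in peaks.values())
```

Agreement in the peak *timing* across models (here, two of three peak at 128 ka BP) combined with a wide *amplitude* range is exactly the pattern summarised by statements such as "a maximum between 130–125 ka BP with temperatures 0.3 to 5.3 K above present day".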
Abstract:
This work applies higher-order auxiliary excitation techniques to two types of quadrupole mass spectrometers (QMSs): commercial systems and spaceborne instruments. The operational settings of a circular-rod-geometry commercial system and an engineering test-bed for a hyperbolic-rod-geometry spaceborne instrument were matched, and the relative performance of each sensor was characterized with and without applied excitation using isotopic measurements of Kr+. Each instrument was operated at the limit of the test electronics to determine the effect of auxiliary excitation on extending instrument capabilities. For the circular-rod sensor, applied excitation doubled the mass resolution at 1% of peak transmission by eliminating the low-mass peak tail typical of such rod geometries. The mass peak stability and ion rejection efficiency were also increased, by factors of 2 and 10 respectively, with voltage scan lines passing through the center of the stability islands formed by auxiliary excitation. For the hyperbolic-rod sensor, auxiliary excitation improved peak stability and ion rejection efficiency by factors of 6 and 2, respectively. These results have significant implications not only for the use of circular-rod quadrupoles with applied excitation as a suitable replacement for traditional hyperbolic-rod sensors, but also for extending the capabilities of existing hyperbolic-rod QMSs for the next generation of spaceborne instruments and low-mass commercial systems.
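The figure of merit used above — mass resolution evaluated at 1% of peak transmission — can be computed from a sampled peak shape as sketched below. The synthetic Gaussian peak near the Kr mass region and its width are illustrative, not instrument data.

```python
import math

def resolution_at(masses, transmission, level=0.01):
    """m/dm with dm measured as the full peak width at `level` of peak height."""
    peak = max(transmission)
    m_peak = masses[transmission.index(peak)]
    above = [m for m, t in zip(masses, transmission) if t >= level * peak]
    return m_peak / (max(above) - min(above))

# Synthetic Gaussian peak near 84 u (the Kr region), sigma = 0.05 u:
masses = [round(83.6 + 0.01 * i, 2) for i in range(81)]
transmission = [math.exp(-0.5 * ((m - 84.0) / 0.05) ** 2) for m in masses]
res = resolution_at(masses, transmission)
```

Because the 1% level sits far down the flanks, this metric is dominated by the peak tails — which is why eliminating the low-mass tail of a circular-rod peak can double the resolution without narrowing the peak's core.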