997 results for Microfilm aperture card systems.
Abstract:
Mode of access: Internet.
Abstract:
Foliage Penetration (FOPEN) radar systems were introduced in 1960 and have been constantly improved by several organizations since that time. The use of Synthetic Aperture Radar (SAR) approaches for this application has important advantages, due to the need for high resolution in two dimensions. The design of this type of system, however, includes some complications that are not present in standard SAR systems. FOPEN SAR systems need to operate with a low central frequency (VHF or UHF bands) in order to be able to penetrate the foliage. High bandwidth is also required to obtain high resolution. Due to the low central frequency, large integration angles are required during SAR image formation, and therefore the Range Migration Algorithm (RMA) is used. This thesis identifies the three main complications that arise from these requirements. First, a high fractional bandwidth makes narrowband propagation models no longer valid. Second, the VHF and UHF bands are used by many communications systems; the transmitted signal spectrum needs to be notched to avoid interfering with them. Third, those communications systems cause Radio Frequency Interference (RFI) on the received signal. The thesis carries out a thorough analysis of the three problems, their degrading effects and possible solutions to compensate for them. The UWB model is applied to the SAR signal, and the degradation it induces is derived. The result is tested through simulation of both a single-pulse stretch processor and the complete RMA image formation. Both methods show that the degradation is negligible, and therefore the UWB propagation effect does not need compensation. A technique is derived to design a notched transmitted signal. Then, its effect on the SAR image formation is evaluated analytically. It is shown that the stretch processor introduces a processing gain that reduces the degrading effects of the notches. The remaining degradation after processing gain is assessed through simulation, and an experimental graph of degradation as a function of the percentage of nulled frequencies is obtained. The RFI is characterized and its effect on the SAR processor is derived. Once again, a processing gain is found to be introduced by the receiver. As the RFI power can be much higher than that of the desired signal, an algorithm is proposed to remove the RFI from the received signal before RMA processing. This algorithm is a modification of the Chirp Least Squares Algorithm (CLSA) explained in [4], which adapts it to deramped signals. The algorithm is derived analytically and its performance is then evaluated through simulation, showing that it is effective in removing the RFI and reducing the degradation caused by both RFI and notching. Finally, conclusions are drawn as to the importance of each of the problems in SAR system design.
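As a rough illustration of the spectral-notching idea described in this abstract (not the thesis' actual implementation), the following Python sketch nulls a fraction of the frequency bins of a linear FM chirp and reports the resulting matched-filter peak loss; the bandwidth, pulse length and notch placement are invented for the example.

```python
# Minimal sketch (assumed values, not thesis code): notch a fraction of the
# frequency bins of a linear FM chirp and measure the matched-filter peak loss.
import numpy as np

fs = 200e6            # sample rate [Hz]
B, T = 100e6, 10e-6   # chirp bandwidth [Hz] and pulse length [s]
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * (t - T / 2) ** 2)   # baseband LFM pulse

def notch(signal, nulled_fraction):
    """Zero a contiguous block of frequency bins (a crude transmit notch)."""
    spec = np.fft.fft(signal)
    n = int(nulled_fraction * spec.size)
    start = spec.size // 4                 # arbitrary notch position
    spec[start:start + n] = 0.0
    return np.fft.ifft(spec)

def peak_loss_db(tx, ref):
    """Peak of the matched-filter output relative to the un-notched reference."""
    mf = np.abs(np.correlate(tx, ref, mode="full")).max()
    mf_ref = np.abs(np.correlate(ref, ref, mode="full")).max()
    return 20 * np.log10(mf / mf_ref)

for frac in (0.05, 0.10, 0.20):
    print(f"{frac:.0%} of bins nulled -> peak loss {peak_loss_db(notch(chirp, frac), chirp):.2f} dB")
```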
Abstract:
This is an online course pack consisting of Chaffey, Business Information Systems (ISBN 027365540X), and access to a Pearson Education online course (ISBN 0273673491).
Abstract:
This paper proposes a wireless EEG acquisition platform based on an Open Multimedia Architecture Platform (OMAP) embedded system. A high-impedance active dry electrode was tested for improving the scalp-electrode interface. The sigma-delta ADS1298 analog-to-digital converter was used, and a “kernelspace” character driver was developed to manage the communications between the converter unit and the OMAP’s ARM core. The acquired EEG signal data are processed by a “userspace” application, which accesses the driver’s memory, saves the data to an SD card and transmits them through a wireless TCP/IP socket to a PC. The electrodes were tested through the alpha-wave replacement phenomenon. The experimental results showed the expected alpha rhythm (8-13 Hz) reactiveness to the eye-opening task. The driver spends about 725 μs to acquire and store the data samples. The application takes about 244 μs to get the data from the driver and 1.4 ms to save it to the SD card. A Wi-Fi throughput of 12.8 Mbps was measured, which results in a transmission time of 5 ms for 512 kb of data. The embedded system consumes about 200 mAh when the wireless interface is off and about 400 mAh when it is on. The system exhibits a reliable performance in recording EEG signals and transmitting them wirelessly. Beyond microcontroller-based architectures, the proposed platform demonstrates that powerful ARM processors running embedded operating systems can be programmed with real-time constraints at the kernel level in order to control hardware, while maintaining their parallel-processing abilities in high-level software applications.
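For illustration only, the sketch below shows what the “userspace” side of such a platform could look like in Python: it reads sample blocks from a hypothetical character device exposed by the kernel driver, logs them to the SD card and forwards them over a TCP socket. The device path, block size, addresses and the use of Python (rather than embedded C) are assumptions, not details from the paper.

```python
# Minimal sketch of the userspace side only; device path, addresses and block
# size are hypothetical, and the real application described above is in C.
import socket

DEVICE = "/dev/eeg0"                # hypothetical char device created by the driver
PC_ADDR = ("192.168.1.10", 5000)    # PC-side TCP server (illustrative)
BLOCK = 4096                        # bytes read per iteration (illustrative)

def stream_eeg(log_path="/media/sdcard/eeg.bin"):
    """Read EEG sample blocks from the driver, log to the SD card, send to a PC."""
    with open(DEVICE, "rb", buffering=0) as dev, \
         open(log_path, "ab") as log, \
         socket.create_connection(PC_ADDR) as sock:
        while True:
            block = dev.read(BLOCK)       # blocks until the driver has data
            if not block:
                break
            log.write(block)              # local SD-card backup
            sock.sendall(block)           # wireless TCP/IP transfer

if __name__ == "__main__":
    stream_eeg()
```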
Abstract:
This letter presents a comparison between three Fourier-based motion compensation (MoCo) algorithms for airborne synthetic aperture radar (SAR) systems. These algorithms circumvent the limitations of conventional MoCo, namely the assumption of a reference height and the beam-center approximation. All these approaches rely on the inherent time-frequency relation in SAR systems but exploit it differently, with the consequent differences in accuracy and computational burden. After a brief overview of the three approaches, the performance of each algorithm is analyzed with respect to azimuthal topography accommodation, angle accommodation, and maximum frequency of track deviations with which the algorithm can cope. Also, an analysis of the computational complexity is presented. Quantitative results are shown using real data acquired by the Experimental SAR system of the German Aerospace Center (DLR).
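To make the baseline concrete, the following sketch shows the conventional first-order MoCo correction that these Fourier-based algorithms refine: measured track deviations are projected onto the beam-centre line of sight for a single reference height and turned into a phase correction. The geometry, wavelength and numbers are illustrative and are not taken from the letter.

```python
# Sketch of conventional first-order MoCo (reference height, beam centre only);
# wavelength, geometry and deviations are illustrative values.
import numpy as np

wavelength = 0.23     # radar wavelength [m]
h_ref = 3000.0        # assumed reference height of the platform above terrain [m]
r0 = 5000.0           # beam-centre slant range [m]

# Track deviations per azimuth position (e.g. from GPS/INS), in metres.
dy = np.array([0.00, 0.05, 0.12, 0.08, -0.03])   # horizontal, across-track
dz = np.array([0.00, 0.02, -0.04, 0.01, 0.03])   # vertical

# Project the deviations onto the assumed beam-centre line of sight.
look_angle = np.arccos(h_ref / r0)               # measured from nadir
delta_r = dy * np.sin(look_angle) - dz * np.cos(look_angle)

# First-order correction: one phase value per range line (two-way path).
phase_correction = np.exp(-1j * 4 * np.pi / wavelength * delta_r)
print(np.angle(phase_correction))
```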
Abstract:
During the last decade, interest in space-borne Synthetic Aperture Radar (SAR) for remote sensing applications has grown, as testified by the number of recent and forthcoming missions such as TerraSAR-X, RADARSAT-2, COSMO-SkyMed, TanDEM-X and the Spanish SEOSAR/PAZ. In this sense, this thesis proposes to study and analyze the performance of state-of-the-art space-borne SAR systems with modes able to provide Moving Target Indication (MTI) capabilities, i.e. moving-object detection and estimation. The research will focus on MTI processing techniques as well as on the architecture and/or configuration of the SAR instrument, establishing the limitations of the current systems with MTI capabilities and proposing efficient solutions for future missions. Two European projects, to which the Universitat Politècnica de Catalunya provides support, are an excellent framework for the research activities suggested in this thesis. The NEWA project proposes a potential European space-borne radar system with MTI capabilities in order to fulfill the upcoming European security policies. This thesis will critically review the state-of-the-art MTI processing techniques as well as the readiness and maturity level of the developed capabilities. For each of the techniques, a performance analysis will be carried out based on the available technologies, deriving a roadmap and identifying the different technological gaps. In line with this study, a simulator tool will be developed in order to validate and evaluate different MTI techniques on the basis of a flexible space-borne radar configuration. The calibration of a SAR system is mandatory for the accurate formation of SAR images and turns out to be critical in advanced operation modes such as MTI. In this sense, the SEOSAR/PAZ project proposes the study and estimation of the radiometric budget. This thesis will also focus on an exhaustive analysis of the radiometric budget considering the current calibration concepts and their possible limitations. In the framework of this project, a key point will be the study of the Dual Receive Antenna (DRA) mode, which provides MTI capabilities to the mission. An additional aspect under study is the applicability of digital beamforming to multichannel and/or multistatic radar platforms, which constitute potential solutions for the NEWA project, with the aim of fully exploiting their capabilities jointly with MTI techniques.
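As a minimal, illustrative sketch of how a dual-receive-antenna configuration enables MTI (not code from the NEWA or SEOSAR/PAZ projects), the following Python fragment applies a DPCA-style channel alignment and subtraction so that stationary clutter cancels while moving targets leave a residual; all parameter values are invented.

```python
# Illustrative DPCA-style clutter cancellation for a dual-receive-antenna SAR;
# all parameters are invented and the channels are assumed already co-registered.
import numpy as np

prf = 3000.0            # pulse repetition frequency [Hz]
v_platform = 7500.0     # along-track platform velocity [m/s]
d_phase_centres = 5.0   # along-track separation of the two receive phase centres [m]

# The trailing channel samples the same effective phase-centre position
# d/(2*v) seconds later; here that delay is assumed to be an integer number
# of pulses, so a simple shift aligns the channels.
lag_pulses = int(round(d_phase_centres / (2 * v_platform) * prf))

def dpca_residual(front_channel, rear_channel):
    """Subtract time-aligned channels: clutter cancels, moving targets remain."""
    aligned_rear = np.roll(rear_channel, -lag_pulses, axis=-1)
    return front_channel - aligned_rear
```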
Abstract:
In this bachelor's thesis a relay card for capacitance measurements was designed, built and tested. The study was made for the research and development laboratory of VTI Technologies, which manufactures capacitive silicon microelectromechanical accelerometers and pressure sensors. As the size of the sensors decreases, their capacitance also decreases. The decreased capacitance creates a need for new and more accurate measurement systems. The technology used in the instrument measuring the capacitance dictates a framework for how the relay card should be designed; thus the operating principle of the instrument must be known. To achieve accurate results, the measurement instrument and its functions needed to be used correctly. The relay card was designed using printed circuit board design methods that minimize interference coupling to the measurement. The relay card designed in this study is modular. It consists of a separate CPU card, which was used to control the add-on cards connected to it. The CPU card was controlled from a computer through a serial bus. Two add-on cards for the CPU card were designed in this study. The first one was the measurement card, which could be used to measure 32 capacitive sensors. The second add-on card was the MUX card, which could be used to switch between two measurement cards. The capacitance measurements carried out through the MUX card and the measurement cards were characterized with a series of test measurements, and the test measurement data were then analysed. The relay card design was confirmed to work and to offer accurate measurement results up to a measurement frequency of 10 MHz; the length of the measurement cables limited the measurement frequency.
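Purely as a hypothetical illustration of the PC-to-CPU-card control path mentioned above, the sketch below sends an invented channel-selection command over a serial port with pyserial; the command format, port name and baud rate are not taken from the thesis.

```python
# Hypothetical control sketch: select a sensor channel on the CPU card over a
# serial port before a capacitance reading. Command string, port and baud rate
# are invented for illustration.
import serial  # pyserial

def select_channel(port="/dev/ttyUSB0", card=1, channel=7):
    """Ask the CPU card to route the given measurement card and channel."""
    with serial.Serial(port, baudrate=9600, timeout=1) as bus:
        bus.write(f"SEL {card} {channel}\r\n".encode())
        return bus.readline().decode().strip()   # e.g. an "OK" acknowledgement

if __name__ == "__main__":
    print(select_channel())
```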
Abstract:
Antennas play an important role in determining the characteristics of any electronic system that depends on free space as the propagation medium. Basically, an antenna can be considered as the connecting link between free space and the transmitter or receiver. For radar and navigational purposes, the directional properties of an antenna are its most basic requirement, as they determine the distribution of radiated energy. Hence the study of the directional properties of antennas has special significance and several useful applications.
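As a small, generic example of such a directional property (not drawn from the text above), the following Python snippet evaluates the normalized array factor of a uniform linear array, showing how the radiated energy concentrates around broadside as the number of elements grows.

```python
# Generic example: normalized array factor of an N-element uniform linear array.
import numpy as np

def array_factor_db(n_elements, spacing_wavelengths, theta_deg):
    """Normalized array factor in dB versus angle from broadside."""
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    psi = 2 * np.pi * spacing_wavelengths * np.sin(theta)   # inter-element phase
    n = np.arange(n_elements)
    af = np.abs(np.exp(1j * np.outer(psi, n)).sum(axis=1)) / n_elements
    return 20 * np.log10(np.maximum(af, 1e-6))

angles = np.linspace(-90, 90, 7)
print(dict(zip(angles, np.round(array_factor_db(8, 0.5, angles), 1))))
```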
Abstract:
Requirements analysis focuses on stakeholders' concerns and their influence on e-government systems. Some characteristics of stakeholders' concerns clearly show their complexity and conflicts. This raises a number of questions in the requirements analysis: how are the concerns relevant to stakeholders? What are their needs? How can conflicts among the different stakeholders be resolved? And what coherent requirements can be methodologically produced? This paper describes the problem articulation method in organizational semiotics, which can be used to conduct such complex requirements analysis. The outcomes of the analysis enable e-government systems development and management to meet users' needs. A case study of the Yantai Citizen Card is chosen to illustrate a process of analysing stakeholders in the lifecycle of requirements analysis.
Abstract:
NGC 6908, an S0 galaxy situated in the direction of NGC 6907, was only recently recognized as a distinct galaxy rather than merely a part of NGC 6907. We present 21-cm radio synthesis observations obtained with the Giant Metrewave Radio Telescope (GMRT) and optical images and spectroscopy obtained with the Gemini-North telescope of this pair of interacting galaxies. From the radio observations we obtained the velocity field and the H I column density map of the whole region containing the NGC 6907/8 pair, and by means of Gemini multi-object spectroscopy we obtained high-quality photometric images and 5 Å resolution spectra sampling the two galaxies. By comparing the rotation curve of NGC 6907 obtained from the two opposite sides around the main kinematic axis, we were able to distinguish the normal rotational velocity field from the velocity components produced by the interaction between the two galaxies. Taking into account the rotational velocity of NGC 6907 and the velocity derived from the absorption lines of NGC 6908, we verified that the relative velocity between these systems is lower than 60 km s^-1. The emission lines observed in the direction of NGC 6908, not typical of S0 galaxies, have the same velocity expected for the NGC 6907 rotation curve. Some emission lines are superimposed on a broader absorption profile, which suggests that they were not formed in NGC 6908. Finally, the H I profile exhibits details of the interaction, showing three components: one for NGC 6908, another for the excited gas in the NGC 6907 disc, and a last one for the gas with higher relative velocities left behind NGC 6908 by dynamical friction, which was used to estimate that the interaction started (3.4 ± 0.6) × 10^7 yr ago.
Abstract:
The demand for cooling and air-conditioning of buildings is ever growing. This increase is mostly due to population and economic growth in developing countries, and also to the desire for a higher quality of thermal comfort. Increased use of conventional cooling systems results in a larger carbon footprint and more greenhouse gas emissions owing to their higher electricity consumption, and it occasionally creates peaks in electricity demand on the power supply grid. Solar energy as a renewable energy source is an alternative to drive the cooling machines, since the cooling load is generally high when solar radiation is high. This thesis examines the performance of a PV/T solar collector manufactured by the Solarus company in a solar cooling system for an office building in Dubai, New Delhi, Los Angeles and Cape Town. The study is carried out by analyzing climate data and the requirements for thermal comfort in office buildings. Cooling systems strongly depend on weather conditions and local climate. The cooling load of buildings depends on many parameters such as ambient temperature, indoor comfort temperature, solar gain to the building and internal gains, including the number of occupants and electrical devices. The simulations were carried out by selecting a suitable thermally driven chiller and modeling it with the PV/T solar collector in the Polysun software. Fractional primary energy saving and solar fraction were introduced as key figures of the project to evaluate the performance of the cooling system. Several parametric studies and simulations were carried out, varying the PV/T aperture area and the hot-water storage tank volume. The fractional primary energy saving analysis revealed that thermally driven chillers, particularly adsorption chillers, are not suitable for use in small solar cooling systems in hot and tropical climates such as those of Dubai and New Delhi. Adsorption chillers require more thermal energy to meet the cooling load in hot and dry climates. Adsorption chillers operate at their full capacity and at a higher coefficient of performance when they run in a moderate climate, since they can properly reject the exhaust heat. The simulation results also indicated that PV/T solar collectors have higher efficiency in warmer climates; however, a larger PV/T collector area is required to supply the thermally driven chillers for cooling in hot climates. Therefore, using an electrical chiller as backup gives much better results in terms of primary energy savings, since the PV/T electrical production can also be used for the backup electrical chiller in a net-metering mechanism.
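For reference, the two key figures named above can be sketched as simple ratios; the definitions below follow the form commonly used in solar cooling studies and the numbers are illustrative, so they may differ from the exact Polysun/thesis setup.

```python
# Illustrative definitions of the two key figures; numbers are made up.
def fractional_primary_energy_saving(pe_solar_kwh, pe_reference_kwh):
    """f_sav = 1 - PE(solar-assisted system) / PE(conventional reference system)."""
    return 1.0 - pe_solar_kwh / pe_reference_kwh

def solar_fraction(q_solar_kwh, q_total_driving_kwh):
    """Share of the chiller's driving energy covered by the solar collectors."""
    return q_solar_kwh / q_total_driving_kwh

print(fractional_primary_energy_saving(pe_solar_kwh=1200, pe_reference_kwh=3000))  # 0.6
print(solar_fraction(q_solar_kwh=8000, q_total_driving_kwh=10000))                 # 0.8
```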
Abstract:
Although formal methods can dramatically increase the quality of software systems, they have not been widely adopted in the software industry. Many software companies have the perception that formal methods are not cost-effective because they involve plenty of mathematical symbols that are difficult for non-experts to assimilate. The Java Modelling Language (JML for short), introduced in Section 3.3, is an academic initiative towards the development of a common formal specification language for Java programs and the implementation of tools to check program correctness. This master's thesis shows how JML-based formal methods can be used to formally develop a privacy-sensitive Java application. This is a smart card application for managing medical appointments, named HealthCard. We follow the software development strategy introduced by João Pestana, presented in Section 3.4. Our work influenced the development of this strategy by providing hands-on insight into challenges related to the development of a privacy-sensitive application in Java. Pestana's strategy is based on a three-step evolution of software specifications, from informal ones, through semiformal ones, to JML formal specifications. We further prove that this strategy can be automated by implementing a tool that generates JML formal specifications from a well-defined subset of informal software specifications. Hence, our work proves that JML-based formal methods techniques are cost-effective, and that they can be made popular in the software industry. Although formal methods are not popular in many software development companies, we endeavour to integrate formal methods into general software practices. We hope our work can contribute to a better acceptance of mathematically based formalisms and tools used by software engineers. The structure of this document is as follows. In Section 2, we describe the preliminaries of this thesis work. We introduce the application for managing medical appointments that we have implemented, and we also describe the technologies used in its development. This section further illustrates the Java Card Remote Method Invocation communication model used in the medical application for the client and server applications. Section 3 introduces software correctness, including design by contract and the concept of a contract in JML. Section 4 presents the design structure of the application. Section 5 shows the implementation of the HealthCard. Section 6 describes how the HealthCard is verified and validated using JML formal methods tools. Section 7 includes some metrics of the HealthCard implementation and specification. Section 8 presents a short example of how the client side of a smart card application can be implemented while respecting formal specifications. Section 9 describes a prototype tool to generate JML formal specifications from informal specifications automatically. Section 10 describes some challenges and main ideas that came across during the development of the HealthCard. The full formal specification and implementation of the HealthCard smart card application presented in this document can be reached at https://sourceforge.net/projects/healthcard/.
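JML expresses contracts as requires/ensures annotations on Java code; purely as a conceptual analogue of the design-by-contract idea discussed above (not JML syntax and not the HealthCard implementation), the following Python sketch checks a pre- and postcondition for a hypothetical appointment-booking operation at run time.

```python
# Conceptual design-by-contract analogue in Python (not JML, not HealthCard code).
import datetime

def book_appointment(free_slots, requested):
    """Books `requested`; the runtime checks mimic requires/ensures clauses."""
    # "requires": the slot must be free and lie in the future
    assert requested in free_slots, "precondition violated: slot is not free"
    assert requested > datetime.datetime.now(), "precondition violated: slot is in the past"

    remaining = [slot for slot in free_slots if slot != requested]

    # "ensures": the booked slot is no longer offered as free
    assert requested not in remaining, "postcondition violated: slot still free"
    return remaining
```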
Abstract:
Electronic transactions are becoming increasingly commonplace in the countries of Latin America and the Caribbean, despite the collapse of many dotcom firms and the failure of e-commerce to make inroads in the region. In the transport sphere, the gradual incorporation of technology in support of processes and the exchange of money flows between players has brought greater versatility, security and flexibility. In public transport, such initiatives take the form of automatic ticket machines and prepaid card dispensing machines. In urban transit, electronic purses are used for the supervision and payment of parking time, and in road pricing, electronic toll systems streamline the process of collecting money; this is especially the case with motorways and urban concessions. And in shipping, electronic transfers are increasingly being used for the payment of customs dues and port charges. In view of the importance of the topic and the interest expressed in it, the Transport Unit has begun a study of these issues and recently published a paper entitled Sistemas de cobro electrónico de pasajes en el transporte público ("Electronic systems for payment of tickets in public transport"), LC/L.1752-P/E, July 2002, on which this issue of the Bulletin is based.
Abstract:
PURPOSE: A beamlet-based direct aperture optimization (DAO) for modulated electron radiotherapy (MERT) using photon multileaf collimator (pMLC) shaped electron fields is developed and investigated.
METHODS: The Swiss Monte Carlo Plan (SMCP) allows the calculation of dose distributions for pMLC-shaped electron beams. SMCP is interfaced with the Eclipse TPS (Varian Medical Systems, Palo Alto, CA), which can thus be included in the inverse treatment planning process for MERT. This process starts with the import of a CT scan into Eclipse, the contouring of the target and the organs at risk (OARs), and the choice of the initial electron beam directions. For each electron beam, the number of apertures, their energy, and their initial shape are defined. Furthermore, the DAO requires dose-volume constraints for the contoured structures. In order to carry out the DAO efficiently, the initial electron beams are divided into a grid of beamlets. For each of those, the dose distribution is precalculated using a modified electron beam model, resulting in a dose list for each beamlet and energy. Then the DAO is carried out, leading to a set of optimal apertures and corresponding weights. These optimal apertures are then converted into pMLC-shaped segments and the dose calculation for each segment is performed. For these dose distributions, a weight optimization process is launched in order to minimize the differences between the dose distribution using the optimal apertures and the one using the pMLC segments. Finally, a deliverable dose distribution for the MERT plan is obtained and loaded back into Eclipse for evaluation. For an idealized water phantom geometry, a MERT treatment plan is created and compared to the plan obtained using a previously developed forward planning strategy. Furthermore, MERT treatment plans for three clinical situations (breast, chest wall, and parotid metastasis of a squamous cell skin carcinoma) are created using the developed inverse planning strategy. The MERT plans are compared to clinical standard treatment plans using photon beams, and the differences between the optimal and the deliverable dose distributions are determined.
RESULTS: For the idealized water phantom geometry, the inversely optimized MERT plan achieves the same PTV coverage, but with improved OAR sparing compared to the forwardly optimized plan. Regarding the right-sided breast case, the MERT plan reduces the lung volume receiving more than 30% of the prescribed dose, as well as the mean lung dose, compared to the standard plan. However, the standard plan leads to a better homogeneity within the CTV. The results for the left-sided chest wall are similar, but the dose to the heart is also reduced with MERT compared to the standard treatment plan. For the parotid case, MERT leads to lower doses for almost all OARs but to a less homogeneous dose distribution for the PTV when compared to a standard plan. For all cases, the weight optimization successfully minimized the differences between the optimal and the deliverable dose distributions.
CONCLUSIONS: A beamlet-based DAO using multiple beam angles is implemented and successfully tested for an idealized water phantom geometry and for clinical situations.
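As an illustration of the final weight re-optimization step described in METHODS (a sketch of the general idea, not the SMCP/Eclipse implementation), the following Python fragment finds non-negative segment weights whose combined dose best matches the dose of the optimal apertures, using non-negative least squares on synthetic data.

```python
# Sketch of the segment-weight re-optimization on synthetic data (illustrative
# shapes; not the SMCP/Eclipse implementation).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_voxels, n_segments = 500, 12

# Dose per unit weight of each deliverable pMLC segment (voxels x segments),
# and the target dose produced by the ideal optimized apertures.
segment_dose = rng.random((n_voxels, n_segments))
target_dose = segment_dose @ rng.random(n_segments)     # synthetic, reachable target

weights, residual = nnls(segment_dose, target_dose)
print("optimized segment weights:", np.round(weights, 3))
print("residual dose mismatch   :", residual)
```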