988 results for Instrumentation and orchestration.
Abstract:
Letter to the editor relating to the article Warwick, K. and Nasuto, S.J. (2006). 'Historical and Current Machine Intelligence.' IEEE Instrumentation and Measurement Magazine 9 (6): 20-26.
Abstract:
An updated analysis of observed stratospheric temperature variability and trends is presented on the basis of satellite, radiosonde, and lidar observations. Satellite data include measurements from the series of NOAA operational instruments, including the Microwave Sounding Unit covering 1979–2007 and the Stratospheric Sounding Unit (SSU) covering 1979–2005. Radiosonde results are compared for six different data sets, incorporating a variety of homogeneity adjustments to account for changes in instrumentation and observational practices. Temperature changes in the lower stratosphere show cooling of 0.5 K/decade over much of the globe for 1979–2007, with some differences in detail among the different radiosonde and satellite data sets. Substantially larger cooling trends are observed in the Antarctic lower stratosphere during spring and summer, in association with development of the Antarctic ozone hole. Trends in the lower stratosphere derived from radiosonde data are also analyzed for a longer record (back to 1958); trends for the presatellite era (1958–1978) have a large range among the different homogenized data sets, implying large trend uncertainties. Trends in the middle and upper stratosphere have been derived from updated SSU data, taking into account changes in the SSU weighting functions due to observed atmospheric CO2 increases. The results show mean cooling of 0.5–1.5 K/decade during 1979–2005, with the greatest cooling in the upper stratosphere near 40–50 km. Temperature anomalies throughout the stratosphere were relatively constant during the decade 1995–2005. Long records of lidar temperature measurements at a few locations show reasonable agreement with SSU trends, although sampling uncertainties are large in the localized lidar measurements. Updated estimates of the solar cycle influence on stratospheric temperatures show a statistically significant signal in the tropics (30N–S), with an amplitude (solar maximum minus solar minimum) of 0.5 K (lower stratosphere) to 1.0 K (upper stratosphere).
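As a minimal illustration of the trend calculation underlying figures such as 0.5 K/decade, the following Python sketch fits an ordinary least-squares line to a synthetic monthly anomaly series. The data, noise level, and record length are invented, and the published trends additionally rest on homogeneity adjustments not shown here.

```python
import numpy as np

# Minimal sketch: a linear trend in K/decade from a monthly anomaly
# series via ordinary least squares. The series is synthetic, not
# actual MSU/SSU or radiosonde data.
rng = np.random.default_rng(3)
months = np.arange(12 * 29)                  # 1979-2007, monthly samples
true_trend = -0.5 / 120                      # -0.5 K/decade, per month
anom = true_trend * months + rng.normal(0, 0.5, months.size)

slope = np.polyfit(months, anom, 1)[0]       # K per month
print(f"fitted trend: {slope * 120:.2f} K/decade")
```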
Abstract:
A synthesis method is outlined for the design of broadband anti-reflection coatings for use in spaceborne infrared optics. The Golden Section optimisation routine is used to search, over designated non-absorptive dielectric thin-film combinations, for the coating design that fulfils the spectral requirements with the fewest layers and different materials. Three examples are given of coatings designed by this method: (i) a 1µm to 12µm anti-reflection coating on Zinc Sulphide using Zinc Sulphide and Yttrium Fluoride thin-film materials; (ii) a 2µm to 14µm anti-reflection coating on Germanium using Germanium and Ytterbium Fluoride thin-film materials; (iii) a 6µm to 17µm anti-reflection coating on Germanium using Lead Telluride, Zinc Selenide and Barium Fluoride. The measured spectral performance of the manufactured 6µm to 17µm coating on Germanium is given. This is the anti-reflection coating for the germanium optics in the NASA Cassini Orbiter CIRS instrument.
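As an illustration of the optimisation routine named above, here is a minimal golden-section search in Python applied to a hypothetical single-layer merit function (residual reflectance of one film on a germanium-like substrate at normal incidence). The real synthesis method searches multilayer designs and material combinations, which is not reproduced here.

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimise a unimodal function f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi, ~0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

def merit(t_opt, n_film=2.2, n_sub=4.0):
    """Hypothetical merit: reflectance of a single film (optical
    thickness t_opt in design wavelengths) on a substrate, at normal
    incidence, via the single-layer characteristic-matrix method."""
    delta = 2 * math.pi * t_opt
    B = math.cos(delta) + 1j * math.sin(delta) * n_sub / n_film
    C = 1j * n_film * math.sin(delta) + n_sub * math.cos(delta)
    return abs((B - C) / (B + C)) ** 2       # incident medium n0 = 1

t_best = golden_section_min(merit, 0.0, 0.5)
print(f"optimal optical thickness ~ {t_best:.3f} design wavelengths")  # ~0.25
```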
Abstract:
The High Resolution Dynamics Limb Sounder is described, with particular reference to the atmospheric measurements to be made and the rationale behind the measurement strategy. The demands this strategy places on the filters to be used in the instrument, and the designs to which it leads, are described. A second set of filters at an intermediate image plane, used to reduce "ghost imaging", is discussed together with their required spectral properties. A method is described in which the spectral characteristics of the primary and secondary filters in each channel are combined with the spectral response of the detectors and other optical elements to obtain the system spectral response, weighted appropriately for the Planck function and atmospheric limb absorption. This method is used to demonstrate whether the out-of-band spectral blocking requirement for a channel is being met, and an example calculation shows how the blocking is built up for a representative channel. Finally, the techniques used to produce filters of the necessary sub-millimetre sizes are discussed, together with the testing methods and procedures used to assess environmental durability and establish space-flight quality.
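The combination step can be pictured as multiplying the component responses and weighting by the Planck function. The Python sketch below uses invented Gaussian and Butterworth-style curves and an arbitrary 250 K scene in place of the real HIRDLS channel data, and then checks an out-of-band blocking ratio in the way the abstract describes.

```python
import numpy as np

def planck(lam_m, T):
    """Planck spectral radiance B(lambda, T), W sr^-1 m^-3."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / lam_m**5) / np.expm1(h * c / (lam_m * k * T))

lam = np.linspace(5.0, 20.0, 3001)                     # wavelength, um
primary   = np.exp(-0.5 * ((lam - 12.0) / 0.3) ** 2)   # band-pass filter
secondary = 1.0 / (1.0 + ((lam - 12.0) / 2.0) ** 8)    # blocking filter
detector  = np.clip(lam / 14.0, 0.0, 1.0)              # detector response
system = primary * secondary * detector                # combined response

# Weight by the Planck function for a 250 K scene and compare the
# out-of-band signal with the in-band signal (the grid is uniform, so
# the wavelength step cancels in the ratio).
w = system * planck(lam * 1e-6, 250.0)
in_band = np.abs(lam - 12.0) <= 1.0
print(f"out-of-band / in-band signal: {w[~in_band].sum() / w[in_band].sum():.2e}")
```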
Abstract:
A deterministic prototype video deghoster is presented which is capable of calculating all the multipath channel distortion characteristics in a single pass and subsequently removing the multipath distortions, commonly termed ghosts. Within the system, a channel identification algorithm identifies all the ghost components in isolation, while a dedicated DSP filter subsystem removes the ghosts in real time. Results from the system are presented.
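The abstract does not give the algorithms, but a single-pass identification step of this kind can be sketched as a least-squares FIR fit against a known training line, followed by inverse filtering. Everything below (training signal, ghost delays and amplitudes, filter lengths) is invented for illustration and is not the paper's implementation.

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
train = rng.standard_normal(256)              # known training/reference line
channel = np.zeros(64); channel[0] = 1.0      # direct path
channel[17], channel[40] = 0.4, -0.2          # two hypothetical ghosts
rx = np.convolve(train, channel)[:256]        # received, ghosted line

# One-pass channel identification: least-squares fit rx ~ T @ h, where
# T is the convolution (Toeplitz) matrix of the known training signal.
T = np.column_stack([np.concatenate([np.zeros(k), train[:256 - k]])
                     for k in range(64)])
h_est, *_ = np.linalg.lstsq(T, rx, rcond=None)

# Deghosting: apply the inverse channel 1/H(z) as a recursive filter.
clean = lfilter([1.0], h_est, rx)
print("residual ghost energy:", np.sum((clean - train) ** 2))
```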
Abstract:
Not long after Franklin’s iconic studies, an atmospheric electric field was discovered in “fair weather” regions, well away from thunderstorms. The origin of the fair weather field was sought by Lord Kelvin, through development of electrostatic instrumentation and early data logging techniques, but was ultimately explained through the global circuit model of C.T.R. Wilson. In Wilson’s model, charge exchanged by disturbed weather electrifies the ionosphere, and returns via a small vertical current density in fair weather regions. New insights into the relevance of fair weather atmospheric electricity to terrestrial and planetary atmospheres are now emerging. For example, there is a possible role of the global circuit current density in atmospheric processes, such as cloud formation. Beyond natural atmospheric processes, a novel practical application is the use of early atmospheric electrostatic investigations to provide quantitative information on past urban air pollution.
Abstract:
A statistical technique for fault analysis in industrial printing is reported. The method deals specifically with binary data, in which the results of the production process fall into two categories, rejected or accepted. The method, known as logistic regression, is capable of predicting future fault occurrences through the analysis of current measurements from machine-part sensors. Individual analysis of each type of fault can determine which parts of the plant have a significant influence on the occurrence of such faults; it is also possible to infer which measurable process parameters have no significant influence on the generation of these faults. Information derived from the analysis can help the operator interpret the current state of the plant, and appropriate actions may then be taken to prevent potential faults from occurring. The algorithm is being implemented as part of an applied self-learning expert system.
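A minimal sketch of the idea in Python, using scikit-learn's LogisticRegression on invented sensor features (the feature names, data, and effect sizes are hypothetical, not the paper's): a near-zero fitted coefficient flags a parameter with no significant influence, and predict_proba gives the fault probability for current readings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.normal(60, 5, n),     # e.g. roller temperature (hypothetical)
    rng.normal(1.2, 0.1, n),  # e.g. web tension (hypothetical)
    rng.normal(0.5, 0.2, n),  # e.g. ink viscosity index (hypothetical)
])
# Invented ground truth: faults become likely at high temperature and
# high tension; the viscosity index has no real influence.
logit = 0.3 * (X[:, 0] - 60) + 4.0 * (X[:, 1] - 1.2)
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = rejected print

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)            # ~0 third coef -> no influence
print("P(fault), hot tight web:", model.predict_proba([[70, 1.4, 0.5]])[0, 1])
```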
Abstract:
This paper reports the results of a 2-year study of water quality in the River Enborne, a rural river in lowland England. Concentrations of nitrogen and phosphorus species and other chemical determinands were monitored both at high frequency (hourly), using automated in situ instrumentation, and by manual weekly sampling and laboratory analysis. The catchment land use is largely agricultural, with a population density of 123 persons per km². The river water is largely derived from calcareous groundwater, and nitrogen and phosphorus concentrations are high. Agricultural fertiliser is the dominant source of the annual loads of both nitrogen and phosphorus. However, the data show that sewage effluent discharges have a disproportionate effect on the river's nitrogen and phosphorus dynamics. At least 38% of the catchment population use septic tank systems, but the effects are hard to quantify as only 6% are officially registered and the characteristics of the others are unknown. Only 4% of the phosphorus input and 9% of the nitrogen input are exported from the catchment by the river, highlighting the importance of catchment process understanding in predicting nutrient concentrations. High-frequency monitoring will be key to developing this vital process understanding.
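For context, load figures of the kind quoted above come from integrating concentration and flow over time. A minimal sketch of that calculation, with invented hourly numbers in place of the Enborne record:

```python
import numpy as np

# Annual load from paired hourly concentration (mg/L) and flow (m^3/s)
# records. All values are invented for illustration.
rng = np.random.default_rng(2)
hours = 24 * 365
flow = np.abs(rng.normal(1.5, 0.5, hours))   # river flow, m^3/s
conc = np.abs(rng.normal(6.0, 1.0, hours))   # nitrate-N, mg/L (= g/m^3)

# load (kg) = sum over hours of C (g/m^3) * Q (m^3/s) * 3600 s / 1000
annual_load_kg = np.sum(conc * flow * 3600) / 1000
print(f"annual N load ~ {annual_load_kg / 1000:.1f} tonnes")
```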
Abstract:
This paper presents the use of a multiprocessor architecture to improve the performance of tomographic image reconstruction. Image reconstruction in computed tomography (CT) is an intensive task for single-processor systems. We investigate the suitability of filtered image reconstruction on DSPs organized for parallel processing, and compare it with an implementation based on the Message Passing Interface (MPI) library. The experimental results show that the speedups observed on both platforms increased with the image resolution. In addition, the execution-time to communication-time ratios (Rt/Rc) as a function of sample size showed narrower variation for the DSP platform than for the MPI platform, indicating its better performance for parallel image reconstruction.
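The paper's platform details are not reproduced here, but the parallel decomposition such work relies on, splitting the projection angles of a filtered back-projection across processors and summing the partial images, can be sketched in Python as follows (geometry and filtering are simplified, and all names are invented):

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def backproject(args):
    """Filter and back-project one chunk of projection angles."""
    sino_chunk, angles, size = args
    img = np.zeros((size, size))
    xs = np.arange(size) - size / 2
    X, Y = np.meshgrid(xs, xs)
    for proj, theta in zip(sino_chunk, angles):
        # ramp-filter the projection in the Fourier domain
        f = np.fft.rfftfreq(size)
        proj_f = np.fft.irfft(np.fft.rfft(proj) * 2 * np.abs(f), n=size)
        # accumulate along the projection direction (nearest neighbour)
        t = (X * np.cos(theta) + Y * np.sin(theta) + size / 2).astype(int)
        img += proj_f[np.clip(t, 0, size - 1)]
    return img

def fbp_parallel(sinogram, angles, size, workers=4):
    """Each worker reconstructs its angle subset; partial images sum."""
    chunks = np.array_split(np.arange(len(angles)), workers)
    jobs = [(sinogram[c], angles[c], size) for c in chunks]
    with ProcessPoolExecutor(workers) as ex:
        return sum(ex.map(backproject, jobs))

if __name__ == "__main__":
    angles = np.linspace(0, np.pi, 180, endpoint=False)
    sino = np.ones((180, 128))               # toy sinogram
    print(fbp_parallel(sino, angles, 128).shape)
```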
Abstract:
Instrumentation and automation play a vital role in managing the water industry. These systems generate vast amounts of data that must be managed effectively to enable intelligent decision making. Time-series data management software, commonly known as data historians, is used for collecting and managing real-time (time-series) information. More advanced software solutions provide a data infrastructure, or utility-wide Operations Data Management System (ODMS), that stores, manages, calculates, displays, shares, and integrates data from the multiple disparate automation and business systems used daily in water utilities. These ODMS solutions are proven and can manage data ranging from smart water meters to data shared across third-party corporations. This paper focuses on practical utility successes in the water industry, where utility managers are leveraging instantaneous access to data from proven, commercial off-the-shelf ODMS solutions to enable better real-time decision making. Successes include saving $650,000 per year in water loss control, safeguarding water quality, and saving millions of dollars in energy and asset management. Immediate opportunities exist to integrate the research being done in academia with these ODMS solutions in the field, and to leverage these successes in utilities around the world.
Abstract:
This work analyzes how the competence of maintenance workers can affect the reliability of production equipment in a maintenance outsourcing process, dealing specifically with the outsourcing of electrical and instrumentation maintenance at the Continuous Casting unit of CST - Companhia Siderúrgica de Tubarão. To this end, two different historical moments were analyzed: before and after the outsourcing of maintenance execution. Two different diagnostic methods were used for the competence analyses: the first based on the behavioural studies proposed by David C. McClelland and compiled by Spencer & Spencer (1993); the second a proposal of the author's own, named the locus of competence, based on existing practices and drawing on the newest ideas in competence theory in the present phase of work rationalization. The research was explanatory, methodological, and applied. It was carried out through field research, and was also bibliographic, documentary and, finally, participatory. The results showed that, although the competence of the electrical and instrumentation maintenance teams of the CST Continuous Casting unit decreased with outsourcing, this did not affect the reliability of the production equipment of the production unit. Finally, some points that could explain this finding are raised at the end of the work.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)