52 results for concave refractive microlens array
in CentAUR: Central Archive University of Reading - UK
Abstract:
Infrared filters and coatings have been employed on many sensing radiometer instruments to measure the thermal emission profiles and concentrations of certain chemical constituents found in planetary atmospheres. The High Resolution Dynamics Limb Sounder (HIRDLS) is an example of the most recent developments in limb-viewing radiometry, employing a cooled focal plane detector array to provide simultaneous multi-channel monitoring of emission from gases and aerosols over an altitude range of 8 to 70 km. The use of spectrally selective cooled detectors in focal plane arrays has simplified the optical layout of radiometers, greatly reducing the number of components in the optical train. This has inevitably led to increased demands on the environmental durability of the focal plane filters because of the need to cut sub-millimetre sizes whilst maintaining optimal spectral performance. Additionally, the remaining refractive optical elements require antireflection coatings which must cover the entire spectral range of the focal plane array channels, in this case 6 to 18 µm, with a minimum of reflection and absorption. This paper describes the optical layout and spectral design requirements for filtering in the HIRDLS instrument, and reports progress on the manufacture and testing of the sub-millimetre sized cooled filters. We also report on the spectral and environmental performance of prototype wideband antireflection coatings which satisfy the requirements above.
Abstract:
The acute hippocampal brain slice preparation is an important in vitro screening tool for potential anticonvulsants. Application of 4-aminopyridine (4-AP) or removal of external Mg2+ ions induces epileptiform bursting in slices which is analogous to the electrical brain activity seen in status epilepticus states. We have developed these epileptiform models for use with multi-electrode arrays (MEAs), allowing recording across the hippocampal slice surface from 59 points. We present validation of this novel approach and analyses using two anticonvulsants, felbamate and phenobarbital, the effects of which have already been assessed in these models using conventional extracellular recordings. In addition to assessing drug effects on commonly described parameters (duration, amplitude and frequency), we describe novel methods using the MEA to assess burst propagation speeds and the underlying frequencies that contribute to the epileptiform activity seen. Contour plots are also used as a method of illustrating burst activity. Finally, we describe hitherto unreported properties of epileptiform bursting induced by 100 µM 4-AP or removal of external Mg2+ ions. Specifically, we observed decreases over time in burst amplitude and increases over time in burst frequency in the absence of additional pharmacological interventions. These MEA methods enhance the depth, quality and range of data that can be derived from the hippocampal slice preparation compared to conventional extracellular recordings. They may also uncover additional modes of action that contribute to anti-epileptiform drug effects.
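The burst propagation speed measurement the abstract mentions can be sketched as a latency fit across electrodes. The electrode positions, latencies, and the least-squares approach below are illustrative assumptions, not data or methods taken from the paper:

```python
import math

# Hypothetical sketch: estimate burst propagation speed from the onset
# latency of a burst at each MEA electrode. All values are invented for
# illustration; real MEAs (e.g. 59-channel grids) would supply these.

def propagation_speed(positions_um, latencies_ms):
    """Least-squares slope of distance-from-earliest-electrode vs. latency."""
    t0 = min(latencies_ms)
    ox, oy = positions_um[latencies_ms.index(t0)]       # earliest electrode
    d = [math.hypot(x - ox, y - oy) for x, y in positions_um]
    t = [lat - t0 for lat in latencies_ms]
    n = len(t)
    mean_t, mean_d = sum(t) / n, sum(d) / n
    cov = sum((ti - mean_t) * (di - mean_d) for ti, di in zip(t, d))
    var = sum((ti - mean_t) ** 2 for ti in t)
    return cov / var                                     # micrometres per ms

# Four electrodes on a 200 um pitch; a burst sweeping across at 100 um/ms
pos = [(0.0, 0.0), (200.0, 0.0), (400.0, 0.0), (600.0, 0.0)]
lat = [0.0, 2.0, 4.0, 6.0]
print(propagation_speed(pos, lat))  # 100.0 um/ms
```

Fitting all electrodes at once, rather than using a single electrode pair, averages out jitter in individual onset detections.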
Abstract:
The authors present a systolic design for a simple GA mechanism which provides high throughput and unidirectional pipelining by exploiting the inherent parallelism in the genetic operators. The design computes in O(N+G) time steps using O(N²) cells, where N is the population size and G is the chromosome length. The area of the device is independent of the chromosome length and so can be easily scaled by replicating the arrays or by employing fine-grain migration. The array is generic in the sense that it does not rely on the fitness function and can be used as an accelerator for any GA application using uniform crossover between pairs of chromosomes. The design can also be used in hybrid systems as an add-on to complement existing designs and methods for fitness function acceleration and island-style population management.
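Uniform crossover, the operator this array accelerates, can be sketched in software for reference. This is a minimal illustration of the operator itself, not the systolic hardware, and the swap probability is an assumed parameter:

```python
import random

# Minimal software sketch of uniform crossover between a chromosome pair:
# each gene position independently swaps between the two parents. The
# hardware pipelines this per-gene decision across the cell array.

def uniform_crossover(parent_a, parent_b, rng, p_swap=0.5):
    """Produce two children; each gene comes from one parent or the other."""
    child_a, child_b = [], []
    for ga, gb in zip(parent_a, parent_b):
        if rng.random() < p_swap:    # swap this gene between the children
            ga, gb = gb, ga
        child_a.append(ga)
        child_b.append(gb)
    return child_a, child_b

rng = random.Random(42)
a, b = [0] * 8, [1] * 8
ca, cb = uniform_crossover(a, b, rng)
# Across the two children, every gene position still holds one 0 and one 1
print(ca, cb)
```

Because the decision at each gene is independent of every other gene, the operator maps naturally onto a unidirectional pipeline of identical cells.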
Abstract:
Uncertainty in the value of the imaginary part of the refractive index of mineral dust contributes much of the uncertainty in the radiative effect of mineral dust in the atmosphere. A synthesis of optical, chemical and physical in-situ aircraft measurements from the DODO experiments during February and August 2006 is used to calculate the refractive index of mineral dust encountered over West Africa. Radiative transfer modeling and measurements of broadband shortwave irradiance at a range of altitudes are used to test and validate these calculations for a specific dust event on 23 August 2006 over Mauritania. Two techniques are used to determine the refractive index: firstly a method combining measurements of scattering, absorption, size distributions and Mie code simulations, and secondly a method using composition measured on filter samples to apportion the content of internally mixed quartz, calcite and iron oxide-clay aggregates, where the iron oxide is represented by either hematite or goethite and the clay by either illite or kaolinite. The imaginary part of the refractive index at 550 nm (ni550) is found to range between 0.0001i and 0.0046i, and where filter samples are available, agreement between the two methods is found, depending on the mineral combination assumed. The refractive indices are also found to agree well with AERONET data where comparisons are possible. ni550 is found to vary with dust source, which is investigated with the NAME model for each case. The relationships of both size distribution and ni550 to the accumulation mode single scattering albedo at 550 nm (ω0550) are examined: size distribution is found to have no correlation with ω0550, while ni550 shows a strong linear relationship with ω0550. Radiative transfer modeling was performed with different models (Mie-derived refractive indices, but also filter sampling composition assuming both internal and external mixing).
Our calculations indicate that Mie-derived values of ni550 and the externally mixed dust where the iron oxide-clay aggregate corresponds to the goethite-kaolinite combination result in the best agreement with irradiance measurements. The radiative effect of the dust is found to be very sensitive to the mineral combination (and hence refractive index) assumed, and to whether the dust is assumed to be internally or externally mixed.
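The composition-based apportionment can be illustrated, in heavily simplified form, as a volume-weighted mixing of component imaginary refractive indices. The volume fractions and component ni values below are invented for illustration, not DODO filter-sample results, and a rigorous treatment would use Maxwell Garnett or Bruggeman mixing rules or full Mie calculations rather than a plain volume average:

```python
# Illustrative sketch only: volume-weighted mixing of imaginary
# refractive indices at 550 nm. All numbers below are assumptions.

def mixed_ni(volume_fractions, component_ni):
    """Volume-weighted average of component imaginary refractive indices."""
    assert abs(sum(volume_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(f * n for f, n in zip(volume_fractions, component_ni))

# quartz, calcite, goethite, kaolinite (illustrative ni values at 550 nm)
fractions = [0.60, 0.10, 0.02, 0.28]
ni_550 = [1e-8, 1e-8, 0.09, 1e-4]
print(f"{mixed_ni(fractions, ni_550):.2e}")  # ~1.8e-03
```

Even this crude mixing shows why the result is so sensitive to the iron oxide assumed: a small goethite or hematite fraction dominates the mixed absorption because its ni is orders of magnitude larger than that of quartz or calcite.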
Abstract:
The paper presents a design for a hardware genetic algorithm which uses a pipeline of systolic arrays. These arrays have been designed using systolic synthesis techniques which involve expressing the algorithm as a set of uniform recurrence relations. The final design divorces the fitness function evaluation from the hardware and can process chromosomes of different lengths, giving the design a generic quality. The paper demonstrates the design methodology by progressively re-writing a simple genetic algorithm, expressed in C code, into a form from which systolic structures can be deduced. This paper extends previous work by introducing a simplification to a previous systolic design for the genetic algorithm. The simplification results in the removal of 2N² + 4N cells and reduces the time complexity by 3N + 1 cycles.
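The stated savings can be made concrete for a sample population size. N = 32 here is an arbitrary choice for illustration, not a configuration from the paper:

```python
# Worked example of the claimed simplification: removing 2N^2 + 4N cells
# and 3N + 1 cycles, evaluated for an illustrative population size N.

def savings(n):
    """Return (cells removed, cycles saved) for population size n."""
    return 2 * n * n + 4 * n, 3 * n + 1

cells, cycles = savings(32)
print(cells, cycles)  # 2176 cells removed, 97 cycles saved for N = 32
```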
Abstract:
We advocate the use of systolic design techniques to create custom hardware for Custom Computing Machines. We have developed a hardware genetic algorithm based on systolic arrays to illustrate the feasibility of the approach. The architecture is independent of the lengths of chromosomes used and can be scaled in size to accommodate different population sizes. An FPGA prototype design can process 16 million genes per second.
Abstract:
We present a highly parallel design for a simple genetic algorithm using a pipeline of systolic arrays. The systolic design provides high throughput and unidirectional pipelining by exploiting the implicit parallelism in the genetic operators. The design is significant because, unlike other hardware genetic algorithms, it is independent of both the fitness function and the particular chromosome length used in a problem. We have designed and simulated a version of the mutation array using Xilinx FPGA tools to investigate the feasibility of hardware implementation. A simple 5-chromosome mutation array occupies 195 CLBs and is capable of performing more than one million mutations per second.
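The per-gene mutation operation the FPGA array implements can be sketched in software. The population size of 5 mirrors the 5-chromosome array above; the chromosome length and mutation rate are assumptions for illustration:

```python
import random

# Software sketch of the mutation operator: each bit of each chromosome
# flips independently with a small probability. The hardware array
# performs this same per-gene decision in parallel across its cells.

def mutate(population, rng, p_mut=0.01):
    """Return a new population with each bit flipped with probability p_mut."""
    return [[g ^ (rng.random() < p_mut) for g in chrom] for chrom in population]

rng = random.Random(0)
pop = [[0] * 16 for _ in range(5)]    # 5 chromosomes of 16 bits, all zero
mutated = mutate(pop, rng, p_mut=0.1)
print(sum(sum(c) for c in mutated))   # total number of bits flipped
```

As with crossover, the bit-level independence of the operator is what makes a cell-per-gene systolic layout possible.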
Abstract:
Fourier transform infrared (FTIR) spectroscopic imaging using a focal plane array detector has been used to study atherosclerotic arteries with a spatial resolution of 3-4 µm, i.e., at a level that is comparable with cellular dimensions. Such high spatial resolution is made possible using a micro-attenuated total reflection (ATR) germanium objective with a high refractive index and therefore high numerical aperture. This micro-ATR approach has enabled small structures within the vessel wall to be imaged for the first time by FTIR. Structures observed include the elastic lamellae of the tunica media and a heterogeneous distribution of small clusters of cholesterol esters within an atherosclerotic lesion, which may correspond to foam cells. A macro-ATR imaging method was also applied, which involves the use of a diamond macro-ATR accessory. This study of atherosclerosis is presented as an illustrative example of the wider potential of these ATR imaging approaches for cardiovascular medicine and biomedical applications. (C) 2004 Wiley Periodicals, Inc.
Abstract:
How can a bridge be built between autonomic computing approaches and parallel computing systems? The work reported in this paper is motivated towards bridging this gap by proposing swarm-array computing, a novel technique to achieve autonomy for distributed parallel computing systems. Among the three proposed approaches, the second, namely 'Intelligent Agents', is the focus of this paper. The task to be executed on parallel computing cores is considered as a swarm of autonomous agents. A task is carried to a computing core by carrier agents and can be seamlessly transferred between cores in the event of a predicted failure, thereby achieving the self-* objectives of autonomic computing. The feasibility of the proposed approach is validated on a multi-agent simulator.
Abstract:
The work reported in this paper proposes 'Intelligent Agents', a swarm-array computing approach focused on applying autonomic computing concepts to parallel computing systems and building reliable systems for space applications. Swarm-array computing is a novel computing approach inspired by swarm robotics, considered as a path to achieve autonomy in parallel computing systems. In the intelligent agent approach, a task to be executed on parallel computing cores is considered as a swarm of autonomous agents. A task is carried to a computing core by carrier agents and can be seamlessly transferred between cores in the event of a predicted failure, thereby achieving the self-* objectives of autonomic computing. The approach is validated on a multi-agent simulator.
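The carrier-agent idea can be sketched as a toy simulation step: a task migrates off a core as soon as a failure is predicted there. The core names, the agent interface, and the migration rule below are invented for illustration; the papers validate the approach on a multi-agent simulator, not with this code:

```python
# Toy sketch of a carrier agent: it holds a task on a core and moves the
# task to another healthy core when a failure is predicted on its own.
# All names and the selection policy are hypothetical.

class CarrierAgent:
    def __init__(self, task, core):
        self.task = task
        self.core = core

    def step(self, predicted_failures, available_cores):
        """If this core is predicted to fail, migrate to a safe core."""
        if self.core in predicted_failures:
            self.core = next(c for c in available_cores
                             if c not in predicted_failures)
        return self.core

agent = CarrierAgent("fft-job", core="core0")
# A failure is predicted on core0, so the agent carries the task away
print(agent.step({"core0"}, ["core0", "core1", "core2"]))  # core1
```

The point of the sketch is the self-* behaviour: migration is triggered by the agent itself from a failure prediction, with no external scheduler intervening.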
Abstract:
How can a bridge be built between autonomic computing approaches and parallel computing systems? How can autonomic computing approaches be extended towards building reliable systems? How can existing technologies be merged to provide a solution for self-managing systems? The work reported in this paper aims to answer these questions by proposing Swarm-Array Computing, a novel technique inspired by swarm robotics and built on the foundations of the autonomic and parallel computing paradigms. Two approaches based on intelligent cores and intelligent agents are proposed to achieve autonomy in parallel computing systems. The feasibility of the proposed approaches is validated on a multi-agent simulator.