923 results for Space-time block codes
Abstract:
The periodic domains formed by block copolymer melts have been heralded as potential scaffolds for arranging nanoparticles in 3D space, provided we can control the positioning of the particles. Recent experiments have located particles at the domain interfaces by grafting mixed brushes to their surfaces. Here the underlying mechanism, which involves the transformation into Janus particles, is investigated with self-consistent field theory using a new multi-coordinate-system algorithm.
Abstract:
Infrared multilayer interference filters have been used extensively in satellite radiometers for about 15 years. Filters manufactured by the University of Reading have been used in Nimbus 5, 6, and 7, TIROS N, and the Pioneer Venus orbiter. The ability of the filters to withstand the space environment in these applications is critical; if degradation takes place, the effects would range from a worsening of signal-to-noise performance to complete system failure. An experiment on the LDEF (Long Duration Exposure Facility) will enable the filters, for the first time, to be subjected to authoritative spectral measurements following space exposure, to ascertain their suitability for spacecraft use and to permit an understanding of degradation mechanisms.
Abstract:
Residual stress having been further reduced, selected infrared coatings composed of thin films of PbTe/ZnS (or ZnSe) can now be made which comply with the durability requirements of MIL-48616 whilst retaining transparency. The improved durability is due to the sequence: (i) controlled deposition, followed by (ii) immediate exposure to air, followed by (iii) annealing in vacuo to relieve stress. (At the time of writing we assume the empirical procedure "exposure to air/annealing in vacuo" acts to relieve the inherent stresses of deposition.) As part of their testing, representative sample filters prepared by this procedure are being assembled for the Shuttle's first Long Duration Exposure Facility mission (to be placed in Earth orbit for a considerable period and then recovered for analysis). The sample filters comprise various narrowband designs to permit deduction of the constituent thin-film optical properties. The Reading assembly also contains representative samples of the infrared crystals, glasses, thin-film absorbers and bulk absorbers, and samples of shorter-wavelength filters prepared similarly but made with Ge/SiO. Findings on durability and transparency after exposure will be reported.
Abstract:
A polystyrene-block-poly(ferrocenylethylmethylsilane) diblock copolymer, displaying a double-gyroid morphology when self-assembled in the solid state, has been prepared with a PFEMS volume fraction φ(PFEMS) = 0.39 and a total molecular weight of 64 000 Da by sequential living anionic polymerisation. A block copolymer with a metal-containing block, with iron and silicon in the main chain, was selected for its plasma etch resistance compared to the organic block. Self-assembly of the diblock copolymer in the bulk showed a stable double-gyroid morphology, as characterised by TEM. SAXS confirmed that the structure belonged to the Ia3d space group.
Abstract:
The Earth-directed coronal mass ejection (CME) of 8 April 2010 provided an opportunity for space weather predictions from both established and developmental techniques to be made from near-real-time data received from the SOHO and STEREO spacecraft; the STEREO spacecraft provide a unique view of Earth-directed events from outside the Sun-Earth line. Although the near-real-time data transmitted by the STEREO Space Weather Beacon are significantly poorer in quality than the subsequently downlinked science data, their use has the advantage that near-real-time analysis is possible, allowing actual forecasts to be made. The fact that such forecasts cannot be biased by any prior knowledge of the actual arrival time at Earth provides an opportunity for an unbiased comparison between several established and developmental forecasting techniques. We conclude that, for forecasts based on the STEREO coronagraph data, it is important to take account of the subsequent acceleration or deceleration of each CME through interaction with the solar wind, while predictions based on measurements of CMEs made by the STEREO Heliospheric Imagers would benefit from higher temporal and spatial resolution. Space weather forecasting tools must work with near-real-time data; such data, when provided by science missions, are usually highly compressed and/or reduced in temporal/spatial resolution and may also have significant gaps in coverage, making such forecasts more challenging.
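The role of solar-wind interaction in CME arrival forecasting can be illustrated with a minimal drag-based propagation sketch. This is not the forecasting technique of the paper: the model form (quadratic drag toward the ambient wind speed), the parameter values, and the function name are all assumptions for illustration only.

```python
# Illustrative drag-based CME propagation sketch (assumed model, not the
# paper's method): the CME speed relaxes toward the ambient solar-wind
# speed w through a quadratic aerodynamic-drag term.

def cme_arrival(r0_km, v0_kms, w_kms=400.0, gamma=1e-7, r_target_km=1.496e8, dt=60.0):
    """Integrate dv/dt = -gamma*(v - w)*|v - w| until r reaches r_target.

    gamma is the drag parameter (per km); returns (hours to arrival,
    impact speed in km/s)."""
    r, v, t = r0_km, v0_kms, 0.0
    while r < r_target_km:
        a = -gamma * (v - w_kms) * abs(v - w_kms)  # drag acceleration, km/s^2
        v += a * dt
        r += v * dt
        t += dt
    return t / 3600.0, v

# A fast CME launched at 20 solar radii decelerates toward the wind speed,
# so neglecting drag would predict too early an arrival.
hours, v_arr = cme_arrival(r0_km=20 * 6.96e5, v0_kms=800.0)
```

Because the CME is faster than the ambient wind here, the drag term only decelerates it; a slow CME (v0 below w) would instead be accelerated, which is the asymmetry the coronagraph-based forecasts must account for.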
Abstract:
A quasi-optical de-embedding technique for characterizing waveguides is demonstrated using wideband time-resolved terahertz spectroscopy. A transfer function representation is adopted for the description of the signal at the input and output ports of the waveguides. The time-domain responses were discretised, and the waveguide transfer function was obtained through a parametric approach in the z-domain after describing the system with an ARX model as well as with a state-space model. Prior to the identification procedure, filtering was performed in the wavelet domain to minimize signal distortion and the noise propagating into the ARX and subspace models. The model identification procedure requires isolation of the phase delay in the structure; the time-domain signatures must therefore first be aligned with respect to each other before they are compared. An initial estimate of the number of propagating modes was provided by comparing the measured phase delay in the structure with theoretical calculations that take into account the physical dimensions of the waveguide. Models derived from measurements of THz transients in a precision WR-8 waveguide adjustable short will be presented.
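The ARX step of such an identification can be sketched as a linear least-squares fit. This is a generic first-order ARX example on synthetic data, not the paper's actual model order or signals; all names and coefficients are invented for the demonstration.

```python
import numpy as np

# Generic ARX identification sketch (synthetic data; the paper's model
# orders and transients are not reproduced): fit
#   y[n] = -a*y[n-1] + b0*u[n] + b1*u[n-1]
# by least squares, where u stands in for the input-port transient and
# y for the output-port transient.

rng = np.random.default_rng(0)
u = rng.standard_normal(500)               # stand-in input transient
a_true, b0_true, b1_true = -0.7, 0.5, 0.2  # "true" system for the demo
y = np.zeros_like(u)
for n in range(1, len(u)):
    y[n] = -a_true * y[n - 1] + b0_true * u[n] + b1_true * u[n - 1]

# Regressor columns: [-y[n-1], u[n], u[n-1]]
Phi = np.column_stack([-y[:-1], u[1:], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b0_hat, b1_hat = theta
# The z-domain transfer function is then H(z) = (b0 + b1*z**-1)/(1 + a*z**-1)
```

In practice the wavelet-domain denoising and time alignment described above would precede this fit, since least squares propagates any residual noise directly into the estimated coefficients.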
Abstract:
This paper concerns the switching on of two-dimensional time-harmonic scalar waves. We first review the switch-on problem for a point source in free space, then proceed to analyse the analogous problem for the diffraction of a plane wave by a half-line (the ‘Sommerfeld problem’), determining in both cases the conditions under which the field is well-approximated by the solution of the corresponding frequency domain problem. In both cases the rate of convergence to the frequency domain solution is found to be dependent on the strength of the singularity on the leading wavefront. In the case of plane wave diffraction at grazing incidence the frequency domain solution is immediately attained along the shadow boundary after the arrival of the leading wavefront. The case of non-grazing incidence is also considered.
Abstract:
A near-real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working during both night-time and day-time even if cloud is present, could be a useful tool for operational flood relief management and flood forecasting. The paper describes an automatic algorithm using high-resolution Synthetic Aperture Radar (SAR) satellite data that assumes that high-resolution topographic height data are available for at least the urban areas of the scene, so that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, and in urban areas with reasonable accuracy.
Abstract:
The Earth's climate is undoubtedly changing; however, the time scale, consequences and causal attribution remain the subject of significant debate and uncertainty. Detection of subtle indicators from a background of natural variability requires measurements over a time base of decades. This places severe demands on the instrumentation used, requiring measurements of sufficient accuracy and sensitivity that reliable judgements can be made decades apart. The International System of Units (SI) and the network of National Metrology Institutes were developed to address such requirements. However, ensuring and maintaining SI traceability of sufficient accuracy in instruments orbiting the Earth presents a significant new challenge to the metrology community. This paper highlights some key measurands and applications driving the uncertainty demands of the climate community in the solar-reflective domain, e.g. solar irradiances and reflectances/radiances of the Earth. It discusses how meeting these uncertainty demands would facilitate significant improvement in the forecasting abilities of climate models. After discussing the current state of the art, it describes a new satellite mission, called TRUTHS, which enables, for the first time, high-accuracy SI traceability to be established in orbit. The direct use of a 'primary standard' and replication of the terrestrial traceability chain extends the SI into space, in effect realizing a 'metrology laboratory in space'.
Abstract:
The recent solar minimum was the longest and deepest of the space age, with the lowest average sunspot numbers for nearly a century. The Sun appears to be exiting a grand solar maximum (GSM) of activity which has persisted throughout the space age, and is headed into a significantly quieter period. Indeed, initial observations of solar cycle 24 (SC24) continue to show a relatively low heliospheric magnetic field strength and sunspot number (R), despite the average latitude of sunspots and the inclination of the heliospheric current sheet showing that the rise to solar maximum is well underway. We extrapolate the available SC24 observations forward in time by assuming R will continue to follow a similar form to previous cycles, despite the end of the GSM, and predict a very weak cycle 24, with R peaking at ∼65–75 around the middle/end of 2012. Similarly, we estimate the heliospheric magnetic field strength will peak around 6 nT. We estimate that average galactic cosmic ray fluxes above 1 GV rigidity will be ∼10% higher in SC24 than in SC23, and that the probability of a large SEP event during this cycle is 0.8, compared to 0.5 for SC23. Comparison of the SC24 R estimates with previous ends of GSMs inferred from 9300 years of cosmogenic isotope data places the current evolution of the Sun and heliosphere in the lowest 5% of cases, suggesting Maunder Minimum conditions are likely within the next 40 years.
Abstract:
The coarse spacing of automatic rain gauges complicates near-real-time spatial analyses of precipitation. We test the possibility of improving such analyses by considering, in addition to the in situ measurements, the spatial covariance structure inferred from past observations with a denser network. To this end, a statistical reconstruction technique, reduced space optimal interpolation (RSOI), is applied over Switzerland, a region of complex topography. RSOI consists of two main parts. First, principal component analysis (PCA) is applied to obtain a reduced space representation of gridded high-resolution precipitation fields available for a multiyear calibration period in the past. Second, sparse real-time rain gauge observations are used to estimate the principal component scores and to reconstruct the precipitation field. In this way, climatological information at higher resolution than the near-real-time measurements is incorporated into the spatial analysis. PCA is found to efficiently reduce the dimensionality of the calibration fields, and RSOI is successful despite the difficulties associated with the statistical distribution of daily precipitation (skewness, dry days). Examples and a systematic evaluation show substantial added value over a simple interpolation technique that uses near-real-time observations only. The benefit is particularly strong for larger-scale precipitation and prominent topographic effects. Small-scale precipitation features are reconstructed with a skill comparable to that of the simple technique. Stratifying the reconstruction by weather type yields little added skill. Apart from application in near real time, RSOI may also be valuable for enhancing instrumental precipitation analyses for the historic past, when direct observations were sparse.
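The two-part RSOI procedure can be sketched in a few lines: PCA of a dense calibration archive yields spatial patterns (EOFs), and sparse station observations then constrain the leading principal component scores. All dimensions, station locations, and the synthetic "archive" below are invented for illustration; the real method works on gridded Swiss precipitation fields.

```python
import numpy as np

# Toy RSOI sketch (all data synthetic): PCA of a dense calibration archive,
# then reconstruction of a new field from a handful of point observations.

rng = np.random.default_rng(1)
n_days, n_grid, n_stations, k = 300, 100, 8, 3

# Calibration archive built from k spatial patterns plus noise
patterns = rng.standard_normal((k, n_grid))
scores = rng.standard_normal((n_days, k))
archive = scores @ patterns + 0.1 * rng.standard_normal((n_days, n_grid))

# Part 1: PCA via SVD of the anomaly fields -> leading EOFs
mean = archive.mean(axis=0)
_, _, Vt = np.linalg.svd(archive - mean, full_matrices=False)
eofs = Vt[:k]                                  # (k, n_grid) spatial patterns

# Part 2: near-real-time step -- the new field is observed at a few
# "stations" only; estimate PC scores by least squares, then reconstruct.
truth = rng.standard_normal(k) @ patterns      # full field (unknown in practice)
idx = rng.choice(n_grid, n_stations, replace=False)
obs = truth[idx]                               # sparse gauge observations

H = eofs[:, idx].T                             # (n_stations, k) sampling matrix
pc, *_ = np.linalg.lstsq(H, obs - mean[idx], rcond=None)
recon = mean + pc @ eofs                       # reconstructed full field
```

The reconstruction inherits the high-resolution covariance structure of the archive through the EOFs, which is exactly how climatological information enters the near-real-time analysis; the operational method additionally handles the skewed, zero-inflated distribution of daily precipitation.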
Abstract:
The real-time parallel computation of histograms using an array of pipelined cells is proposed and prototyped in this paper, with application to consumer imaging products. The array operates in two modes: histogram computation and histogram reading. The proposed parallel computation method does not use any memory blocks. The resulting histogram bins can be stored in an external memory block in a pipelined fashion for subsequent reading or streaming of the results. The array of cells can be tuned to accommodate the required data-path width in a VLSI image processing engine as present in many consumer imaging devices. Synthesis of the architectures presented in this paper on FPGAs is shown to compute the real-time histogram of images streamed at over 36 megapixels at 30 frames/s by processing 1, 2 or 4 pixels per clock cycle in parallel.
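The two-mode, multi-pixel-per-cycle operation can be modelled in software. The sketch below is a behavioural analogue only (the function name and lane scheme are invented, and Python lists stand in for the hardware cells): each lane accumulates a partial histogram, and a separate "read" pass merges and streams out the bins.

```python
# Behavioural sketch of the two-mode cell array (hypothetical software
# analogue): compute mode accumulates per-lane partial histograms;
# read mode merges them into the final bins.

def parallel_histogram(pixels, bins=256, lanes=4):
    # Compute mode: lane i handles every lanes-th pixel, modelling
    # 'lanes' pixels being consumed per clock cycle with no shared memory
    partial = [[0] * bins for _ in range(lanes)]
    for i, px in enumerate(pixels):
        partial[i % lanes][px] += 1
    # Read mode: merge lane-local bins and stream out the result
    return [sum(p[b] for p in partial) for b in range(bins)]

hist = parallel_histogram([0, 1, 1, 255, 255, 255], bins=256, lanes=4)
# hist[0] == 1, hist[1] == 2, hist[255] == 3
```

The key property mirrored here is that the lanes never contend for a shared bin store during computation; merging is deferred to the read phase, which is what allows the hardware version to avoid memory blocks in the array itself.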
Abstract:
To achieve CO2 emissions reductions, the UK Building Regulations require developers of new residential buildings to calculate expected CO2 emissions arising from their energy consumption using a methodology such as the Standard Assessment Procedure (SAP 2005) or, more recently, SAP 2009. SAP encompasses all domestic heat consumption and a limited proportion of the electricity consumption. However, these calculations are rarely verified against real energy consumption and related CO2 emissions. This paper presents the results of an analysis based on weekly heat demand data for more than 200 individual flats. The data were collected from a recently built residential development connected to a district heating network. A methodology for separating out the domestic hot water (DHW) and space heating (SH) demand has been developed, and the measured values are compared to the demand calculated using the SAP 2005 and 2009 methodologies. The analysis also shows the variance in DHW and SH consumption by both flat size and tenure (privately owned or housing association). The evaluation of space heating consumption also includes an estimation of the heating degree day (HDD) base temperature for each block of flats and its comparison to the average base temperature calculated using the SAP 2005 methodology.
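One common way to split metered heat into DHW and SH components is a degree-day regression; the sketch below illustrates that generic approach, not the paper's actual methodology, and the base temperature, function names, and demo numbers are all assumptions.

```python
# Generic degree-day sketch (not the paper's method): regress weekly heat
# demand Q on heating degree days, Q = a*HDD + b. The intercept b then
# approximates the weather-independent DHW load, and a*HDD the space
# heating component.

def degree_days(daily_mean_temps, base=15.5):
    """Heating degree days for a week of daily mean temperatures (deg C)."""
    return sum(max(base - t, 0.0) for t in daily_mean_temps)

def split_dhw_sh(weekly_q, weekly_hdd):
    """Ordinary least squares fit of Q = a*HDD + b; returns (a, b)."""
    n = len(weekly_q)
    mq, mh = sum(weekly_q) / n, sum(weekly_hdd) / n
    a = sum((h - mh) * (q - mq) for h, q in zip(weekly_hdd, weekly_q)) / \
        sum((h - mh) ** 2 for h in weekly_hdd)
    b = mq - a * mh                      # intercept ~ weekly DHW demand
    return a, b

hdd = [0.0, 10.0, 40.0, 80.0]            # invented weekly HDD values
q = [50.0, 70.0, 130.0, 210.0]           # invented demands, exactly 2*HDD + 50
slope, dhw = split_dhw_sh(q, hdd)
# slope == 2.0 (kWh per degree day), dhw == 50.0 (kWh per week)
```

Fitting the same regression over a range of candidate base temperatures and keeping the best-fitting one is also a standard way to estimate the HDD base temperature per block, analogous to the comparison against the SAP 2005 base temperature described above.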
Abstract:
Models of the City of London office market are extended by considering a longer time series of data, covering two cycles, and by explicitly modeling the asymmetric rental response to supply and demand shocks. A long-run structural model linking demand for office space, real rental levels and office-based employment is estimated, and rental adjustment processes are then modeled within an error correction model framework. The adjustment processes are found to be asymmetric, depending both on the direction of the supply and demand shock and on the state of the rental market at the time of the shock. A complete system of equations is estimated: unit shocks produce oscillations, but there is a return to a steady equilibrium state in the long run.
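The asymmetric error-correction mechanism can be illustrated with a stylized one-equation sketch. The adjustment speeds and equilibrium level below are invented, and the paper's full system of equations is not reproduced; this only shows the error-correction term pulling rents back to equilibrium at a direction-dependent rate.

```python
# Stylized asymmetric error-correction sketch (coefficients invented):
# rents adjust toward the long-run equilibrium implied by demand, with a
# different adjustment speed depending on the sign of the disequilibrium.

def ecm_step(rent, rent_eq, lam_up=0.2, lam_down=0.5):
    gap = rent - rent_eq                     # error-correction term
    lam = lam_down if gap > 0 else lam_up    # asymmetric adjustment speed
    return rent - lam * gap

# A unit shock that pushes rents above equilibrium decays geometrically
# back toward the equilibrium level over successive periods.
r = 110.0
for _ in range(10):
    r = ecm_step(r, rent_eq=100.0)
```

With a faster downward speed than upward, positive disequilibria are corrected more quickly than negative ones, which is the kind of direction-dependent behaviour the asymmetric specification is designed to capture.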
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in ensemble space instead of state space. The advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation with deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses the spurious computational mode while avoiding any distortion in the mean value of the function. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
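The RAW filter step itself is compact enough to state directly. The sketch below follows the published form of the filter (Williams 2009): the usual Robert-Asselin displacement d is split between the current and new time levels, with a fraction alpha applied to the current level and alpha - 1 to the new one; the parameter values shown are illustrative defaults, not those used in SPEEDY.

```python
# Robert-Asselin-Williams (RAW) filter step for a leapfrog scheme.
# d is the classical Robert-Asselin displacement; RAW applies alpha*d to
# the current level and (alpha - 1)*d to the new level. alpha = 1 recovers
# the classical RA filter, and alpha = 0.5 conserves the sum of the two
# corrections exactly, avoiding distortion of the mean.

def raw_filter_step(x_prev_f, x_curr, x_next, nu=0.2, alpha=0.53):
    """One RAW update: returns (filtered current level, adjusted new level).

    x_prev_f is the already-filtered value at time n-1; nu is the filter
    coefficient and alpha the Williams parameter."""
    d = 0.5 * nu * (x_prev_f - 2.0 * x_curr + x_next)
    x_curr_f = x_curr + alpha * d             # filtered x at time n
    x_next_adj = x_next + (alpha - 1.0) * d   # adjusted x at time n+1
    return x_curr_f, x_next_adj
```

Since the new time level is also modified (unless alpha = 1), the RAW filter changes the stepping scheme itself rather than acting as pure post-hoc smoothing, which is why verifying that the SPEEDY climatology is unchanged is a meaningful test.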