100 results for Soft constraints
in CentAUR: Central Archive University of Reading - UK
Abstract:
In 1997, the United Kingdom started the world's first commercial digital terrestrial television service. The system used was the European Digital Video Broadcasting - Terrestrial (DVB-T) standard but, due to technological constraints at the time, the mode chosen was the 2K system - a system that uses 1705 carriers to convey the digital television services through a hostile terrestrial environment. Today, these constraints no longer apply, but in order to maintain backwards compatibility with older set-top boxes, the 2K system is still used. The 2K system has the disadvantage of excluding the possibility of employing a Single Frequency Network (SFN) - something that can help minimise the required bandwidth for television services. This paper demonstrates a computationally inexpensive soft-decision Quadrature Amplitude Modulation technique that can reject multipath interference.
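The abstract does not give the authors' exact demapping metric, but the general idea of soft-decision demapping can be illustrated with the simplest case, Gray-coded QPSK, where each received symbol is converted into per-bit log-likelihood ratios (LLRs) rather than hard decisions. The function name and the noise-variance scaling below are generic textbook conventions, not taken from the paper:

```python
import numpy as np

def qpsk_soft_bits(received, noise_var):
    """Soft-decision demapping for Gray-coded QPSK.

    With Gray mapping, the two bit LLRs separate into the real and
    imaginary parts of the received symbol, scaled by the channel
    reliability 2 / noise_var. Positive LLR means bit value 0 is more
    likely under the sign convention used here (an assumption).
    """
    scale = 2.0 / noise_var
    llr_b0 = scale * received.real  # LLR for the in-phase bit
    llr_b1 = scale * received.imag  # LLR for the quadrature bit
    return np.stack([llr_b0, llr_b1], axis=-1)

# Toy usage: two noiseless symbols in opposite quadrants give
# opposite-signed soft bits of equal magnitude.
symbols = np.array([(1 + 1j) / np.sqrt(2), (-1 - 1j) / np.sqrt(2)])
llrs = qpsk_soft_bits(symbols, noise_var=0.5)
```

A soft-decision receiver of this kind is cheap because the per-bit metrics are linear in the received sample; downstream decoding can then weight each bit by its reliability, which is what makes rejecting multipath-corrupted carriers possible.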
Abstract:
In this paper extensions to an existing tracking algorithm are described. These extensions implement adaptive tracking constraints in the form of regional upper-bound displacements and an adaptive track smoothness constraint. Together, these constraints make the tracking algorithm more flexible than the original algorithm (which used fixed tracking parameters) and provide greater confidence in the tracking results. The result of applying the new algorithm to high-resolution ECMWF reanalysis data is shown as an example of its effectiveness.
Abstract:
The formulation of four-dimensional variational data assimilation allows the incorporation of constraints into the cost function that need only be weakly satisfied. In this paper we investigate the value of imposing conservation properties as weak constraints. Using the example of the two-body problem of celestial mechanics, we compare weak constraints based on conservation laws with a constraint on the background state. We show how the imposition of conservation-based weak constraints changes the nature of the gradient equation. Assimilation experiments demonstrate how this can add extra information to the assimilation process, even when the underlying numerical model is conserving.
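As a generic illustration (the paper's exact formulation is not given in the abstract), a conservation-based weak constraint can be written as an extra penalty term appended to the standard 4D-Var cost function. Here $x_b$, $B$, $H_i$, $R_i$ are the usual background state, background-error covariance, observation operators and observation-error covariances, $c(\cdot)$ is a conserved quantity such as energy or angular momentum, and $\lambda$ is an assumed penalty weight:

```latex
J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{\mathrm{T}} B^{-1} (x_0 - x_b)
       + \tfrac{1}{2}\sum_{i=0}^{N} (H_i x_i - y_i)^{\mathrm{T}} R_i^{-1} (H_i x_i - y_i)
       + \frac{\lambda}{2}\sum_{i=0}^{N} \bigl( c(x_i) - c(x_0) \bigr)^{2}
```

Because the penalty couples every $x_i$ back to $x_0$ through $c$, its contribution to $\nabla_{x_0} J$ involves the gradient of $c$ along the whole trajectory, which is one way the gradient equation changes character relative to a background-only constraint.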
Abstract:
Recent attempts to problematize archaeological fieldwork have been concerned with excavation at the expense of surface survey, and with questions of procedure more than interpretations of the past. In fact, these two kinds of fieldwork offer quite different possibilities and suffer from different constraints. Thought must be given to the ways in which they can be combined if they are to make a real contribution to social archaeology. The argument is illustrated by a project carried out at a megalithic cemetery in Scotland.
Abstract:
Elucidating the controls on the location and vigor of ice streams is crucial to understanding the processes that lead to fast disintegration of ice flows and ice sheets. In the former North American Laurentide ice sheet, ice stream occurrence appears to have been governed by topographic troughs or areas of soft-sediment geology. This paper reports robust evidence of a major paleo-ice stream over the northwestern Canadian Shield, an area previously assumed to be incompatible with fast ice flow because of the low relief and relatively hard bedrock. A coherent pattern of subglacial bedforms (drumlins and megascale glacial lineations) demarcates the ice stream flow set, which exhibits a convergent onset zone, a narrow main trunk with abrupt lateral margins, and a lobate terminus. Variations in bedform elongation ratio within the flow set match theoretical expectations of ice velocity. In the center of the ice stream, extremely parallel megascale glacial lineations tens of kilometers long, with elongation ratios in excess of 40:1, attest to a single episode of rapid ice flow. We conclude that while bed properties are likely to be influential in determining the occurrence and vigor of ice streams, contrary to established views, widespread soft-bed geology is not an essential requirement for those ice streams without topographic control. We speculate that the ice stream acted as a release valve on ice-sheet mass balance and was initiated by the presence of a proglacial lake that destabilized the ice-sheet margin and propagated fast ice flow through a series of thermomechanical feedbacks involving ice flow and temperature.
Abstract:
This paper investigates the impact of aerosol forcing uncertainty on the robustness of estimates of the twentieth-century warming attributable to anthropogenic greenhouse gas emissions. Attribution analyses are carried out on three coupled climate models with very different sensitivities and aerosol forcings. The Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3), the Parallel Climate Model (PCM), and the GFDL R30 model all provide good simulations of twentieth-century global mean temperature changes when they include both anthropogenic and natural forcings. Such good agreement could result from a fortuitous cancellation of errors, for example, by balancing too much (or too little) greenhouse warming with too much (or too little) aerosol cooling. Despite a very large uncertainty in estimates of the possible range of sulfate aerosol forcing obtained from measurement campaigns, results show that the spatial and temporal nature of observed twentieth-century temperature change constrains the component of past warming attributable to anthropogenic greenhouse gases to be significantly greater (at the 5% level) than the observed warming over the twentieth century. The cooling effects of aerosols are detected in all three models. Both spatial and temporal aspects of observed temperature change are responsible for constraining the relative roles of greenhouse warming and sulfate cooling over the twentieth century, because there are distinctive temporal structures in differential warming rates between the hemispheres, between land and ocean, and between mid- and low latitudes. As a result, consistent estimates of warming attributable to greenhouse gas emissions are obtained from all three models, and predictions are relatively robust to the use of more or less sensitive models. The transient climate response following a 1% per year increase in CO2 is estimated to lie between 2.2 and 4 K per century (5-95 percentiles).
Abstract:
In this work a new method for clustering and building a topographic representation of a bacteria taxonomy is presented. The method is based on the analysis of stable parts of the genome, the so-called "housekeeping genes". The proposed method generates topographic maps of the bacteria taxonomy, where relations among different type strains can be visually inspected and verified. Two well-known DNA alignment algorithms are applied to the genomic sequences. Topographic maps are optimized to represent the similarity among the sequences according to their evolutionary distances. The experimental analysis is carried out on 147 type strains of the Gammaproteobacteria class by means of the 16S rRNA housekeeping gene. Complete sequences of the gene have been retrieved from the NCBI public database. In the experimental tests the maps show clusters of homologous type strains and present some singular cases potentially due to incorrect classification or erroneous annotations in the database.
Abstract:
An Orthogonal Frequency Division Multiplexing (OFDM) communication system with a transmitter and a receiver. The transmitter is arranged to transmit channel estimation sequences on each of a plurality of band groups, or bands, and to transmit data on each of the band groups or bands. The receiver is arranged to receive the channel estimation sequences for each band group or band, to calculate channel state information from each of the channel estimation sequences transmitted on that band group or band, and to form an average channel state information. The receiver receives the transmitted data, transforms the received data into the frequency domain, equalizes the received data using the channel state information, demaps the equalized data to reconstruct the received data as soft bits, and modifies the soft bits using the averaged channel state information.
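The patent abstract does not specify the equalizer or the exact soft-bit modification, so the sketch below makes two common assumptions: zero-forcing equalization, and weighting each soft bit by the squared magnitude of the averaged channel estimate as a reliability measure. The function name and demapper interface are illustrative only:

```python
import numpy as np

def receive(data_freq, channel_estimates, soft_demap):
    """Illustrative OFDM receiver chain following the abstract's steps.

    data_freq:         received data already transformed to the frequency domain
    channel_estimates: per-band-group channel estimates (complex arrays)
    soft_demap:        callable mapping equalized symbols to soft bits
    """
    h_avg = np.mean(channel_estimates, axis=0)  # average CSI across band groups
    equalized = data_freq / h_avg               # zero-forcing equalization (assumed)
    soft = soft_demap(equalized)                # demap to soft bits
    return soft * np.abs(h_avg) ** 2            # modify soft bits using averaged CSI (assumed weighting)

# Toy usage: a flat channel of gain 2 seen on two band groups, one data
# sample, and a trivial demapper that takes the real part as the soft bit.
est = [np.array([2.0 + 0j]), np.array([2.0 + 0j])]
soft = receive(np.array([2.0 + 0j]), est, lambda z: z.real)
```

Averaging the per-band estimates before weighting means a deep fade on one band group does not zero out the reliability of bits carried on the others, which is the practical point of forming an average channel state information.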
Abstract:
A rapid capillary electrophoresis method was developed for the simultaneous determination of artificial sweeteners, preservatives and colours used as additives in carbonated soft drinks. Resolution between all additives occurring together in soft drinks was successfully achieved within a 15-min run time by employing the micellar electrokinetic chromatography mode with a 20 mM carbonate buffer at pH 9.5 as the aqueous phase and 62 mM sodium dodecyl sulfate as the micellar phase. By using a diode-array detector to monitor the UV-visible range (190-600 nm), the identity of sample components, suggested by migration time, could be confirmed by spectral matching relative to standards.
Abstract:
Development policies in the pastoral areas of Africa assume that pastoralists are poor. Using the Afar pastoralists of Ethiopia as the focus of research, this article challenges this depiction of pastoralism by exploring pastoral livelihood goals and traditional strategies for managing risk. Investment in social institutions to minimise the risk of outright destitution, sometimes at the cost of increased poverty, and significant manipulation of local markets enable the Afar to exploit a highly uncertain and marginal environment. Improved development assistance and enhanced targeting of the truly vulnerable within pastoral societies demand an acceptance that pastoral poverty is neither uniform nor universal.