988 results for successive linearization


Relevance: 20.00%

Abstract:

Agricultural soils are the dominant contributor to increases in atmospheric nitrous oxide (N2O). Few studies have investigated the natural N and O isotopic composition of soil N2O. We collected soil gas samples using horizontal sampling tubes installed at successive depths under five contrasting agricultural crops (e.g., unamended alfalfa, fertilized cereal), together with tropospheric air samples. Mean δ15N and δ18O values of soil N2O ranged from -28.0 to +8.9‰ and from +29.0 to +53.6‰, respectively. The mean δ15N and δ18O values of tropospheric N2O were +4.6 ± 0.7‰ and +48.3 ± 0.2‰, respectively. In general, δ values were lowest at depth and negatively correlated with soil [N2O], and δ15N was positively correlated with δ18O for every treatment on all sampling dates. N2O from the different agricultural treatments had distinct δ15N and δ18O values that varied among sampling dates. Fertilized treatments had soil N2O with low δ values, but the unamended alfalfa yielded the N2O with the lowest δ values. Diffusion was not the predominant process controlling N2O concentration profiles. Based on the isotopic and concentration data, it appears that soil N2O was consumed as it moved from deeper to shallower soil layers. To better assess the main process(es) controlling N2O within a soil profile, we propose a conceptual model that integrates data on net N2O production or consumption with isotopic data. The direct local impact of agricultural N2O on the isotopic composition of tropospheric N2O was recorded as a shift toward lower δ values of locally measured tropospheric N2O on a day with very high soil N2O emissions.
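
For readers unfamiliar with the notation, the per-mil δ values reported above are defined relative to an international standard isotope ratio. The short sketch below is illustrative only: the standard ratios are commonly cited reference values, not numbers taken from this study.

    # Illustrative sketch: conventional delta notation for isotope ratios,
    # delta = (R_sample / R_standard - 1) * 1000 (in per mil).
    # The standard ratios below are textbook reference values used here as
    # assumptions for illustration, not values from the study above.
    R_STD_15N = 0.0036765    # 15N/14N of atmospheric N2 (AIR standard)
    R_STD_18O = 0.0020052    # 18O/16O of Vienna Standard Mean Ocean Water (VSMOW)

    def delta_permil(r_sample, r_standard):
        """Delta value (in per mil) of a measured isotope ratio."""
        return (r_sample / r_standard - 1.0) * 1000.0

    # A sample with 15N/14N ~ 0.003575 gives delta15N ~ -27.6 per mil,
    # comparable to the most 15N-depleted soil N2O reported above.
    print(delta_permil(0.003575, R_STD_15N))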

Relevance: 20.00%

Abstract:

In this paper, we address the problem of designing multirate codes for a multiple-input multiple-output (MIMO) system in which each antenna is encoded independently and the receiver is restricted to successive decoding with interference cancellation. Furthermore, it is assumed that the receiver knows the instantaneous fading channel states but the transmitter does not. It is well known that, in theory, minimum mean-square error (MMSE) based successive decoding of multiple-access (multi-user) and MIMO channels achieves the total channel capacity. For this scheme to perform optimally, however, the optimal rate of each antenna (the per-antenna rate) must be known at the transmitter. We show that, in time-varying Rayleigh MIMO channel environments, the optimal per-antenna rates can be estimated at the transmitter using only the statistical characteristics of the MIMO channel. Based on these results, multirate codes are designed using punctured turbo codes for a horizontally encoded MIMO system. Simulation results show performance within about 1-2 dB of the MIMO channel capacity.
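
As a rough illustration of the per-antenna rates involved (not the authors' code; the antenna counts, SNR, decoding order, and power allocation below are arbitrary assumptions), the sketch uses the chain rule of mutual information to compute the average rate supported by each stream under MMSE-based successive decoding in an i.i.d. Rayleigh-fading channel. Because the average is taken over channel realizations, the resulting rates depend only on the channel statistics, which is the property exploited in the paper.

    # Monte Carlo sketch of average per-antenna rates under MMSE successive
    # decoding for an i.i.d. Rayleigh MIMO channel (illustrative assumptions:
    # 4x4 antennas, 10 dB SNR, fixed decoding order, equal power per antenna).
    import numpy as np

    def logdet2(m):
        # log2 det of a Hermitian positive-definite matrix
        return np.linalg.slogdet(m)[1] / np.log(2.0)

    def per_antenna_rates(nt=4, nr=4, snr_db=10.0, trials=2000, seed=0):
        rng = np.random.default_rng(seed)
        p = 10.0 ** (snr_db / 10.0) / nt          # power per transmit antenna
        eye = np.eye(nr)
        rates = np.zeros(nt)
        for _ in range(trials):
            h = (rng.standard_normal((nr, nt)) +
                 1j * rng.standard_normal((nr, nt))) / np.sqrt(2.0)
            for k in range(nt):
                # Chain rule: rate of stream k = capacity with streams k..nt-1
                # active minus capacity with streams k+1..nt-1 active
                # (streams 0..k-1 have already been decoded and cancelled).
                tail, rest = h[:, k:], h[:, k + 1:]
                r_tail = logdet2(eye + p * tail @ tail.conj().T)
                r_rest = logdet2(eye + p * rest @ rest.conj().T) if rest.size else 0.0
                rates[k] += r_tail - r_rest
        return rates / trials

    # The averaged rates (in bits/channel use) sum to the ergodic capacity and
    # could serve as per-antenna code rates chosen from channel statistics alone.
    print(per_antenna_rates())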

Relevance: 20.00%

Abstract:

The agile model of software development has been mainstream for several years, and is now in a phase where its principles and practices are maturing. The purpose of this paper is to describe the results of an industry survey aimed at understanding how this maturation is progressing. The survey was conducted across 40 software development companies in Northern Ireland at the beginning of 2012. The paper describes the design of the survey and examines maturity by comparing the results obtained in 2012 with those from a study of agile adoption in the same region in 2010. Both surveys aimed to achieve comprehensive coverage of a single area rather than rely on a voluntary sample. The main outcome of the work is a collection of 'insights' into the nature and practice of agile development, the two main ones of which are reported in this paper.

Relevance: 20.00%

Abstract:

Polar codes are one of the most recent advances in coding theory and have attracted significant interest. While they are provably capacity-achieving over various channels, they have seen limited practical application. Unfortunately, the sequential nature of successive cancellation based decoders hinders fine-grained adaptation of the decoding complexity to design constraints and operating conditions. In this paper, we propose a systematic method for enabling complexity-performance trade-offs by constructing polar codes through an optimization problem that minimizes complexity under a suitably defined mutual information based performance constraint. Moreover, a low-complexity greedy algorithm is proposed to solve the optimization problem efficiently for very large code lengths.
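
The construction itself is specific to the paper and is not reproduced here, but the sketch below illustrates the kind of ingredients such an optimization works with: bit-channel reliabilities computed via the Bhattacharyya-parameter recursion for a binary erasure channel (a common proxy channel), and a greedy selection of information bits against a mutual-information target. The erasure probability, block length, and target are arbitrary placeholders.

    # Generic illustration (not the paper's algorithm): Bhattacharyya-based
    # bit-channel reliabilities for a BEC proxy and a greedy selection of the
    # information set against a mutual-information target.
    import numpy as np

    def bec_bhattacharyya(n, eps):
        """Erasure probabilities of the n polarized bit channels of a BEC(eps)."""
        z = np.array([eps])
        for _ in range(int(np.log2(n))):
            z = np.column_stack((2 * z - z * z, z * z)).ravel()  # (worse, better) pairs
        return z

    def greedy_information_set(n, eps, mi_target):
        """Greedily add the most reliable bit channels until their summed
        mutual information (1 - Z for the BEC) reaches mi_target bits."""
        z = bec_bhattacharyya(n, eps)
        order = np.argsort(z)                 # most reliable channels first
        info, mi = [], 0.0
        for i in order:
            if mi >= mi_target:
                break
            info.append(int(i))
            mi += 1.0 - z[i]
        return sorted(info), mi

    # e.g. a length-1024 code aiming to carry ~512 bits of mutual information
    info_set, mi = greedy_information_set(1024, 0.4, 512.0)
    print(len(info_set), mi)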

Relevance: 20.00%

Abstract:

In this work, we achieve the following. 1. We review the main methods used to compute the connection and linearization coefficients between orthogonal polynomials of a continuous variable and, using a new approach, solve the duplication problem for these polynomial families. 2. We review the main methods used to compute the connection and linearization coefficients of orthogonal polynomials of a discrete variable, and solve the duplication and linearization problems for all orthogonal polynomials of a discrete variable. 3. We propose a method to generate the connection, linearization and duplication coefficients for q-orthogonal polynomials. 4. We propose a unified method to obtain these coefficients in a generic way for orthogonal polynomials on quadratic and q-quadratic lattices. Our algorithmic approach to computing linearization, connection and duplication coefficients is based on the one used by Koepf and Schmersau and on the NaViMa algorithm. Our main technique is to use explicit formulas for the structural identities of classical orthogonal polynomial systems, and the results are obtained by means of computer algebra. The major algorithmic tools for this development are Zeilberger's algorithm, the q-Zeilberger algorithm, the Petkovšek-van Hoeij algorithm, the q-Petkovšek-van Hoeij algorithm, and Algorithm 2.2 (p. 20) of Koepf's book "Hypergeometric Summation", together with its q-analogue.
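
As a concrete, brute-force illustration of what a linearization coefficient is (this is a direct projection using orthogonality, not the Zeilberger-type algorithmic machinery described in the work), the sketch below expands a product of two Legendre polynomials back into the Legendre basis:

    # Brute-force illustration of the linearization problem
    # P_m(x) * P_n(x) = sum_k c_k P_k(x) for Legendre polynomials,
    # using the orthogonality relation int_{-1}^{1} P_j P_k dx = 2/(2k+1) delta_{jk}.
    import sympy as sp

    x = sp.symbols('x')

    def legendre_linearization(m, n):
        prod = sp.legendre(m, x) * sp.legendre(n, x)
        coeffs = {}
        for k in range(m + n + 1):
            c = sp.Rational(2 * k + 1, 2) * sp.integrate(prod * sp.legendre(k, x), (x, -1, 1))
            if c != 0:
                coeffs[k] = c
        return coeffs

    print(legendre_linearization(2, 3))
    # -> {1: 9/35, 3: 4/15, 5: 10/21}, i.e. P2*P3 = 9/35 P1 + 4/15 P3 + 10/21 P5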

Relevance: 20.00%

Abstract:

The anxiolytic properties of ethanol (1 g/kg, 15% solution, i.p.) were studied in two experiments with rats involving incentive downshifts from a 32% to a 4% sucrose solution. In Experiment 1, alcohol administration before the downshift from 32% to 4% sucrose prevented the development of consummatory suppression (consummatory successive negative contrast, cSNC). In Experiment 2, ethanol prevented the attenuating effect of partial reinforcement (a random sequence of 32% sucrose and nothing) on cSNC, causing a retardation of recovery from contrast. These effects of ethanol on cSNC are analogous to those described for the benzodiazepine anxiolytic chlordiazepoxide, suggesting that at least some of ethanol's anxiolytic effects are mediated by the same mechanisms.

Relevance: 20.00%

Abstract:

Bayesian inference has been used to determine rigorous estimates of hydroxyl radical concentrations ([OH]) and air mass dilution rates (K), averaged following air masses between linked observations of nonmethane hydrocarbons (NMHCs) spanning the North Atlantic during the Intercontinental Transport and Chemical Transformation (ITCT)-Lagrangian-2K4 experiment. The Bayesian technique obtains a refined (posterior) distribution of a parameter given data related to the parameter through a model and prior beliefs about the parameter distribution. Here, the model describes hydrocarbon loss through OH reaction and mixing with a background concentration at rate K. The Lagrangian experiment provides direct observations of hydrocarbons at two time points, removing assumptions regarding composition or sources upstream of a single observation. The estimates are sharpened by using many hydrocarbons with different reactivities and by accounting for their variability and measurement uncertainty. A novel technique is used to construct prior background distributions of many species, described by the variation of a single parameter. This exploits the high correlation of the species, which are related through the first principal component of many NMHC samples. The Bayesian method obtains posterior estimates of [OH], K, and the background parameter following each air mass. Median [OH] values are typically between 0.5 and 2.0 × 10⁶ molecules cm⁻³, but are elevated to between 2.5 and 3.5 × 10⁶ molecules cm⁻³ in low-level pollution. A comparison of estimates from absolute NMHC concentrations with estimates from NMHC ratios assuming zero background (the "photochemical clock" method) shows similar distributions but reveals a systematic high bias in the estimates from ratios. Estimates of K are ∼0.1 day⁻¹ but show more sensitivity to the assumed prior distribution.
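
As a toy illustration of this kind of inference (a sketch only: the model form follows the abstract's description of loss through OH reaction plus mixing toward a background at rate K, while the rate constants, concentrations, measurement errors, priors, and grid are invented placeholders, not the paper's), the code below evaluates a grid posterior for [OH] and K from one linked pair of NMHC observations:

    # Toy sketch (assumptions throughout; not the paper's implementation):
    # grid posterior for [OH] and the dilution rate K given one linked pair of
    # NMHC observations, using dC/dt = -k_OH*[OH]*C - K*(C - C_b).
    import numpy as np

    def forward(c0, cb, k_oh, oh, kdil, dt):
        """Analytic solution of dC/dt = -k_oh*oh*C - kdil*(C - cb) after time dt."""
        lam = k_oh * oh + kdil
        c_inf = kdil * cb / lam              # asymptotic mixing/loss balance
        return c_inf + (c0 - c_inf) * np.exp(-lam * dt)

    def grid_posterior(c0, c1, cb, k_oh, sigma, dt, oh_grid, k_grid):
        """Unnormalised Gaussian-likelihood posterior on an (OH, K) grid
        (flat prior assumed for simplicity)."""
        logp = np.empty((oh_grid.size, k_grid.size))
        for i, oh in enumerate(oh_grid):
            for j, kdil in enumerate(k_grid):
                pred = forward(c0, cb, k_oh, oh, kdil, dt)
                logp[i, j] = -0.5 * np.sum(((c1 - pred) / sigma) ** 2)
        p = np.exp(logp - logp.max())
        return p / p.sum()

    # Placeholder inputs: three NMHCs with approximate OH rate constants
    # (cm^3 molecule^-1 s^-1), upstream (c0) and downstream (c1) mixing ratios,
    # assumed backgrounds (cb), 5% measurement error and a two-day travel time.
    k_oh  = np.array([2.4e-13, 1.1e-12, 2.4e-12])   # ~ethane, propane, n-butane
    c0    = np.array([1800.0, 600.0, 200.0])        # pptv (invented)
    cb    = np.array([1400.0, 300.0, 60.0])         # pptv (invented)
    sigma = 0.05 * c0
    dt    = 2 * 86400.0
    c1    = forward(c0, cb, k_oh, 1.0e6, 0.1 / 86400.0, dt)  # synthetic "observation"

    oh_grid = np.linspace(0.1e6, 4.0e6, 80)
    k_grid  = np.linspace(0.0, 0.5, 50) / 86400.0
    post = grid_posterior(c0, c1, cb, k_oh, sigma, dt, oh_grid, k_grid)
    print(oh_grid[post.sum(axis=1).argmax()])       # recovers ~1e6 molecules cm^-3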

Relevance: 20.00%

Abstract:

Children with English as a second language (L2) who have had 18 months of exposure or less exhibit difficulties in tense marking, a marker of language impairment in English, similar to those of children with Specific Language Impairment. This paper examines whether L2 children with longer exposure converge with their monolingual peers in the production of tense marking. Thirty-eight Turkish-English L2 children with a mean age of 7;8 and 33 monolingual age-matched controls completed the screening test of the Test of Early Grammatical Impairment (TEGI). The L2 children as a group were as accurate as the controls in the production of -ed, but performed significantly lower than the controls in the production of third person -s. Age and years of exposure (YoE) affected the children's performance. The highest age-expected performance on the TEGI was attested in eight- and nine-year-old children who had 4-6 YoE. L1 and L2 children performed better on regular than on irregular verbs, but the L2 children overregularized more than the L1 children and were less sensitive to the phonological properties of the verbs. The results show that tense marking and the screening test of the TEGI may be promising for differential diagnosis in eight- and nine-year-old L2 children with at least four YoE.