963 results for Athletics--Ontario--Merritton--History--Sources.


Relevance: 20.00%

Publisher:

Abstract:

We consider the problem of compression of a non-Abelian source. This is motivated by the problem of distributed function computation, where it is known that if one is only interested in computing a function of several sources, one can often improve upon the compression rate required by the Slepian-Wolf bound. Let G be a non-Abelian group having center Z(G). We show here that it is impossible to compress a source with symbols drawn from G when Z(G) is trivial if one employs a homomorphic encoder and a typical-set decoder. We provide achievable upper bounds on the minimum rate required to compress a source drawn from a non-Abelian group with non-trivial center. Also, in a two-source setting, we provide achievable upper bounds for compression of sources drawn from any non-Abelian group, using a non-homomorphic encoder.
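
The impossibility result hinges on whether the center Z(G) = {g in G : gh = hg for all h in G} is trivial. As a hedged illustration (my example, not the paper's), the sketch below computes the center of the smallest non-Abelian group, S3, and confirms that it is trivial, so the result above rules out homomorphic-encoder/typical-set-decoder compression of an S3-valued source:

    from itertools import permutations

    # Elements of S3 as tuples: p encodes the map i -> p[i].
    S3 = list(permutations(range(3)))

    def compose(p, q):
        # Composition "p after q": i -> p[q[i]].
        return tuple(p[q[i]] for i in range(3))

    # Center Z(G): the elements that commute with every element of G.
    center = [g for g in S3 if all(compose(g, h) == compose(h, g) for h in S3)]
    print(center)  # [(0, 1, 2)] -- only the identity, so Z(S3) is trivial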

Relevance: 20.00%

Publisher:

Abstract:

During a lightning strike to a tall grounded object (TGO), reflections of current waves are known to occur at both ends of the TGO. These reflections modify the channel current and hence the lightning electromagnetic fields. This study aims to identify the possible factors contributing to reflection at the TGO-channel junction for the current waves ascending the TGO. The possible sources of reflection identified are the corona sheath and discontinuities of resistance and radius. To analyze the contributions of the corona sheath and of the resistance discontinuity at the junction, a macroscopic physical model for the return stroke developed in our earlier work is employed. NEC-2D is used to assess the contribution of the abrupt change in radii at the TGO-channel junction; the wire-cage model adopted for this purpose is validated using laboratory experiments. The detailed investigation revealed the following: the main contributor to reflection at the TGO-channel junction is the difference between the TGO and channel-core radii; the resistance discontinuity at the junction can be of some relevance only in the first-microsecond regime; and the corona sheath does not play any significant role in the reflection.
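
A rough transmission-line reading of the radius effect (an illustration under assumed values, not the paper's model): a vertical conductor of height h and radius r has an approximate surge impedance Z = 60 ln(2h/r) ohms, so the jump in radius at the junction yields a reflection coefficient rho = (Z2 - Z1)/(Z2 + Z1):

    import math

    def surge_impedance(height_m, radius_m):
        # Approximate surge impedance of a vertical conductor (ohms).
        return 60.0 * math.log(2.0 * height_m / radius_m)

    # Hypothetical values: a 100 m tower of 2 m radius feeding a
    # lightning-channel core of 1 cm radius.
    z_tgo = surge_impedance(100.0, 2.0)       # ~276 ohm
    z_channel = surge_impedance(100.0, 0.01)  # ~594 ohm

    rho = (z_channel - z_tgo) / (z_channel + z_tgo)
    print(f"reflection coefficient at the junction: {rho:.2f}")  # ~0.37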

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we address the problem of transmission of correlated sources over a fading multiple access channel (MAC). We provide sufficient conditions for transmission with given distortions. Next, these conditions are specialized to a Gaussian MAC (GMAC). Transmission schemes for discrete and Gaussian sources over a fading GMAC are considered, and various power allocation strategies are compared. Keywords: Fading MAC, Power allocation, Random TDMA, Amplify and Forward, Correlated sources.
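
As one concrete flavor of such a comparison (a sketch with assumed parameters, not the paper's schemes), the snippet below contrasts uniform power allocation with waterfilling over Rayleigh-fading power gains for a single user's ergodic rate:

    import numpy as np

    rng = np.random.default_rng(0)
    h = rng.exponential(scale=1.0, size=100_000)  # Rayleigh fading power gains
    P, N0 = 1.0, 1.0

    # Uniform allocation: transmit with power P in every fading state.
    rate_uniform = np.mean(0.5 * np.log2(1.0 + h * P / N0))

    # Waterfilling: p(h) = max(0, 1/lam - N0/h), with lam set so E[p] = P.
    def avg_power(lam):
        return np.mean(np.maximum(0.0, 1.0 / lam - N0 / h))

    lo, hi = 1e-6, 1e6
    for _ in range(100):  # bisection on the water level
        lam = 0.5 * (lo + hi)
        lo, hi = (lam, hi) if avg_power(lam) > P else (lo, lam)
    p = np.maximum(0.0, 1.0 / lam - N0 / h)
    rate_wf = np.mean(0.5 * np.log2(1.0 + h * p / N0))

    print(f"uniform: {rate_uniform:.3f} bits, waterfilling: {rate_wf:.3f} bits")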

Relevance: 20.00%

Publisher:

Abstract:

We consider the problem of distributed joint source-channel coding of correlated Gaussian sources over a Gaussian multiple access channel (MAC). There may be side information at the encoders and/or at the decoder. First, we specialize a general result in [16] to obtain sufficient conditions for reliable transmission over a Gaussian MAC. Source-channel separation does not hold for this system. Thus, we next study and compare three joint source-channel coding schemes available in the literature.
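
One baseline scheme in this line of work is uncoded (amplify-and-forward) transmission: each encoder scales its Gaussian source sample to meet its power constraint, and the decoder forms a linear MMSE estimate from the channel output. A minimal sketch, with all parameters assumed for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    n, rho = 100_000, 0.5            # block length and source correlation
    P1, P2, N0 = 1.0, 1.0, 0.1       # transmit powers, channel-noise variance

    # Correlated unit-variance Gaussian sources.
    cov = np.array([[1.0, rho], [rho, 1.0]])
    s = rng.multivariate_normal([0.0, 0.0], cov, size=n)

    # Uncoded transmission: scaled sources superpose on the MAC, plus noise.
    x = s * np.sqrt([P1, P2])
    y = x.sum(axis=1) + rng.normal(scale=np.sqrt(N0), size=n)

    # Empirical linear MMSE estimate of each source from y.
    for i in range(2):
        a = np.dot(s[:, i], y) / np.dot(y, y)
        d = np.mean((s[:, i] - a * y) ** 2)
        print(f"distortion for source {i + 1}: {d:.3f}")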

Relevance: 20.00%

Publisher:

Abstract:

The Indian Ocean earthquake of 26 December 2004 led to significant ground deformation in the Andaman and Nicobar region, which accounts for ~800 km of the rupture. Part of this article deals with coseismic changes along these islands, observable from coastal morphology, biological indicators, and Global Positioning System (GPS) data. Our studies indicate that the islands south of 10° N latitude coseismically subsided by 1–1.5 m on both their eastern and western margins, whereas those to the north showed a mixed response. The western margin of the Middle Andaman emerged by >1 m, and the eastern margin submerged by the same amount. In the North Andaman, both the western and eastern margins emerged by >1 m. We also assess the pattern of long-term deformation (uplift/subsidence) and attempt to reconstruct the earthquake/tsunami history with the available data. Geological evidence for past submergence includes dead mangrove vegetation dating to 740 ± 100 yr B.P. near Port Blair, and peat layers at 2–4 m and 10–15 m depths observed in core samples from nearby locations. Preliminary paleoseismological/tsunami evidence from the Andaman and Nicobar region and from the east coast of India suggests at least one predecessor of the 2004 earthquake 900–1000 years ago. The history of earthquakes, although incomplete at this stage, seems to imply that 2004-type earthquakes are infrequent and recur at variable intervals.

Relevance: 20.00%

Publisher:

Abstract:

This paper analyses the efficiency and productivity growth of the electronics sector of India in the liberalization era since 1991. The study gives an insight into the growth of one of the fastest-growing sectors of this decade. This sector has experienced vast structural change along with the changing economic structure of India after liberalization. With the opening up of this sector to the foreign market and the entry of multinational companies, the environment has become highly competitive. The law that operates is Darwin's 'survival of the fittest': existing firms face a continuous threat of exit due to the entry of new potential entrants. It thus becomes inevitable for the existing firms in this sector to improve productivity growth for their survival. It is therefore important to analyze how the industries in this sector have performed over the years and which factors have contributed to overall output growth.

Relevance: 20.00%

Publisher:

Abstract:

The use of electroacoustic analogies suggests that a source of acoustical energy (such as an engine, compressor, blower, turbine, or loudspeaker) can be characterized by an acoustic source pressure ps and internal source impedance Zs, analogous to the open-circuit voltage and internal impedance of an electrical source. The present paper shows analytically that the source characteristics evaluated by means of the indirect methods are independent of the loads selected; that is, the evaluated values of ps and Zs are unique, and the results of the different methods (including the direct method) are identical. In addition, general relations are derived here for transferring source characteristics from one station to another across one or more acoustical elements, and for combining several sources into a single equivalent source. Finally, all the conclusions are extended to the case of a uniformly moving medium, incorporating the convective as well as the dissipative effects of the mean flow.
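
The Thevenin-style analogy can be made concrete via the two-load indirect method: measuring the load pressures p1, p2 under two known load impedances Z1, Z2 gives two equations p_i = ps Zi / (Zs + Zi), which determine ps and Zs. A minimal sketch with hypothetical values:

    # Two-load indirect method: solve p_i = ps * Z_i / (Zs + Z_i), i = 1, 2,
    # for the source pressure ps and source impedance Zs (complex-valued).

    def two_load_source(Z1, p1, Z2, p2):
        Zs = (p2 - p1) / (p1 / Z1 - p2 / Z2)
        ps = p1 * (Zs + Z1) / Z1
        return ps, Zs

    # Hypothetical check: a source with ps = 10 Pa and Zs = 3 + 4j.
    ps_true, Zs_true = 10.0, 3 + 4j
    loads = [2 + 1j, 8 - 2j]
    pressures = [ps_true * Z / (Zs_true + Z) for Z in loads]
    print(two_load_source(loads[0], pressures[0], loads[1], pressures[1]))
    # -> approximately (10, 3+4j), recovering the assumed source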

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we explore the use of LDPC codes for nonuniform sources under the distributed source coding paradigm. Our analysis reveals that several capacity-approaching LDPC codes indeed approach the Slepian-Wolf bound for nonuniform sources as well. Monte Carlo simulation results show that highly biased sources can be compressed to within 0.049 bits/sample of the Slepian-Wolf bound at moderate block lengths.
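
For context (my example, not the paper's setup): for an i.i.d. biased binary source X with decoder side information Y, the Slepian-Wolf bound is the conditional entropy H(X|Y). The snippet below computes it for a biased source observed through a binary symmetric channel:

    from math import log2

    def h2(p):
        # Binary entropy in bits.
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    # Biased source X ~ Bernoulli(p); side information Y = X xor E,
    # with E ~ Bernoulli(eps) independent of X.
    p, eps = 0.1, 0.05
    py1 = p * (1 - eps) + (1 - p) * eps   # P(Y = 1)
    bound = h2(p) + h2(eps) - h2(py1)     # H(X|Y) = H(X, Y) - H(Y)

    print(f"Slepian-Wolf bound: {bound:.3f} bits/sample")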

Relevance: 20.00%

Publisher:

Abstract:

The setting considered in this paper is one of distributed function computation. More specifically, there is a collection of N sources possessing correlated information and a destination that would like to acquire a specific linear combination of the N sources. We address both the case when the common alphabet of the sources is a finite field and the case when it is a finite, commutative principal ideal ring with identity. The goal is to minimize the total amount of information that the N sources need to transmit while enabling reliable recovery at the destination of the linear combination sought. One means of achieving this goal is for each source to compress all the information it possesses and transmit it to the receiver. The Slepian-Wolf theorem of information theory governs the minimum rate at which each source must transmit while enabling all data to be reliably recovered at the receiver. However, recovering all the data at the destination is often wasteful of resources, since the destination is interested only in computing a specific linear combination. An alternative explored here is one in which each source is compressed using a common linear mapping and then transmitted to the destination, which then uses linearity to directly recover the needed linear combination. The article is part review and in part presents new results: the portion dealing with finite fields is previously known material, while that dealing with rings is mostly new. Attempting to find the best linear map that enables function computation forces us to consider the linear compression of a single source. While in the finite-field case it is known that a source can be linearly compressed down to its entropy, it turns out that the same does not hold in the case of rings. An explanation for this curious interplay between algebra and information theory is also provided in this paper.
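
A toy instance of the common-linear-map idea over GF(2) (my construction, for illustration only): if both sources apply the same binary matrix A, then by linearity the sum of the two transmissions equals A(x1 + x2), a compressed image of the desired combination, obtained without recovering x1 or x2 individually. (Actually decoding x1 + x2 from A(x1 + x2) additionally requires A to be the parity-check matrix of a good code.)

    import numpy as np

    rng = np.random.default_rng(2)

    # Common linear map A over GF(2): both sources use the same compressor.
    k, n = 3, 6                       # compressed length k < block length n
    A = rng.integers(0, 2, size=(k, n))

    x1 = rng.integers(0, 2, size=n)   # source 1's block
    x2 = rng.integers(0, 2, size=n)   # source 2's block

    m1 = A @ x1 % 2                   # transmitted by source 1
    m2 = A @ x2 % 2                   # transmitted by source 2

    # By linearity, the destination obtains A(x1 + x2) directly.
    assert np.array_equal((m1 + m2) % 2, A @ ((x1 + x2) % 2) % 2)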

Relevance: 20.00%

Publisher:

Abstract:

The spectral index-luminosity relationship for steep-spectrum cores in galaxies and quasars has been investigated, and it is found that the sample of galaxies supports earlier suggestions of a strong correlation, while there is weak evidence for a similar relationship for the quasars. It is shown that a strong spectral index-luminosity correlation can be used to set an upper limit to the velocities of the radio-emitting material which is expelled from the nucleus in the form of collimated beams or jets having relativistic bulk velocities. The data on cores in galaxies indicate that the Lorentz factors of the radiating material are less than about 2.
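
The velocity limit rests on relativistic beaming: a component moving with bulk Lorentz factor Gamma at viewing angle theta is boosted by the Doppler factor delta = 1/(Gamma(1 - beta cos theta)), and its observed flux scales roughly as delta^(3 - alpha) for S proportional to nu^alpha (conventions vary; this sketch is illustrative, not the paper's calculation):

    import math

    def doppler_factor(gamma, theta_rad):
        # Doppler factor for bulk Lorentz factor gamma at viewing angle theta.
        beta = math.sqrt(1.0 - 1.0 / gamma**2)
        return 1.0 / (gamma * (1.0 - beta * math.cos(theta_rad)))

    alpha = -0.8  # steep spectrum, S ~ nu**alpha
    for gamma in (1.5, 2.0, 5.0):
        delta = doppler_factor(gamma, math.radians(10.0))
        print(f"Gamma = {gamma}: flux boost ~ {delta ** (3 - alpha):.1f}x")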

Relevance: 20.00%

Publisher:

Abstract:

In this paper, studies were carried out on two compact electric discharge plasma sources for controlling nitrogen oxide (NOX) emissions in diesel engine exhaust. The plasma sources consist of an old television flyback transformer used to generate high-frequency, high-voltage ac (HVAC) and an automobile ignition coil used to generate high-voltage pulses (HV pulse). The compact plasma sources are aimed at retrofitting existing catalytic converters with an electric-discharge-assisted cleaning technique. To enhance NOX removal efficiency, a cascaded plasma-adsorbent technique was used. Studies are reported at different flow rates and load conditions of the diesel engine.

Relevance: 20.00%

Publisher:

Abstract:

Coenzyme Q (ubiquinone), a fully substituted benzoquinone with a polyprenyl side chain, participates in many cellular redox activities. Paradoxically, despite being ubiquitous, it was discovered only in 1957. It required a person, F. L. Crane; a place, the Enzyme Institute, Madison, USA; and a time when D. E. Green was directing vigorous research on mitochondria. Located at the transition between 2-electron flavoproteins and 1-electron cytochrome carriers, it facilitates electron transfer through the elegant Q-cycle in mitochondria to reduce O2 to H2O, and to H2O2, now a significant signal-transducing agent, as a minor activity in the shunt pathway (animals) and alternative oxidase (plants). The ability to form a Q-radical by losing an electron and a proton was ingeniously used by Mitchell to explain the formation of the proton gradient, considered the core of energy transduction, and also acidification in vacuoles. Being a mobile membrane constituent (of microsomes, the plasma membrane, and the Golgi apparatus), able to reach multiple sites, coenzyme Q is expected to have other activities. Coenzyme Q protects circulating lipoproteins, being a better lipid antioxidant than even vitamin E. By binding to proteins such as QPS, QPN, QPC and the uncoupling protein in mitochondria, QA and QB in the reaction centre of R. sphaeroides, and the disulfide bond-forming protein in E. coli (possibly also in the Golgi), coenzyme Q acquires selective functions. A characteristic of orally dosed coenzyme Q is its exclusive absorption into the liver, but not the other tissues. This enrichment of Q is accompanied by a significant decrease of blood pressure and of serum cholesterol. Inhibition of the formation of mevalonate, the common precursor in the branched isoprene pathway, by the minor product, coenzyme Q, decreases the major product, cholesterol. Relaxation of contracted arterial smooth muscle by a side-chain-truncated product of coenzyme Q explains its blood-pressure-lowering effect. Extensive clinical studies on oral supplements of coenzyme Q, carried out initially by K. Folkers and Y. Yamamura and followed by many others, revealed a large number of beneficial effects, notably in cardiovascular diseases. Such a variety of effects by this lipid quinone cannot depend on redox activity alone. The fat-soluble vitamins (A, D, E and K) that bear a structural relationship to coenzyme Q are known to be active in their polar forms. A picture of modified forms of coenzyme Q taking an active role in its multiple effects is emerging.

Relevance: 20.00%

Publisher:

Abstract:

This study presents an analysis aimed at choosing between off-grid solar photovoltaic power, biomass gasifier-based power generation, and conventional grid extension for remote village electrification. The model relates renewable energy systems to the economical distance limit (EDL) from the existing grid point, based on life cycle cost (LCC) analysis: the EDL is the distance at which the LCC of energy from the renewable energy system and from grid extension match. The LCC of energy fed to the village is arrived at by considering grid availability and the operating hours of the renewable energy systems. The EDL for a biomass gasifier system of 25 kW capacity is 10.5 km with 6 h of daily operation and grid availability, whereas the EDL for a photovoltaic system of the same 25 kW capacity is 35 km for the same hours of operation and grid availability. The analysis shows that for villages with low load demand situated far from the existing grid line, biomass gasification-based systems are more cost-competitive than photovoltaic systems and even than grid extension.
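
The EDL logic can be sketched as finding the distance at which the life-cycle cost of grid extension equals that of the standalone system; all cost figures below are placeholders, not the paper's data:

    # Economical distance limit (EDL): the distance d at which the life-cycle
    # cost of extending the grid equals the LCC of the off-grid system.
    # All numbers are illustrative placeholders.

    LCC_RENEWABLE = 50_000.0   # lifetime cost of the off-grid system (USD)
    GRID_FIXED = 18_000.0      # fixed cost of a grid connection (USD)
    GRID_PER_KM = 4_000.0      # line cost per km, on a lifetime basis (USD/km)

    def lcc_grid(distance_km):
        return GRID_FIXED + GRID_PER_KM * distance_km

    edl_km = (LCC_RENEWABLE - GRID_FIXED) / GRID_PER_KM
    assert abs(lcc_grid(edl_km) - LCC_RENEWABLE) < 1e-6
    print(f"EDL: {edl_km:.1f} km")  # beyond this, the off-grid system wins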

Relevance: 20.00%

Publisher:

Abstract:

Background: India has the third largest HIV-1 epidemic, with 2.4 million infected individuals. Molecular epidemiological analysis has identified the predominance of HIV-1 subtype C (HIV-1C). However, previous reports have been limited by sample size and uneven geographical distribution, and the introduction of HIV-1C into India remains uncertain due to this lack of structured studies. To fill the gap, we characterised the distribution pattern of HIV-1 subtypes in India based on data collected from nationwide clinical cohorts between 2007 and 2011. We also reconstructed the time to the most recent common ancestor (tMRCA) of the predominant HIV-1C strains. Methodology/Principal Findings: Blood samples were collected from 168 HIV-1 seropositive subjects from 7 different states. HIV-1 subtypes were determined using two or three genes (gag, pol, and env) and several methods. A Bayesian coalescent-based approach was used to reconstruct the time of introduction and the population growth patterns of Indian HIV-1C. For the first time, a high prevalence (10%) of unique recombinant forms (BC and A1C) was observed when two or three genes were used instead of one (p<0.01 and p = 0.02, respectively). The tMRCA of Indian HIV-1C, estimated using the three viral genes, ranged from 1967 (gag) to 1974 (env); pol-gene analysis was considered to provide the most reliable estimate [1971 (95% CI: 1965-1976)]. The population growth pattern revealed an initial slow growth phase in the mid-1970s, an exponential phase through the 1980s, and a stationary phase since the early 1990s. Conclusions/Significance: The Indian HIV-1C epidemic originated around 40 years ago from a single or a few genetically related African lineages and has since largely evolved independently. The effective population size in the country has been broadly stable since the 1990s. The evolving viral epidemic, as indicated by the increase of recombinant strains, warrants continued molecular surveillance to guide efficient disease intervention strategies.