139 results for Maximum pseudo-likelihood
Abstract:
This chapter considers Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) modulation and demodulation with the intention of optimizing Ultra-Wideband (UWB) system performance. OFDM is a type of multicarrier modulation and is the most important determinant of MB-OFDM system performance. It is also a low-cost digital signal processing component that uses the Fast Fourier Transform (FFT) algorithm to implement multicarrier orthogonality efficiently. Within the MB-OFDM approach, OFDM modulation is employed in each 528 MHz wide band to transmit the data, while frequency hopping is used across the different bands. Each parallel bit stream can be mapped onto one of the OFDM subcarriers. Quadrature Phase Shift Keying (QPSK) and Dual Carrier Modulation (DCM) are currently used as the modulation schemes for MB-OFDM in the ECMA-368 defined UWB radio platform. A dual QPSK soft-demapper is suitable for ECMA-368 that exploits the inherent Time-Domain Spreading (TDS) and guard symbol subcarrier diversity to improve receiver performance, yet merges decoding operations together to minimize hardware and power requirements. There are several methods to demap the DCM: soft bit demapping, Maximum Likelihood (ML) soft bit demapping, and Log Likelihood Ratio (LLR) demapping. The Channel State Information (CSI) aided scheme, coupled with the band hopping information, is used as a further technique to improve DCM demapping performance. ECMA-368 offers up to 480 Mb/s instantaneous bit rate to the Medium Access Control (MAC) layer, but depending on radio channel conditions dropped packets unfortunately result in a lower throughput. An alternative high-data-rate modulation scheme, termed Dual Circular 32-QAM, fits within the configuration of the current standard and increases system throughput, maintaining high throughput even with a moderate level of dropped packets.
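As a minimal sketch of the simplest of the demapping options above, the per-bit LLR for Gray-mapped QPSK over an AWGN channel reduces to a scaled projection of the received symbol onto each axis. This is a generic illustration, not the ECMA-368 dual QPSK soft-demapper itself; the unit-energy constellation and the meaning of the noise-variance parameter are assumptions:

```python
import math

def qpsk_llrs(y_i, y_q, noise_var):
    """Per-bit log-likelihood ratios for Gray-mapped QPSK over AWGN.

    Symbols are (+-1/sqrt(2), +-1/sqrt(2)), so each axis carries one bit
    with amplitude 1/sqrt(2); noise_var is the total complex noise variance.
    LLR = log[P(b=0|y) / P(b=1|y)] reduces to a scaled axis projection.
    """
    scale = 2.0 * math.sqrt(2.0) / noise_var
    return scale * y_i, scale * y_q
```

A CSI-aided variant would additionally weight each LLR by the subcarrier's channel gain before combining the TDS copies.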
Abstract:
The total phenols, apigenin 7-glucoside, turbidity and colour of extracts from dried chamomile flowers were studied with a view to developing chamomile extracts with potential anti-inflammatory properties for incorporation into beverages. The extraction of all constituents followed pseudo first-order kinetics. In general, the rate constant (k) increased as the temperature increased from 57 to 100 °C. The turbidity only increased significantly between 90 and 100 °C. Therefore, aqueous chamomile extracts had maximum total phenol concentration and minimum turbidity when extracted at 90 °C for 20 min. The effect of drying conditions on chamomile extracted using these conditions was determined. A significant reduction in phenol concentration, from 19.7 ± 0.5 mg/g GAE in fresh chamomile to 13 ± 1 mg/g GAE, was found only in the plant material oven-dried at 80 °C (p ⩽ 0.05). The biggest colour change was between fresh chamomile and that oven-dried at 80 °C, followed by air-dried samples. There was no significant difference in colour of material freeze-dried and oven-dried at 40 °C.
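The pseudo first-order model used above can be sketched in a few lines: concentration approaches its equilibrium value exponentially, and the rate constant k is recovered by linearising the model and fitting a line through the origin. The numerical values below are illustrative, not the paper's data:

```python
import math

def pseudo_first_order(c_eq, k, t):
    """Concentration at time t under pseudo first-order extraction kinetics:
    C(t) = C_eq * (1 - exp(-k * t))."""
    return c_eq * (1.0 - math.exp(-k * t))

def estimate_k(times, concs, c_eq):
    """Estimate k by linearising the model, ln(C_eq / (C_eq - C)) = k * t,
    then fitting least squares through the origin."""
    y = [math.log(c_eq / (c_eq - c)) for c in concs]
    return sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)
```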
Abstract:
This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets.
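As a minimal sketch of one sub-model in the GBG class, an exponentiated distribution raises a parent CDF G to a power a; with an exponential parent the quantile function inverts this in closed form. The parameter values below are illustrative assumptions:

```python
import math

def exp_exp_cdf(x, lam, a):
    """CDF of the exponentiated exponential: G(x)**a,
    where G is the Exp(lam) CDF."""
    return (1.0 - math.exp(-lam * x)) ** a

def exp_exp_quantile(u, lam, a):
    """Quantile function, solving G(x)**a = u in closed form:
    x = -ln(1 - u**(1/a)) / lam."""
    return -math.log(1.0 - u ** (1.0 / a)) / lam
```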
Abstract:
This study examines the numerical accuracy, computational cost, and memory requirements of self-consistent field theory (SCFT) calculations when the diffusion equations are solved with various pseudo-spectral methods and the mean field equations are iterated with Anderson mixing. The different methods are tested on the triply-periodic gyroid and spherical phases of a diblock-copolymer melt over a range of intermediate segregations. Anderson mixing is found to be somewhat less effective with the pseudo-spectral methods than when combined with the full-spectral method, but it nevertheless functions admirably well provided that a large number of histories is used. Of the different pseudo-spectral algorithms, the 4th-order one of Ranjan, Qin and Morse performs best, although not quite as efficiently as the full-spectral method.
Abstract:
The objective of this paper is to reconsider the Maximum Entropy Production conjecture (MEP) in the context of a very simple two-dimensional zonal-vertical climate model able to represent the total material entropy production due at the same time to both horizontal and vertical heat fluxes. MEP is applied first to a simple four-box model of climate which accounts for both horizontal and vertical material heat fluxes. It is shown that, under condition of fixed insolation, a MEP solution is found with reasonably realistic temperature and heat fluxes, thus generalising results from independent two-box horizontal or vertical models. It is also shown that the meridional and the vertical entropy production terms are independently involved in the maximisation and thus MEP can be applied to each subsystem with fixed boundary conditions. We then extend the four-box model by increasing its resolution, and compare it with GCM output. A MEP solution is found which is fairly realistic as far as the horizontal large-scale organisation of the climate is concerned, whereas the vertical structure looks to be unrealistic and presents seriously unstable features. This study suggests that the thermal meridional structure of the atmosphere is predicted fairly well by MEP once the insolation is given, but the vertical structure of the atmosphere cannot be predicted satisfactorily by MEP unless constraints are imposed to represent the determination of longwave absorption by water vapour and clouds as a function of the state of the climate. Furthermore, an order-of-magnitude estimate of contributions to the material entropy production due to horizontal and vertical processes within the climate system is provided by using two different methods. In both cases we found that approximately 40 mW m⁻² K⁻¹ of material entropy production is due to vertical heat transport and 5–7 mW m⁻² K⁻¹ to horizontal heat transport.
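The two-box horizontal model that the four-box version generalises can be sketched with a simple grid search over the meridional heat flux: energy balance with a linearised outgoing longwave radiation sets the box temperatures, and the MEP flux is the one maximising entropy production. The Budyko-type coefficients and absorbed-insolation values below are conventional textbook assumptions, not the paper's:

```python
def two_box_mep(s1=300.0, s2=170.0, a=203.3, b=2.09):
    """Grid-search the meridional heat flux F (W/m^2) that maximises
    material entropy production in a two-box energy-balance model.

    OLR is linearised as A + B*T (T in deg C); F moves heat from the
    warm box 1 to the cold box 2, so T1 falls and T2 rises with F.
    """
    def temps(f):
        t1 = (s1 - f - a) / b
        t2 = (s2 + f - a) / b
        return t1 + 273.15, t2 + 273.15  # Kelvin for entropy production

    best_f, best_ep = 0.0, 0.0
    f = 0.0
    while f < (s1 - s2) / 2:  # beyond this F the temperatures would cross
        t1k, t2k = temps(f)
        ep = f * (1.0 / t2k - 1.0 / t1k)  # W m^-2 K^-1
        if ep > best_ep:
            best_f, best_ep = f, ep
        f += 0.1
    return best_f, best_ep
```

With these illustrative numbers the maximum falls at an interior flux with both boxes at plausible temperatures, which is the essence of the two-box MEP result the abstract refers to.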
Abstract:
We explore the influence of a rotating collector on the internal structure of poly(ε-caprolactone) fibres electrospun from a solution in dichloroethane. We find that above a threshold collector speed, the mean fibre diameter reduces as the speed increases and the fibres are further extended. Small-angle and wide-angle X-ray scattering techniques show a preferred orientation of the lamellar crystals normal to the fibre axis which increases with collector speed to a maximum and then reduces. We have separated out the processes of fibre alignment on the collector and the orientation of crystals within the fibres. There are several stages to this behaviour which correspond to the situations (a) where the collector speed is slower than the fibre spinning rate, (b) the fibre is mechanically extended by the rotating collector and (c) where the deformation leads to fibre fracture. The mechanical deformation leads to a development of preferred orientation with extension which is similar to the prediction of the pseudo-affine deformation model and suggests that the deformation takes place during the spinning process after the crystals have formed.
Abstract:
During the Last Glacial Maximum (LGM, ∼21,000 years ago) the cold climate was strongly tied to low atmospheric CO2 concentration (∼190 ppm). Although it is generally assumed that this low CO2 was due to an expansion of the oceanic carbon reservoir, simulating the glacial level has remained a challenge, especially with the additional δ13C constraint. Indeed, the LGM carbon cycle was also characterized by a modern-like δ13C in the atmosphere and a higher surface-to-deep Atlantic δ13C gradient, indicating probable changes in the thermohaline circulation. Here we show, with a model of intermediate complexity, that adding three oceanic mechanisms (brine-induced stratification, stratification-dependent diffusion and iron fertilization) to the standard glacial simulation (which includes sea level drop, temperature change, carbonate compensation and terrestrial carbon release) decreases CO2 down to the glacial value of ∼190 ppm and simultaneously matches glacial atmospheric and oceanic δ13C inferred from proxy data. LGM CO2 and δ13C can at last be successfully reconciled.
High resolution Northern Hemisphere wintertime mid-latitude dynamics during the Last Glacial Maximum
Abstract:
Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretations. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada, terminating in the Labrador Sea. This result is coincident with other model results in also showing a significant reduction in Greenland wintertime precipitation – a response supported by ice core evidence. Higher temporal resolution puts in sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain “early” glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness that supports observational inferences that upper-level winds associated with Asian-Pacific storms transported Asian dust to Greenland during the LGM.
Abstract:
Purpose – The purpose of this paper is to analyse the likelihood of adoption of a recently designed Welfare Assessment System in agri-food supply chains and the factors affecting the adoption decision. The application is carried out for pig and poultry chains. Design/methodology/approach – This research consisted of two main components: interviews with retailers in pig and poultry supply chains in eight different EU countries to explore their perceptions towards the adoption possibilities of the welfare assessment system; and a conjoint analysis designed to evaluate the perceived adoption likelihood of the assessment system by different Standards Formulating Organisations (SFOs). Findings – Stakeholders were found to be especially concerned about the costs of implementation of the system and how it could, or should, be merged with existing assurance schemes. Another conclusion of the study is that the presence of a strong independent third party supporting the implementation of the welfare assessment system would be the most important influence on the decision whether, or not, to adopt it. Originality/value – This research evaluates the adoption possibilities of a novel Welfare Assessment System and presents the views of different supply chain stakeholders on the adoption of such a system. The main factors affecting the adoption decision are identified and analysed. Contrary to expectations, the costs of adoption of a new welfare assessment system were not considered to be the most important factor affecting the decision of supply chain stakeholders about the adoption of this new welfare system.
Abstract:
Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of “likelihood-free” methods termed Approximate Bayesian Computation (ABC) is able to eliminate this requirement, replacing the evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before providing a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
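A minimal rejection-ABC sketch of evidence estimation: each model's evidence is approximated by the prior-predictive probability that a simulated summary statistic lands within a tolerance of the observed one, and the ratio of these estimates gives a Bayes factor. The toy models, summary, and tolerance below are assumptions for illustration; the approaches in the paper additionally exploit MCMC or SMC samplers:

```python
import random

def abc_evidence(simulate, s_obs, eps, n=100_000, rng=None):
    """Rejection-ABC estimate of a model's evidence: the fraction of
    prior-predictive simulations whose summary falls within eps of
    the observed summary s_obs."""
    rng = rng or random.Random(0)
    hits = sum(abs(simulate(rng) - s_obs) <= eps for _ in range(n))
    return hits / n

# Toy comparison for an observed sample mean s_obs = 0.1.
def model_a(rng):
    # Model A: mean fixed at 0, unit-variance Gaussian noise.
    return rng.gauss(0.0, 1.0)

def model_b(rng):
    # Model B: mean drawn from a diffuse Uniform(-5, 5) prior.
    return rng.gauss(rng.uniform(-5.0, 5.0), 1.0)

ev_a = abc_evidence(model_a, s_obs=0.1, eps=0.5)
ev_b = abc_evidence(model_b, s_obs=0.1, eps=0.5)
bayes_factor = ev_a / ev_b  # > 1 favours the simpler model here
```

The diffuse prior spreads Model B's simulations thinly across summary space, so its acceptance rate (and hence evidence) is lower, an automatic Occam penalty.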
Abstract:
How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were about half those lengths (i.e., 1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous–Paleogene (K–Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes. Keywords: haldanes, biological time, scaling, pedomorphosis
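A back-of-envelope check on those figures: converting each fold-increase and generation count into an average per-generation rate of change in ln(body mass) shows the implied rate falling as the transformation grows, so larger size jumps are disproportionately slow. This is simple arithmetic on the abstract's quoted numbers, not the paper's clade-maximum-rate metric itself:

```python
import math

def implied_rate(fold_change, generations):
    """Average per-generation rate of ln(body mass) change implied by
    a given fold increase spread over a given number of generations."""
    return math.log(fold_change) / generations

# Minimum generation counts quoted for terrestrial mammals:
r100 = implied_rate(100, 1.6e6)     # 100-fold over 1.6 M generations
r1000 = implied_rate(1_000, 5.1e6)  # 1,000-fold over 5.1 M generations
r5000 = implied_rate(5_000, 10e6)   # 5,000-fold over 10 M generations
# r100 > r1000 > r5000: the implied average rate shrinks with the
# size of the transformation.
```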
Abstract:
Sub-Saharan Africa in general and Ghana in particular, missed out on the Green revolution. Efforts are being made to re-introduce the revolution, and this calls for more socio-economic research into the factors influencing the adoption of new technologies, hence, this study. The study sought to find out how socio-economic factors contribute to adoption of Green revolution technology in Ghana. The method of analysis involved a maximum likelihood estimation of a probit model. The proportion of Green revolution inputs was found to be greater for the following: households whose heads had formal education, households with higher levels of non-farm income, credit and labor supply as well as those living in urban centers. It is recommended that levels of complementary inputs such as credit, extension services and infrastructure are increased. Also, households must be encouraged to form farmer-groups as an important source of farm labor. Furthermore, the fundamental problems of illiteracy must be addressed through increasing the levels of formal and non-formal education; and the gap between the rural and urban centers must be bridged through infrastructural and rural development. However, care must be taken to ensure that small-scale farmers are not marginalized, in terms of access to these complementary inputs that go with effective adoption of new technology. With these policies well implemented, Ghana can catch up with her Asian counterparts in this re-introduction of the revolution.
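The probit estimation step can be sketched as follows: a hypothetical, minimal maximum-likelihood fit with a single regressor and intercept, optimised by numeric-gradient ascent. This is an illustration of the technique only, not the study's actual specification, covariates, or data:

```python
import math

def phi(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_loglik(b0, b1, xs, ys):
    """Probit log-likelihood: sum of y*ln(Phi(xb)) + (1-y)*ln(1-Phi(xb))."""
    ll = 0.0
    for x, y in zip(xs, ys):
        p = min(max(phi(b0 + b1 * x), 1e-12), 1.0 - 1e-12)  # clamp for log
        ll += math.log(p) if y else math.log(1.0 - p)
    return ll

def fit_probit(xs, ys, lr=0.01, steps=5000):
    """Maximum-likelihood fit by gradient ascent with numeric gradients."""
    b0, b1, h = 0.0, 0.0, 1e-6
    for _ in range(steps):
        g0 = (probit_loglik(b0 + h, b1, xs, ys)
              - probit_loglik(b0 - h, b1, xs, ys)) / (2 * h)
        g1 = (probit_loglik(b0, b1 + h, xs, ys)
              - probit_loglik(b0, b1 - h, xs, ys)) / (2 * h)
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1
```

With adoption (1/0) as the outcome and, say, education level as the regressor, a positive fitted slope corresponds to the abstract's finding that education raises adoption of Green revolution inputs.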
Abstract:
In Sub-Saharan Africa (SSA) the technological advances of the Green Revolution (GR) have not been very successful. However, the efforts being made to re-introduce the revolution call for more socio-economic research into the adoption and the effects of the new technologies. The paper discusses an investigation of the effects of GR technology adoption on poverty among households in Ghana. Maximum likelihood estimation of a poverty model within the framework of Heckman's two-stage method of correcting for sample selection was employed. Technology adoption was found to have positive effects in reducing poverty. Other factors that reduce poverty include education, credit, durable assets, and living in the forest belt and in the south of the country. Technology adoption itself was also facilitated by education, credit, non-farm income and household labour supply, as well as living in urban centres. Technology adoption can clearly be promoted by increasing the levels of complementary inputs such as credit, extension services and infrastructure. Above all, the fundamental problems of illiteracy, inequality and lack of effective markets must be addressed through increasing the levels of formal and non-formal education, equitable distribution of the 'national cake' and a more pragmatic management of the ongoing Structural Adjustment Programme.