864 results for Variational Convergence
Abstract:
Discusses the amendments to the Polish Competition Act 2007, adopted in June 2014, which aim to enhance the effectiveness of antitrust enforcement, including the introduction of: (1) civil fines for individuals; (2) a "leniency plus" programme based on the US model; (3) a settlement procedure; and (4) extended inspection powers for the Competition Authority. Assesses the likely effectiveness of the reforms.
Abstract:
The momentum term has long been used in machine learning algorithms, especially back-propagation, to improve their speed of convergence. In this paper, we derive an expression to prove the O(1/k²) convergence rate of the online gradient method with momentum-type updates, when the individual gradients are constrained by a growth condition. We then apply these updates to video background modelling by using them in the update equations of the Region-based Mixture of Gaussians algorithm. Extensive evaluations are performed on both simulated data and challenging real-world scenarios with dynamic backgrounds to show that these regularised updates help the mixtures converge faster than the conventional approach and consequently improve the algorithm's performance.
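As a point of reference for the kind of momentum-type online gradient update this abstract refers to, here is a minimal Python sketch. The function name, hyperparameters and toy quadratic objective are illustrative assumptions; the paper's O(1/k²) analysis and the Region-based Mixture of Gaussians application are not reproduced here.

import numpy as np

def online_gradient_momentum(grad_fn, w0, steps=1000, lr=0.01, beta=0.9):
    # Online gradient descent with a classical (heavy-ball) momentum term.
    # grad_fn(w, t) returns the (possibly noisy) gradient observed at step t.
    w = np.asarray(w0, dtype=float)
    v = np.zeros_like(w)              # momentum ("velocity") accumulator
    for t in range(steps):
        g = grad_fn(w, t)
        v = beta * v - lr * g         # accumulate momentum
        w = w + v                     # parameter update
    return w

# Toy usage: minimise 0.5*||w - target||^2 from noisy gradient observations.
rng = np.random.default_rng(0)
target = np.array([1.0, -2.0])
noisy_grad = lambda w, t: (w - target) + 0.01 * rng.normal(size=w.shape)
print(online_gradient_momentum(noisy_grad, np.zeros(2)))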
Abstract:
Yttrium triflate- or triflic acid-catalysed Povarov reaction of methyl anthranilate with ethyl vinyl ether, acting both as aldehyde surrogate and as alkene, gave the desired 2-methyl-4-ethoxytetrahydroquinoline diastereoisomers as the major products along with four-component-coupling von Miller adducts. A proton NMR study, using yttrium triflate as catalyst, revealed that the cis-diastereoisomers were the initial major products in both the Povarov and von Miller reactions but that these isomerised to the trans-diastereoisomers under the reaction conditions. Two distinct pathways for forming von Miller adducts were uncovered, with the initial Povarov products being converted to von Miller adducts under the reaction conditions. Replacement of the 4-ethoxy with a 4-methoxy group under acidic conditions gave predominantly the trans-diastereoisomer, which was subsequently converted to a cis/trans mixture of the tetrahydroquinoline antibiotic helquinoline. It was also possible to convert the von Miller products to Povarov products under acidic conditions.
Abstract:
LivingTV's flagship series, Most Haunted, has been haunting the satellite network since 2002. The set-up of the series is straightforward: a team of investigators, including a historian, a parapsychologist, and "spiritualist medium" Derek Acorah, goes "legend-tripping," spending the night at some location within the United Kingdom that is reputed to be haunted, in the hope of catching on video concrete proof of the existence of ghosts. However, unlike other reality television or true-life supernatural television shows, Most Haunted includes and addresses the audience less as a spectator and more as an active participant in the ghost hunt. Watching Most Haunted, we are directed not so much to accept or reject the evidence provided as to engage in the debate over the evidence's veracity. As in legend-telling in its oral form, belief in or rejection of the story's truth-claims is less central than the possibility of the narrative's truth - a position that invites debate about those truth-claims. This paper argues that Most Haunted, in its premise and structure, not only depicts or represents legend texts (here ghost stories) but also engages the audience in debates about the status of its truth-claims, thereby bringing this mass-mediated popular culture text closer to the folkloristic, legend-telling dynamic than other similar shows.
Abstract:
I/Q mismatches in quadrature radio receivers result in finite and usually insufficient image rejection, greatly degrading performance. In this paper, we present a detailed analysis of a Blind Source Separation (BSS) based mismatch corrector in terms of its structure, convergence and performance. The results indicate that the mismatch can be effectively compensated during normal operation as well as in rapidly changing environments. Since the compensation is carried out before any modulation-specific processing, the proposed method works with all standard modulation formats and is amenable to low-power implementations.
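For illustration, a minimal Python sketch of a circularity-based blind I/Q imbalance compensator of the general family this abstract refers to; this is a generic textbook-style scheme, not necessarily the BSS corrector analysed in the paper, and the mismatch coefficients in the toy example are assumed.

import numpy as np

def blind_iq_compensate(x, mu=2e-3):
    # y[n] = x[n] + w * conj(x[n]); w is adapted so that the complementary
    # autocorrelation E[y^2] is driven towards zero, which removes the image.
    w = 0.0 + 0.0j
    y = np.empty_like(x)
    for n, xn in enumerate(x):
        yn = xn + w * np.conj(xn)
        w = w - mu * yn * yn          # stochastic update of the corrector tap
        y[n] = yn
    return y

# Toy usage: a circular signal s distorted by assumed mismatch coefficients.
rng = np.random.default_rng(1)
s = (rng.normal(size=20000) + 1j * rng.normal(size=20000)) / np.sqrt(2)
K1, K2 = 1.0, 0.1 * np.exp(1j * 0.3)  # assumed mismatch model: x = K1*s + K2*conj(s)
x = K1 * s + K2 * np.conj(s)
y = blind_iq_compensate(x)
print("|E[x^2]| =", abs(np.mean(x ** 2)), " |E[y^2]| =", abs(np.mean(y[-5000:] ** 2)))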
Abstract:
Cappadocian Greek is reported to display agglutinative inflection in its nominal system, namely, mono-exponential formatives for the marking of case and number, and NOM.SG-looking forms as the morphemic units to which inflection applies. Previous scholarship has interpreted these developments as indicating a shift in morphological type from fusion to agglutination, brought about by contact with Turkish. This study takes issue with these conclusions. By casting a wider net over the inflectional system of the language, it shows that, of the two types of agglutinative formations identified, only one evidences a radical departure from the inherited structural properties of Cappadocian noun inflection. The other, on the contrary, represents a typologically more conservative innovation. The study presents evidence that a combination of system-internal and system-external motivations triggered the development of both types, describes the mechanisms through which the innovation was implemented, and discusses the factors that favoured the change.
Abstract:
This research examines media integration in China, taking two Chinese newspaper groups as cases for comparative study. The study analyses the convergence strategies of these Chinese groups by reference to a role model of convergence developed from a literature review of studies of media convergence cases in the UK - in particular the Guardian (GNM), Telegraph Media Group (TMG), the Daily Mail and the Times. The UK cases serve to establish the characteristics, causes and consequences of different forms of convergence and to formulate a model of convergence. The model specifies the levels of newsroom convergence and the sub-units of analysis that will be used to collect empirical data from Chinese news organisations and to compare their strategies, practices and results with the UK experience. The literature review shows that there is a need for more comparative studies of media convergence strategy in general, and particularly in relation to Chinese media; the study therefore addresses a gap in the understanding of media convergence in China. My contribution is threefold. Firstly, I develop a new and comprehensive model of media convergence and a detailed understanding of the reasons why media companies pursue differing strategies in managing convergence across a wide range of units of analysis. Secondly, the study compares the multimedia strategies of media groups under radically different political systems. Since there is no standard research method or systematic theoretical framework for the study of newsroom convergence, the study develops an integrated perspective, using triangulation of textual analysis, field observation and interviews to explain systematically what the newsroom structure was like in the past, how the copy flow changed, and why. Finally, this case study of media groups can provide an industrial model or framework for other media groups.
Abstract:
The evolution of the electrical grid into a smart grid, allowing user production, storage and exchange of energy, remote control of appliances, and in general optimization of how energy is managed and consumed, is also an evolution into a complex Information and Communication Technology (ICT) system. With the goal of promoting an integrated and interoperable smart grid, a number of organizations all over the world started uncoordinated standardization activities, which caused the emergence of a large number of incompatible architectures and standards. Newer standardization activities now aim to organize the existing standards and to produce best practices for choosing the right approach(es) to employ in specific smart grid designs. This paper follows the lead of the NIST and ETSI/CEN/CENELEC approaches in trying to provide a taxonomy of existing solutions; our contribution reviews and relates the current ICT state of the art, with the objective of forecasting future trends based on the orientation of current efforts and the relationships between them. The resulting taxonomy provides guidelines for further studies of the architectures and highlights how the standards in the last mile of the smart grid are converging towards common solutions that improve ICT infrastructure interoperability.
Abstract:
The intrinsic market forces driving telecom industry convergence have arrived in Brazil. This case presents real characters, a sequence of events, and other public information that have been affecting the two corporations studied here: TIM Brazil and Oi S.A., two top players in the mobile and fixed segments, respectively, of the Brazilian telecom industry. While a merger between the two looks perfect and simple from an operational perspective due to their vertical complementarity, bringing them the opportunity to win with a bundled offer (multi-service package) that would consolidate their market predominance, macroeconomic and internal corporate contrasts between the companies' environments might signal that an impulsive move could carry a high price in the future.
Abstract:
The real convergence hypothesis has spurred a myriad of empirical tests and approaches in the economic literature. This Work Project tests for real output and growth convergence in all N(N-1)/2 possible pairs of output and output-growth gaps of 14 Eurozone countries. The paper follows a time-series approach: it tests for the presence of unit roots and persistence changes in the above-mentioned pairs of output gaps, as well as for the existence of growth convergence with autoregressive models. Overall, significantly greater evidence has been found to support growth convergence than output convergence in our sample.
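As a rough illustration of the pairwise time-series approach described above, the Python sketch below forms all pairwise output gaps and runs an augmented Dickey-Fuller unit-root test on each, as a stand-in for the paper's specific unit-root, persistence-change and autoregressive growth-convergence tests; the data and country names are synthetic placeholders.

import numpy as np
from itertools import combinations
from statsmodels.tsa.stattools import adfuller

def pairwise_gap_unit_root_tests(outputs, names):
    # For every pair of countries, form the output gap and run an ADF test;
    # rejecting a unit root in the gap is read as evidence of convergence.
    results = {}
    for i, j in combinations(range(len(names)), 2):
        gap = outputs[i] - outputs[j]
        stat, pvalue, *_ = adfuller(gap, autolag="AIC")
        results[(names[i], names[j])] = (stat, pvalue)
    return results

# Synthetic example with 3 "countries" (the study uses 14 Eurozone countries).
rng = np.random.default_rng(0)
T = 120
common_trend = np.cumsum(rng.normal(size=T))        # shared stochastic trend
outputs = [common_trend + rng.normal(scale=0.5, size=T) for _ in range(3)]
for pair, (stat, p) in pairwise_gap_unit_root_tests(outputs, ["A", "B", "C"]).items():
    print(pair, "ADF stat = %.2f, p-value = %.3f" % (stat, p))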
Abstract:
Rubisco is responsible for the fixation of CO2 into organic compounds through photosynthesis and thus has great agronomic importance. It is well established that this enzyme suffers from slow catalysis, and its low specificity results in photorespiration, which is considered an energy waste for the plant. However, natural variation exists, and some Rubisco lineages, such as those in C4 plants, exhibit higher catalytic efficiencies coupled with lower specificities. These C4 kinetics could have evolved as an adaptation to the higher CO2 concentration present in C4 photosynthetic cells. In this study, using phylogenetic analyses of a large data set of C3 and C4 monocots, we showed that the rbcL gene, which encodes the large subunit of Rubisco, evolved under positive selection in independent C4 lineages. This confirms that selective pressures on Rubisco have been shifted in C4 plants by the high-CO2 environment prevailing in their photosynthetic cells. Eight rbcL codons evolving under positive selection in C4 clades were involved in parallel changes among the 23 independent monocot C4 lineages included in this study. These amino acids are potentially responsible for the C4 kinetics, and their identification opens new avenues for human-directed Rubisco engineering. The introgression of C4-like high-efficiency Rubisco would strongly enhance C3 crop yields in the future CO2-enriched atmosphere.
Abstract:
To what extent should public utilities regulation be expected to converge across countries? When it occurs, will it generate good outcomes? Building on the core proposition of the New Institutional Economics that similar regulations generate different outcomes depending on their fit with the underlying domestic institutions, we develop a simple model and explore its implications by examining the diffusion of local loop unbundling (LLU) regulations. We argue that one should expect some convergence in public utility regulation, but still with a significant degree of local experimentation, and that this process will produce very different impacts of regulation across countries.
Abstract:
All-electron partitioning of wave functions into products Ψ_core·Ψ_val of core and valence parts in orbital space results in the loss of core-valence antisymmetry, in the uncorrelation of the motion of core and valence electrons, and in core-valence overlap. These effects are studied with the variational Monte Carlo method using appropriately designed wave functions for the first-row atoms and positive ions. It is shown that the loss of antisymmetry with respect to interchange of core and valence electrons is a dominant effect, which increases rapidly through the row, while the effect of core-valence uncorrelation is generally smaller. Orthogonality of the core and valence parts partially substitutes for the exclusion principle and is absolutely necessary for meaningful calculations with partitioned wave functions. Core-valence overlap may lead to nonsensical values of the total energy. It has been found that even relatively crude core-valence partitioned wave functions can generally estimate ionization potentials with better accuracy than traditional, non-partitioned ones, provided that they achieve maximum separation (independence) of core and valence shells accompanied by high internal flexibility of Ψ_core and Ψ_val. Our best core-valence partitioned wave function of that kind estimates the IPs with an accuracy comparable to the most accurate theoretical determinations in the literature.
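Schematically (notation assumed here, with n_c core electrons), the partitioned ansatz discussed above is a simple orbital-space product, in contrast to a fully antisymmetric form:

\Psi_{\mathrm{part}}(\mathbf{r}_1,\dots,\mathbf{r}_N)
  = \Psi_{\mathrm{core}}(\mathbf{r}_1,\dots,\mathbf{r}_{n_c})\,
    \Psi_{\mathrm{val}}(\mathbf{r}_{n_c+1},\dots,\mathbf{r}_N)
\qquad\text{vs.}\qquad
\Psi = \hat{\mathcal{A}}\big[\Psi_{\mathrm{core}}\,\Psi_{\mathrm{val}}\big],

where \hat{\mathcal{A}} antisymmetrizes over core-valence electron exchanges; imposing orthogonality between the core and valence orbitals, \langle\phi_i^{\mathrm{core}}|\phi_a^{\mathrm{val}}\rangle = 0, is the constraint that partially substitutes for the exclusion principle in the product form.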
Abstract:
Optimization of wave functions in quantum Monte Carlo is a difficult task because the statistical uncertainty inherent to the technique makes the absolute determination of the global minimum difficult. To optimize these wave functions, we generate a large number of possible minima using many independently generated Monte Carlo ensembles and performing a conjugate-gradient optimization on each. We then construct histograms of the resulting nominally optimal parameter sets and "filter" them to identify which parameter sets "go together" to generate a local minimum. We follow with correlated-sampling verification runs to find the global minimum. We illustrate this technique for variance and variational-energy optimization for a variety of wave functions for small systems. For such optimized wave functions we calculate the variational energy and variance as well as various non-differential properties. The optimizations are either on par with or superior to determinations in the literature. Furthermore, we show that this technique is sufficiently robust that for molecules one may determine the optimal geometry at the same time as one optimizes the variational energy.
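A minimal Python sketch of the histogram "filtering" step described above, assuming the independently optimized parameter sets are already available as an array; the selection rule (keep the sets whose every component falls in the modal histogram bin) is an illustrative simplification, not the authors' actual procedure.

import numpy as np

def filter_parameter_sets(param_sets, bins=20):
    # param_sets: shape (n_runs, n_params), one row per independently
    # optimized Monte Carlo ensemble. Histogram each parameter and keep the
    # rows whose components all lie in the most populated bin, i.e. the
    # parameter sets that "go together" around a common minimum.
    param_sets = np.asarray(param_sets)
    keep = np.ones(len(param_sets), dtype=bool)
    for j in range(param_sets.shape[1]):
        counts, edges = np.histogram(param_sets[:, j], bins=bins)
        m = np.argmax(counts)
        keep &= (param_sets[:, j] >= edges[m]) & (param_sets[:, j] <= edges[m + 1])
    return param_sets[keep]

# Illustrative synthetic "optimization results": one tight cluster plus outliers.
rng = np.random.default_rng(0)
cluster = rng.normal([0.5, -1.2], 0.02, size=(40, 2))
outliers = rng.normal([0.9, -0.4], 0.30, size=(10, 2))
selected = filter_parameter_sets(np.vstack([cluster, outliers]))
print(len(selected), "parameter sets retained for correlated-sampling verification")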
Abstract:
A new approach to treating large-Z systems by quantum Monte Carlo has been developed. It naturally leads to the notion of the 'valence energy'. The possibilities of the new approach have been explored by optimizing the wave functions for CuH and Cu and computing the dissociation energy and dipole moment of CuH using variational Monte Carlo. The dissociation energy obtained is about 40% smaller than the experimental value; the method is comparable with SCF and simple pseudopotential calculations. The dipole moment differs from the best theoretical estimate by about 50%, which is again comparable with other methods (Complete Active Space SCF and pseudopotential methods).