19 results for arbitration proceeding

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

Upper-air observations from radiosondes and microwave satellite instruments do not indicate any global warming during the last 19 years, contrary to surface measurements, where a warming trend has reportedly been found. This result is somewhat difficult to reconcile, since climate model experiments indicate the reverse trend, namely, that upper-tropospheric air should warm faster than the surface. To contribute toward an understanding of this difficulty, we have undertaken specific experiments to study the effect on climate of the decrease in stratospheric ozone and the Mount Pinatubo eruption in 1991. The associated forcing was added to the forcing from greenhouse gases, sulfate aerosols (direct and indirect effect), and tropospheric ozone, which was investigated in a separate series of experiments. Furthermore, we have undertaken an ensemble study in order to explore the natural variability of an advanced climate model exposed to such a forcing over 19 years. The results show that the reduction of stratospheric ozone cools not only the lower stratosphere but also the troposphere, in particular its upper and middle parts. In the upper troposphere the cooling from stratospheric ozone leads to a significant reduction of greenhouse warming. The modeled stratospheric aerosols from Mount Pinatubo generate a climate response (stratospheric warming and tropospheric cooling) in good agreement with microwave satellite measurements. Finally, analysis of a series of experiments including both the stratospheric ozone and the Mount Pinatubo effects shows considerable variability in the climate response, suggesting that an evolution with no warming over the period is as likely as one showing modest warming. However, the observed pattern of no warming in the midtroposphere combined with clear warming at the surface is not found in the model simulations.

Relevance:

20.00%

Publisher:

Abstract:

As a result of the sovereign debt crisis that engulfed Europe in 2010, investors are much more likely to pursue dispute-resolution options when faced with losses. This paper examines the position of investors who suffered losses in the Greek haircut of 2012 in the context of investment treaty arbitration. It evaluates arguments that investments in Greek sovereign bonds were expropriated by the introduction of retrofit CACs and that compensation is payable as a result of the protections offered by BITs. The paper investigates whether sovereign bonds fall within the definition of protected investment in BITs, assesses the degree to which CACs act as a jurisdictional bar to investor-state claims, and evaluates whether such claims could succeed, drawing on recent cases brought against Greece at ICSID as illustrations. The paper concludes by considering whether the Greek haircut was expropriatory and reflects on the possible outcome of current arbitrations.

Relevance:

10.00%

Publisher:

Abstract:

A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant bias in the randomness of the array, which would lengthen the time the algorithm takes to converge to a solution.

1 Introduction

In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
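The reseeding scheme lends itself to a simple software model. The Python sketch below is a minimal illustration, not the proposed VLSI design: it chains mixed (linear) congruential generators and reseeds each one from the output of its predecessor after every step. The modulus, multiplier, increment, array size, and XOR reseeding rule are all illustrative assumptions.

    # Software sketch of a systolic chain of mixed congruential generators.
    # All constants below are illustrative assumptions, not hardware parameters.
    M = 2**16           # modulus
    A = 25173           # multiplier
    C = 13849           # non-zero increment (hence "mixed" congruential)

    class MixedLCG:
        def __init__(self, seed):
            self.state = seed % M

        def next(self):
            self.state = (A * self.state + C) % M
            return self.state

    def step_array(gens):
        """Advance every generator once, then reseed each generator (except
        the first) with the fresh output of the one preceding it."""
        outputs = [g.next() for g in gens]
        for i in range(1, len(gens)):
            gens[i].state ^= outputs[i - 1]   # mix in the preceding output
        return outputs

    # Example: an 8-element chain, one output word per generator per step.
    gens = [MixedLCG(seed) for seed in range(1, 9)]
    for _ in range(3):
        print(step_array(gens))

The appeal of a systolic layout is that each cell communicates only with its neighbour, so the reseeding path in hardware is a short local connection rather than a global bus.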

Relevance:

10.00%

Publisher:

Abstract:

Background and Purpose: Clinical research into the treatment of acute stroke is complicated, is costly, and has often been unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design, and the appropriate sample size, of phase II studies in stroke based on lesion volume.

Methods: The relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo-arm patients in the Virtual International Stroke Trials Archive. The size of an effect on lesion volume that would lead to a clinically relevant treatment effect on a measure such as the modified Rankin score (mRS) is found, and the sample size needed to detect an effect of that magnitude on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III.

Results: The odds ratios for mRS correspond roughly to the square root of the odds ratios for lesion volume, implying that, for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxation of the power requirements, appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo with a total sample size of 1518 patients might be motivated by a phase II trial of 126 patients comparing the same 2 treatment arms.

Discussion: Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
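The square-root relation is what drives the sample-size saving: if the mRS odds ratio is the square root of the lesion-volume odds ratio, the lesion-volume log odds ratio is twice as large, and under the usual normal approximation the required sample size scales as 1/(log OR)^2, so it drops by a factor of 4. The Python sketch below works through that arithmetic; the odds ratio, nuisance scale, and power settings are illustrative assumptions, not the values used in the paper's simulations.

    import math
    from statistics import NormalDist

    def n_per_group(log_or, sd=1.8, alpha=0.05, power=0.80):
        """Approximate per-group sample size to detect a log odds ratio with a
        two-sided test (normal approximation; sd is an assumed nuisance scale
        for the per-subject log-odds comparison)."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)
        z_b = NormalDist().inv_cdf(power)
        return 2 * ((z_a + z_b) * sd / log_or) ** 2

    or_mrs = 1.4                  # assumed treatment odds ratio on mRS
    or_lesion = or_mrs ** 2       # lesion-volume OR is roughly its square
    n_mrs = n_per_group(math.log(or_mrs))
    n_lesion = n_per_group(math.log(or_lesion))
    print(f"mRS: {n_mrs:.0f}/group, lesion volume: {n_lesion:.0f}/group, "
          f"ratio = {n_mrs / n_lesion:.1f}")   # ratio is exactly 4 here

Further savings in phase II then come from relaxing alpha and power, which shrinks (z_a + z_b)^2 directly.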

Relevance:

10.00%

Publisher:

Abstract:

With the latest advances in advanced computer architectures, large-scale machines at the petascale level are already in operation and exascale computing is under discussion. All of these require efficient, scalable algorithms in order to bridge the performance gap. In this paper, examples of various approaches to designing scalable algorithms for such advanced architectures are given, and the key properties of these algorithms are outlined and discussed. The examples cover scalable algorithms applied to large-scale problems in areas such as Computational Biology and Environmental Modelling.

Relevance:

10.00%

Publisher:

Abstract:

Progress is reported in the development of a new synthesis method for the design of filters and coatings for use in spaceborne infrared optics. This method uses the Golden Section optimization routine to search, over designated dielectric thin-film combinations, for the coating design that fulfills the spectral requirements. The final design is the one that uses the fewest layers for the thin-film materials given in the starting design. This synthesis method has been used successfully to design broadband anti-reflection coatings on infrared substrates. The 6 µm to 18 µm anti-reflection coating for the germanium optics of the HIRDLS instrument, to be flown on the NASA EOS-Chem satellite, is given as an example. By correctly defining the target function to describe any specific type of filter in the optimization part of the method, this synthesis method may be used to design general filters for use in spaceborne infrared optics.
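Golden Section search itself is a standard derivative-free minimizer over a bracketing interval. The Python sketch below shows the routine applied to a toy merit function standing in for the spectral-performance target; the merit function, the thickness bracket, and the use of a single design variable are illustrative assumptions, not the HIRDLS design problem.

    import math

    def golden_section_minimize(f, a, b, tol=1e-6):
        """Minimize a unimodal function f on [a, b] by golden-section search,
        reusing one function evaluation per iteration."""
        invphi = (math.sqrt(5) - 1) / 2            # 1/phi, about 0.618
        c, d = b - invphi * (b - a), a + invphi * (b - a)
        fc, fd = f(c), f(d)
        while (b - a) > tol:
            if fc < fd:                            # minimum lies in [a, d]
                b, d, fd = d, c, fc
                c = b - invphi * (b - a)
                fc = f(c)
            else:                                  # minimum lies in [c, b]
                a, c, fc = c, d, fd
                d = a + invphi * (b - a)
                fd = f(d)
        return (a + b) / 2

    # Toy merit function: squared deviation of one layer thickness (in µm)
    # from a notional optimum. A real merit function would score the coating's
    # reflectance across the whole 6-18 µm band.
    merit = lambda t: (t - 1.23) ** 2
    print(f"best thickness = {golden_section_minimize(merit, 0.5, 2.0):.4f} µm")

Golden-section search assumes the merit function is unimodal on the bracket, which is one reason the choice of starting design matters in practice.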

Relevance:

10.00%

Publisher:

Abstract:

Tissue culture in the oil palm business is generally concerned with the multiplication (clonal production) of dura, pisifera and tenera palms. These are all normal diploids (2n=2x=36). Sumatra Bioscience has pioneered haploid tissue culture of oil palm (n=x=18). Haploid oil palm is the first step in producing doubled haploid palms, which in turn provide parental lines for F1 hybrid production. Chromosome doubling is known to occur during embryogenesis in other haploid cultures, e.g. barley anther culture. Haploid tissue cultures in oil palm were therefore set up to investigate and exploit spontaneous chromosome doubling during embryogenesis. Flow cytometry of embryogenic tissue showed the presence of both haploid (n) and doubled haploid (2n) cells, indicating spontaneous doubling. Completely doubled haploid ramets were regenerated, suggesting that doubling occurred during the first mitoses of embryogenesis. This is the first report of doubled haploid production in oil palm via haploid tissue culture. The method provides a means of producing a range of doubled haploids in oil palm from the 1,000-plus haploids available at Sumatra Bioscience; in addition, the method also produced doubled haploid (and haploid) clones.

Relevance:

10.00%

Publisher:

Abstract:

The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. The determination of the local structure, in terms of the configuration and conformation of the polymer chain and of the packing characteristics of the chain in the bulk material, represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret due to the very large amount of information they carry, that is, the large number of correlations present in the diffraction patterns.

We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. The coupling of different computer modelling strategies with neutron scattering data over a wide Q range allows the extraction of detailed quantitative information on the structural arrangements of the materials of interest. Proceeding from modelling routes as diverse as force field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, which illustrate the need to attack the data-analysis problem simultaneously from several fronts.
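Of the modelling routes named above, reverse Monte Carlo is the most algorithmic: atoms are moved at random and a move is kept when it improves the fit between a computed structural quantity and the measured one. The Python sketch below is a deliberately minimal one-dimensional illustration of that accept/reject loop; the synthetic target histogram, box size, move width, and tolerance are all invented for the example, and real RMC refines a 3-D configuration against neutron S(Q) or g(r) data.

    # Minimal 1-D reverse Monte Carlo sketch: random single-atom moves are
    # accepted when they reduce the mismatch chi^2 between a computed pair-
    # separation histogram and a "measured" target (synthetic here).
    import numpy as np

    rng = np.random.default_rng(0)
    L, N, NBINS = 100.0, 100, 40          # box length, atoms, histogram bins

    def pair_histogram(x):
        """Normalized histogram of periodic pair separations (a g(r) stand-in)."""
        d = np.abs(x[:, None] - x[None, :])[np.triu_indices(len(x), 1)]
        d = np.minimum(d, L - d)          # minimum-image convention
        h, _ = np.histogram(d, bins=NBINS, range=(0.0, L / 2))
        return h / h.sum()

    # Synthetic "experimental" target: a jittered lattice, i.e. a peaked g(r).
    ref = (np.arange(N) * (L / N) + rng.normal(0, 0.1, N)) % L
    target = pair_histogram(ref)

    x = rng.uniform(0, L, N)              # random starting configuration
    chi2 = np.sum((pair_histogram(x) - target) ** 2)
    for step in range(5000):
        i = rng.integers(N)
        old = x[i]
        x[i] = (old + rng.normal(0, 1.0)) % L          # trial move
        new_chi2 = np.sum((pair_histogram(x) - target) ** 2)
        # Keep improving moves; occasionally keep worsening ones so the
        # refinement does not lock into the first local minimum it finds.
        if new_chi2 < chi2 or rng.random() < np.exp((chi2 - new_chi2) / 1e-4):
            chi2 = new_chi2
        else:
            x[i] = old                                  # reject: restore atom
    print(f"final chi^2 = {chi2:.3e}")

Recomputing the full histogram after every move keeps the sketch short; production RMC codes update only the pair distances involving the moved atom.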