Abstract:
Background: A full-thickness macular hole (FTMH) is a common retinal condition associated with impaired vision. Randomised controlled trials (RCTs) have demonstrated that surgery, by means of pars plana vitrectomy and post-operative intraocular tamponade with gas, is effective for stage 2, 3 and 4 FTMH. Internal limiting membrane (ILM) peeling has been introduced as an additional surgical manoeuvre to increase the success of the surgery, i.e. to increase rates of hole closure and visual improvement. However, little robust evidence exists supporting the superiority of ILM peeling compared with no-peeling techniques. The purpose of FILMS (Full-thickness macular hole and Internal Limiting Membrane peeling Study) is to determine whether ILM peeling improves the visual function, the anatomical closure of FTMH, and the quality of life of patients affected by this disorder, and to determine the cost-effectiveness of the surgery. Methods/Design: Patients with stage 2-3 idiopathic FTMH of 18 months' duration or less (based on symptoms reported by the participant) and with a visual acuity ≤ 20/40 in the study eye will be enrolled in FILMS at eight sites across the UK and Ireland. Participants will be randomised to receive combined cataract surgery (phacoemulsification and intraocular lens implantation) and pars plana vitrectomy with post-operative intraocular tamponade with gas, with or without ILM peeling. The primary outcome is distance visual acuity at 6 months. Secondary outcomes include distance visual acuity at 3 and 24 months, near visual acuity at 3, 6, and 24 months, contrast sensitivity at 6 months, reading speed at 6 months, anatomical closure of the macular hole at each time point (1, 3, 6, and 24 months), health-related quality of life (HRQOL) at 6 months, costs to the health service and the participant, incremental costs per quality-adjusted life year (QALY), and adverse events. Discussion: FILMS will provide high-quality evidence on the role of ILM peeling in FTMH surgery. © 2008 Lois et al; licensee BioMed Central Ltd.
Abstract:
A six-year prospective study of 144 newly diagnosed, symptomatic diabetic patients aged 40-69 years showed that 21 (15%) required insulin therapy, commencing 1-61 months after diagnosis. The plasma insulin response to oral glucose was assessed at the time of diagnosis. All 12 patients with a very low peak insulin response (less than or equal to 6 mU/l) required insulin therapy. Thirty-six patients had an intermediate insulin response (greater than 6 but less than or equal to 18 mU/l); of these, 7 with a mean weight of 88% (range 73-96%) of average body weight required insulin, while 29 with a mean weight of 117% (range 98-158%) of average body weight did not. Ninety-six patients had a peak insulin response greater than 18 mU/l; 2 patients, whose weights were 96% and 100% of average body weight, required insulin, while the remainder did not. Consideration of initial body weight and peak insulin response provides a useful prediction of the eventual need for insulin.
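A minimal sketch of the prediction rule described above, using only the thresholds stated in the abstract (peak insulin ≤ 6 mU/l, 6-18 mU/l combined with relative body weight, > 18 mU/l); the weight cut-off of 100% of average body weight is a hypothetical simplification for illustration, not a threshold given by the authors:

```python
def likely_needs_insulin(peak_insulin_mU_per_l: float,
                         weight_pct_of_average: float) -> bool:
    """Illustrative restatement of the abstract's prediction rule.

    peak_insulin_mU_per_l: peak plasma insulin response to oral glucose (mU/l)
    weight_pct_of_average: body weight as a percentage of average body weight
    """
    if peak_insulin_mU_per_l <= 6:
        # Very low insulin response: all such patients required insulin.
        return True
    if peak_insulin_mU_per_l <= 18:
        # Intermediate response: outcome depended on body weight
        # (leaner patients tended to need insulin). The 100% cut-off
        # is a hypothetical simplification, not from the study.
        return weight_pct_of_average < 100
    # High response (> 18 mU/l): insulin was rarely required.
    return False


print(likely_needs_insulin(5, 110))   # True  (very low response)
print(likely_needs_insulin(12, 88))   # True  (intermediate, lean)
print(likely_needs_insulin(25, 120))  # False (high response)
```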
Abstract:
The scale of BT's operations necessitates the use of very large scale computing systems, and the storage and management of large volumes of data. Customer product portfolios are an important form of data which can be difficult to store in a space efficient way. The difficulties arise from the inherently structured form of product portfolios, and the fact that they change over time as customers add or remove products. This paper introduces a new data-modelling abstraction called the List_Tree. It has been designed specifically to support the efficient storage and manipulation of customer product portfolios, but may also prove useful in other applications with similar general requirements.
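The abstract does not specify the internal design of the List_Tree, so the sketch below is purely hypothetical: it only illustrates the general requirement the paper describes (a nested, structured portfolio that changes over time as products are added or removed), and all names here (PortfolioNode, add_product, remove_product) are invented for exposition rather than taken from the paper.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical illustration only: the paper's List_Tree structure is not
# described in this abstract, so this layout is invented for exposition.

@dataclass
class PortfolioNode:
    """A node grouping related products, with nested child groups below it."""
    name: str
    products: List[str] = field(default_factory=list)        # leaf-level items
    children: Dict[str, "PortfolioNode"] = field(default_factory=dict)

    def add_product(self, path: List[str], product: str) -> None:
        """Add a product under a nested group path, creating groups as needed."""
        node = self
        for part in path:
            node = node.children.setdefault(part, PortfolioNode(part))
        node.products.append(product)

    def remove_product(self, path: List[str], product: str) -> None:
        """Remove a product from the group at the given path."""
        node = self
        for part in path:
            node = node.children[part]
        node.products.remove(product)


portfolio = PortfolioNode("customer-123")
portfolio.add_product(["broadband"], "fibre-500")
portfolio.add_product(["voice", "mobile"], "sim-only-plan")
portfolio.remove_product(["broadband"], "fibre-500")
```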
Abstract:
In this article, we aim to consider equity’s responses to gifts in a new way. We begin by setting out an account of human values that are associated with donative practices and that lend value to gifts themselves. With this map of the values associated with gifts in view, we then turn to consider some equitable responses to gifts, arranged roughly on a spectrum in accordance with the measure of scepticism towards gifts that they might, at first glance, seem to entail. We discuss, in turn: (a) equity’s treatment of imperfect gifts; (b) equity’s treatment of promises to give; (c) the position in equity of donee recipients of misapplied trust assets; (d) the presumptions of resulting trust and (e) advancement; and (f) equity’s treatment of mistaken gifts. With respect to each type of case, we evaluate equity’s response to gifts in light of the range of human values associated with gifts. We conclude by examining some broad themes that emerge from this analysis, and in particular the extent to which equity might achieve a greater accommodation of donative values consistent with the demands of the rule of law.
Abstract:
Dissolved Air Flotation (DAF) is a well-known coagulation-flotation system applied at large scale for microalgae harvesting. Compared to conventional harvesting technologies, DAF allows high cell recovery at lower energy demand. By replacing microbubbles with microspheres, the innovative Ballasted Dissolved Air Flotation (BDAF) technique has been reported to achieve the same algae cell removal efficiency while saving up to 80% of the energy required for the conventional DAF unit. Using three different algae cultures (Scenedesmus obliquus, Chlorella vulgaris and Arthrospira maxima), the present work investigated the practical, economic and environmental advantages of the BDAF system compared to the DAF system. 99% cell separation was achieved with both systems; nevertheless, the BDAF technology allowed up to 95% coagulant reduction, depending on the algae species and the pH conditions adopted. In terms of floc structure and strength, the inclusion of microspheres in the algae floc generated a looser aggregate, with a more compact structure for single-celled algae than for large, filamentous cells. Overall, BDAF appeared to be a more reliable and sustainable harvesting system than DAF, as it allowed equal cell recovery while reducing energy inputs, coagulant demand and carbon emissions. © 2014 Elsevier Ltd.
Abstract:
In this paper we investigate the first- and second-order characteristics of the received signal at the output of hypothetical selection, equal-gain and maximal-ratio combiners which utilize spatially separated antennas at the base station. Considering a range of human body movements, we model the small-scale fading characteristics of the signal using diversity-specific analytical equations which take into account the number of available signal branches at the receiver. It is shown that these equations provide an excellent fit to the measured channel data. Furthermore, for many hypothetical diversity receiver configurations, the Nakagami-m parameter was found to be close to 1.
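As an illustration of the three classical combining schemes named above (not the paper's measurement analysis), the sketch below applies selection, equal-gain and maximal-ratio combining to synthetic i.i.d. Rayleigh branch gains and computes a moment-based Nakagami-m estimate of each combiner output; with synthetic independent branches the combined envelopes yield m larger than the single-branch value of 1, whereas the measured body-movement channels in the paper gave m close to 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for measured branch gains: N i.i.d. Rayleigh branches.
n_branches, n_samples = 2, 100_000
h = (rng.standard_normal((n_branches, n_samples)) +
     1j * rng.standard_normal((n_branches, n_samples))) / np.sqrt(2)
env = np.abs(h)

sc  = env.max(axis=0)                          # selection combining
egc = env.sum(axis=0) / np.sqrt(n_branches)    # equal-gain combining
mrc = np.sqrt((env ** 2).sum(axis=0))          # maximal-ratio combining

def nakagami_m(r: np.ndarray) -> float:
    """Moment-based Nakagami-m estimate: m = E[r^2]^2 / Var(r^2)."""
    return float(np.mean(r ** 2) ** 2 / np.var(r ** 2))

for name, r in [("single branch", env[0]), ("SC", sc), ("EGC", egc), ("MRC", mrc)]:
    print(f"{name}: estimated Nakagami-m = {nakagami_m(r):.2f}")
```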
Abstract:
We consider the uplink of massive multicell multiple-input multiple-output systems, where the base stations (BSs), equipped with massive arrays, serve simultaneously several terminals in the same frequency band. We assume that the BS estimates the channel from uplink training, and then uses the maximum ratio combining technique to detect the signals transmitted from all terminals in its own cell. We propose an optimal resource allocation scheme which jointly selects the training duration, training signal power, and data signal power in order to maximize the sum spectral efficiency, for a given total energy budget spent in a coherence interval. Numerical results verify the benefits of the optimal resource allocation scheme. Furthermore, we show that more training signal power should be used at low signal-to-noise ratios (SNRs), and vice versa at high SNRs. Interestingly, for the entire SNR regime, the optimal training duration is equal to the number of terminals.
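A hedged sketch of the kind of optimisation described above, written in generic notation that is not taken from the paper: with a coherence interval of T symbols, training duration τ, training power p_t, data power p_d, K terminals, and orthogonal pilots requiring τ ≥ K, the joint allocation might be posed as

```latex
\begin{aligned}
\max_{\tau,\; p_t,\; p_d} \quad & \left(1 - \frac{\tau}{T}\right)
  \sum_{k=1}^{K} \log_2\!\bigl(1 + \mathrm{SINR}_k(\tau, p_t, p_d)\bigr) \\
\text{subject to} \quad & \tau\, p_t + (T - \tau)\, p_d \le E_{\mathrm{total}},
  \qquad \tau \ge K,
\end{aligned}
```

and the abstract's finding is that the optimum sits at the boundary τ = K for the entire SNR regime.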
Abstract:
We consider a multipair decode-and-forward relay channel, where multiple sources simultaneously transmit their signals to multiple destinations with the help of a full-duplex relay station. We assume that the relay station is equipped with massive arrays, while all sources and destinations have a single antenna. The relay station uses channel estimates obtained from received pilots and zero-forcing (ZF) or maximum-ratio combining/maximum-ratio transmission (MRC/MRT) to process the signals. To reduce significantly the loop interference effect, we propose two techniques: i) using a massive receive antenna array; or ii) using a massive transmit antenna array together with very low transmit power at the relay station. We derive an exact achievable rate in closed form for MRC/MRT processing and an analytical approximation of the achievable rate for ZF processing. This approximation is very tight, especially for a large number of relay station antennas. These closed-form expressions enable us to determine the regions where the full-duplex mode outperforms the half-duplex mode, as well as to design an optimal power allocation scheme. This optimal power allocation scheme aims to maximize the energy efficiency for a given sum spectral efficiency and under peak power constraints at the relay station and sources. Numerical results verify the effectiveness of the optimal power allocation scheme. Furthermore, we show that, by doubling the number of transmit/receive antennas at the relay station, the transmit power of each source and of the relay station can be reduced by 1.5 dB if the pilot power is equal to the signal power, and by 3 dB if the pilot power is kept fixed, while maintaining a given quality of service.
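The 3 dB and 1.5 dB figures are consistent with the standard massive-MIMO power-scaling laws (stated here in generic terms, not as the paper's derivation): if transmit power can be reduced in proportion to 1/M when the pilot power is kept fixed, and in proportion to 1/√M when the pilot power scales together with the signal power, then doubling the number of antennas M allows

```latex
10 \log_{10} 2 \approx 3.01\ \text{dB}
\qquad\text{and}\qquad
10 \log_{10} \sqrt{2} = \tfrac{1}{2}\, 10 \log_{10} 2 \approx 1.51\ \text{dB}
```

of power reduction in the two cases, respectively.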
Abstract:
This article in one of the leading German journals on labour law analyses the shortcomings of German labour law at the time (2004) in relation to the EU non-discrimination directives. It states that the reluctance to legislate against race, sex and disability discrimination must be overcome, if the demands of the directives are to be fulfilled. It also explains how those forms of discrimination could already be addressed by interpreting German labour law in line with those directives and constitutional requirements. Only in 2006 was the relevant legislation finally passed (three years later than required).
Abstract:
Traditionally, audio-motor timing processes have been understood as motor output from an internal clock, the speed of which is set by heard sound pulses. In contrast, this paper proposes a more ecologically-grounded approach, arguing that audio-motor processes are better characterized as performed actions on the perceived structure of auditory events. This position is explored in the context of auditory sensorimotor synchronization and continuation timing. Empirical research shows that the structure of sounds as auditory events can lead to marked differences in movement timing performance. The nature of these effects is discussed in the context of perceived action-relevance of auditory event structure. It is proposed that different forms of sound invite or support different patterns of sensorimotor timing. Hence, the temporal information in looped auditory signals is more than just the interval durations between onsets: all metronomes are not created equal. The potential implications for auditory guides in motor performance enhancement are also described.
Abstract:
This paper considers inference from multinomial data and addresses the problem of choosing the strength of the Dirichlet prior under a mean-squared error criterion. We compare the Maximum Likelihood Estimator (MLE) and the most commonly used Bayesian estimators obtained by assuming a prior Dirichlet distribution with non-informative prior parameters, that is, the parameters of the Dirichlet are equal and altogether sum up to the so-called strength of the prior. Under this criterion, the MLE becomes preferable to the Bayesian estimators as the number of categories k of the multinomial increases, because the non-informative Bayesian estimators induce a region where they are dominant that quickly shrinks as k increases. This can be avoided if the strength of the prior is not kept constant but decreased with the number of categories. We argue that the strength should decrease at least k times faster than it does in the usual estimators.
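A minimal Monte-Carlo sketch of the comparison described above (an illustration, not the paper's experiments): the Bayesian estimator is taken as the posterior mean under a symmetric Dirichlet prior whose parameters sum to a fixed strength s, i.e. (counts + s/k) / (n + s), and its mean-squared error is compared with that of the MLE as the number of categories k grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def mse(estimates: np.ndarray, truth: np.ndarray) -> float:
    """Squared error of the probability estimates, averaged over trials."""
    return float(np.mean(np.sum((estimates - truth) ** 2, axis=-1)))

def compare(k: int, n: int = 50, strength: float = 2.0, trials: int = 5000):
    """Compare the MLE with a symmetric-Dirichlet posterior-mean estimator."""
    theta = rng.dirichlet(np.ones(k))              # one "true" multinomial
    counts = rng.multinomial(n, theta, size=trials).astype(float)
    mle = counts / n                               # maximum likelihood
    bayes = (counts + strength / k) / (n + strength)  # fixed prior strength s
    return mse(mle, theta), mse(bayes, theta)

for k in (3, 10, 100):
    m, b = compare(k)
    print(f"k={k:4d}  MSE(MLE)={m:.4f}  MSE(Bayes, s=2)={b:.4f}")
```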
Abstract:
Background: Ivacaftor has shown a clinical benefit in patients with cystic fibrosis who have the G551D-CFTR mutation and reduced lung function. Lung clearance index (LCI) using multiple-breath washout might be an alternative to, and more sensitive method than, forced expiratory volume in 1 s (FEV1) to assess treatment response in the growing number of children and young adults with cystic fibrosis who have normal spirometry. The aim of the study was to assess the treatment effects of ivacaftor on LCI in patients with cystic fibrosis, a G551D-CFTR mutation, and an FEV1 >90% predicted. Methods: This phase 2, multicentre, placebo-controlled, double-blind 2×2 crossover study of ivacaftor treatment was conducted in patients with cystic fibrosis, at least one G551D-CFTR allele, and an FEV1 >90% predicted. Patients also had to have an LCI higher than 7·4 at screening, age of 6 years or older, and a weight of 15 kg or more. Eligible patients were randomly allocated to receive one of two treatment sequences (placebo first followed by ivacaftor 150 mg twice daily [sequence 1] or ivacaftor 150 mg twice daily first followed by placebo [sequence 2]) of 28 days' treatment in each period, with a 28-day washout between the two treatment periods. Randomisation (ratio 1:1) was done with block sizes of 4, and all site personnel including the investigator, the study monitor, and the Vertex study team were masked to treatment assignment. The primary outcome measure was change from baseline in LCI. The study is registered at ClinicalTrials.gov, NCT01262352. Findings: Between February and November 2011, 21 patients were enrolled, of whom 11 were assigned to the sequence 1 group and 10 to the sequence 2 group. 20 of these patients received treatment and 17 completed the trial (eight in the sequence 1 group and nine in the sequence 2 group). Treatment with ivacaftor led to significant improvements compared with placebo in LCI (difference between groups in the average of mean changes from baseline at days 15 and 29 was -2·16 [95% CI -2·88 to -1·44]; p<0·0001). Adverse events experienced by study participants were similar between treatment groups; at least one adverse event was reported by 15 (79%) of 19 patients who received placebo and 13 (72%) of 18 patients who received ivacaftor. No deaths occurred during the study period. Interpretation: In patients with cystic fibrosis aged 6 years or older who have at least one G551D-CFTR allele, ivacaftor led to improvements in LCI. LCI might be a more sensitive alternative to FEV1 in detecting response to intervention in these patients with mild lung disease. Funding: Vertex Pharmaceuticals Incorporated. © 2013 Elsevier Ltd.
Abstract:
The power system of the future will have a hierarchical structure created by layers of system control, from the Supergrid via regional high-voltage transmission through to medium- and low-voltage distribution. Each level will have its own generation sources: large-scale offshore wind, wave, solar thermal and nuclear connected directly to the Supergrid, and high levels of embedded generation connected to the medium-voltage distribution system. It is expected that the fuel portfolio will be dominated by offshore wind in Northern Europe and PV in Southern Europe. The strategies required to manage the coordination of supply-side variability with demand-side variability will include large-scale interconnection, demand-side management, load aggregation and storage in the concept of the Supergrid combined with the Smart Grid. The design challenge associated with this will include not only control topology, data acquisition, analysis and communications technologies, but also the selection of the fuel portfolio at a macro level. This paper quantifies the amount of demand-side management, storage and so-called ‘back-up generation’ needed to support an 80% renewable energy portfolio in Europe by 2050.
Abstract:
We present Hubble Space Telescope (HST) rest-frame ultraviolet imaging of the host galaxies of 16 hydrogen-poor superluminous supernovae (SLSNe), including 11 events from the Pan-STARRS Medium Deep Survey. Taking advantage of the superb angular resolution of HST, we characterize the galaxies' morphological properties, sizes, and star formation rate (SFR) densities. We determine the supernova (SN) locations within the host galaxies through precise astrometric matching and measure physical and host-normalized offsets as well as the SN positions within the cumulative distribution of UV light pixel brightness. We find that the host galaxies of H-poor SLSNe are irregular, compact dwarf galaxies, with a median half-light radius of just 0.9 kpc. The UV-derived SFR densities are high (⟨Σ_SFR⟩ ≃ 0.1 M⊙ yr⁻¹ kpc⁻²), suggesting that SLSNe form in overdense environments. Their locations trace the UV light of their host galaxies, with a distribution intermediate between that of long-duration gamma-ray bursts (LGRBs; which are strongly clustered on the brightest regions of their hosts) and a uniform distribution (characteristic of normal core-collapse SNe), though they cannot be statistically distinguished from either with the current sample size. Taken together, this strengthens the picture that SLSN progenitors require different conditions than those of ordinary core-collapse SNe to form and that they explode in broadly similar galaxies as do LGRBs. If the tendency for SLSNe to be less clustered on the brightest regions than are LGRBs is confirmed by a larger sample, this would indicate a different, potentially lower-mass progenitor for SLSNe than LGRBs.
Abstract:
Jellyfish are highly topical within studies of pelagic food-webs and there is a growing realisation that their role is more complex than once thought. Efforts being made to include jellyfish within fisheries and ecosystem models are an important step forward, but our present understanding of their underlying trophic ecology can lead to their oversimplification in these models. Gelatinous zooplankton represent a polyphyletic assemblage spanning >2,000 species that inhabit environments from coastal seas to the deep ocean and employ a wide variety of foraging strategies. Despite this diversity, many contemporary modelling approaches include jellyfish as a single functional group feeding at one or two trophic levels at most. Recent reviews have drawn attention to this issue and highlighted the need for improved communication between biologists and theoreticians if this problem is to be overcome. We used stable isotopes to investigate the trophic ecology of three co-occurring scyphozoan jellyfish species (Aurelia aurita, Cyanea lamarckii and C. capillata) within a temperate, coastal food-web in the NE Atlantic. Using information on individual size, time of year, and δ¹³C and δ¹⁵N stable isotope values, we examined: (1) whether all jellyfish could be considered as a single functional group, or showed distinct inter-specific differences in trophic ecology; (2) whether size-based shifts in trophic position, found previously in A. aurita, were a common trait across species; and (3) whether, when considered collectively, the trophic position of the three sympatric species remained constant over time. Differences in δ¹⁵N (trophic position) were evident between all three species, with size-based and temporal shifts in δ¹⁵N apparent in A. aurita and C. capillata. The isotopic niche width for all species combined increased throughout the season, reflecting temporal shifts in trophic position and seasonal succession in these gelatinous species. Taken together, these findings support previous assertions that jellyfish require more robust inclusion in marine fisheries or ecosystem models.
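For context, δ¹⁵N values are conventionally converted into trophic-position estimates with the standard baseline formula below (a general convention, not necessarily the exact parameterisation used in this study), where λ is the trophic level of the isotopic baseline organism and Δδ¹⁵N is the assumed per-level enrichment, often taken as roughly 3.4‰:

```latex
\mathrm{TP}_{\text{consumer}} \;=\; \lambda \;+\;
\frac{\delta^{15}\mathrm{N}_{\text{consumer}} - \delta^{15}\mathrm{N}_{\text{baseline}}}
     {\Delta\delta^{15}\mathrm{N}}
```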