79 results for minimum coverage requirement


Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the frequency of extreme events for three LIFFE futures contracts for the calculation of minimum capital risk requirements (MCRRs). We propose a semiparametric approach where the tails are modelled by the Generalized Pareto Distribution and smaller risks are captured by the empirical distribution function. We compare the capital requirements from this approach with those calculated from the unconditional density and from a conditional density - a GARCH(1,1) model. Our primary finding is that, both in-sample and for a hold-out sample, our extreme value approach yields results superior to those of the other two models, which do not explicitly model the tails of the return distribution. Since the use of these internal models will be permitted under the EC-CAD II, they could be widely adopted in the near future for determining capital adequacy. Hence, close scrutiny of competing models is required to avoid a potentially costly misallocation of capital resources while at the same time ensuring the safety of the financial system.
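To make the tail-plus-body construction concrete, the sketch below is a minimal Python illustration, not the paper's MCRR procedure: the threshold choice, the `tail_fraction` parameter and the simulated returns are assumptions. It estimates a high loss quantile by fitting a Generalized Pareto Distribution to threshold exceedances and falling back to the empirical distribution for smaller risks.

```python
import numpy as np
from scipy.stats import genpareto

def semiparametric_var(returns, p=0.99, tail_fraction=0.10):
    """Estimate a high loss quantile by modelling the tail with a GPD
    fitted to threshold exceedances and the body with the empirical
    distribution function (illustrative sketch only)."""
    losses = -np.asarray(returns)                   # work with losses
    n = losses.size
    u = np.quantile(losses, 1.0 - tail_fraction)    # tail threshold (assumed)
    exceedances = losses[losses > u] - u
    n_u = exceedances.size

    if p <= 1.0 - n_u / n:
        # Quantile lies in the body: use the empirical distribution.
        return np.quantile(losses, p)

    # Fit the GPD to exceedances with location fixed at zero.
    xi, _, sigma = genpareto.fit(exceedances, floc=0)
    # Peaks-over-threshold quantile formula.
    return u + (sigma / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

# Example with simulated heavy-tailed returns (not futures data).
rng = np.random.default_rng(0)
simulated = rng.standard_t(df=4, size=5000) * 0.01
print(semiparametric_var(simulated, p=0.99))
```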

Relevance:

20.00%

Publisher:

Abstract:

The recent low and prolonged minimum of the solar cycle, along with the slow growth in activity of the new cycle, has led to suggestions that the Sun is entering a Grand Solar Minimum (GSMi), potentially as deep as the Maunder Minimum (MM). This raises questions about the persistence and predictability of solar activity. We study the autocorrelation functions and predictability R^2_L(t) of solar indices, particularly the group sunspot number R_G and the heliospheric modulation potential phi, for which we have data during the descent into the MM. For R_G and phi, R^2_L(t) > 0.5 for times into the future of t = 4 and 3 solar cycles, respectively: sufficient to allow prediction of a GSMi onset. The lower predictability of the sunspot number R_Z is discussed. The current declines in peak and mean R_G are the largest since the onset of the MM and exceed those around 1800, which failed to initiate a GSMi.
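As a rough illustration of the kind of predictability statistic discussed above, the following sketch computes the squared lagged autocorrelation of an activity index. This is only a proxy for the paper's R^2_L(t), and the synthetic 11-year series stands in for the actual R_G or phi records.

```python
import numpy as np

def lag_predictability(series, max_lag):
    """Squared lagged autocorrelation, used here as a simple proxy for the
    predictability of a solar-activity index at a given lead time."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    r2 = []
    for lag in range(1, max_lag + 1):
        r = np.corrcoef(x[:-lag], x[lag:])[0, 1]
        r2.append(r * r)
    return np.array(r2)

# Illustrative use with a synthetic 11-year cycle plus noise
# (real analyses would use annual R_G or phi reconstructions).
years = np.arange(1700, 2000)
rng = np.random.default_rng(1)
fake_index = 60 * (1 + np.sin(2 * np.pi * years / 11.0)) + rng.normal(0, 10, years.size)
print(lag_predictability(fake_index, max_lag=44)[10::11])  # lags of 1-4 cycles
```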

Relevance:

20.00%

Publisher:

Abstract:

The adsorption of carbon monoxide on the Pt{110} surface at coverages of 0.5 ML and 1.0 ML was investigated using quantitative low-energy electron diffraction (LEED IV) and density-functional theory (DFT). At 0.5 ML CO lifts the reconstruction of the clean surface but does not form an ordered overlayer. At the saturation coverage, 1.0 ML, a well-ordered p(2×1) superstructure with glide line symmetry is formed. It was confirmed that the CO molecules adsorb on top of the Pt atoms in the top-most substrate layer with the molecular axes tilted by ±22° with respect to the surface normal in alternating directions away from the close-packed rows of Pt atoms. This is accompanied by significant lateral shifts of 0.55 Å away from the atop sites in the same direction as the tilt. The top-most substrate layer relaxes inwards by −4% with respect to the bulk-terminated atom positions, while the consecutive layers only show minor relaxations. Despite the lack of long-range order in the 0.5 ML CO layer it was possible to determine key structural parameters by LEED IV using only the intensities of the integer-order spots. At this coverage CO also adsorbs on atop sites, with the molecular axis closer to the surface normal (≤10°). The average substrate relaxations in each layer are similar for both coverages and consistent with DFT calculations performed for a variety of ordered structures with coverages of 1.0 ML and 0.5 ML.

Relevance:

20.00%

Publisher:

Abstract:

Background: Currently, all pharmacists and technicians registered with the Royal Pharmaceutical Society of Great Britain must complete a minimum of nine Continuing Professional Development (CPD) records (entries) each year. From September 2010 a new regulatory body, the General Pharmaceutical Council, will oversee the regulation (including revalidation) of all pharmacy registrants in Great Britain. CPD may provide part of the supporting evidence that a practitioner submits to the regulator as part of the revalidation process. Gaps in knowledge necessitated further research to examine the usefulness of CPD in pharmacy revalidation.

Project aims: The overall aims of this project were to summarise pharmacy professionals' past involvement in CPD, examine the usability of current CPD entries for the purpose of revalidation, and examine the impact of 'revalidation standards' and a bespoke Outcomes Framework on the conduct and construction of CPD entries for the future revalidation of pharmacy professionals. We completed a comprehensive review of the literature; devised, validated and tested the impact of a new CPD Outcomes Framework and related training material in an empirical investigation involving volunteer pharmacy professionals; and spoke with our participants to bring meaning and understanding to the process of CPD conduct and recording and to gain feedback on the study itself.

Key findings: The comprehensive literature review identified perceived barriers to CPD and resulted in recommendations that could potentially rectify pharmacy professionals' perceptions and facilitate participation in CPD. The CPD Outcomes Framework can be used to score CPD entries. Compared to a control (CPD and 'revalidation standards' only), we found that training participants to apply the CPD Outcomes Framework resulted in entries that scored significantly higher in the context of a quantitative method of CPD assessment. Feedback from participants who had received the CPD Outcomes Framework was positive, and a number of useful suggestions were made about improvements to the Framework and related training. Entries scored higher because participants had consciously applied concepts linked to the CPD Outcomes Framework, whereas entries scored lower where participants had been unable to apply the concepts of the Framework for a variety of reasons, including limitations posed by the 'Plan & Record' template. Feedback about the nature of the 'revalidation standards' and their application to CPD was not positive, and participants had, in the main, not sought to apply the standards to their CPD entries, although those in the intervention group were more likely to have referred to the revalidation standards for their CPD. As assessors, we too found the process of selecting and assigning 'revalidation standards' to individual CPD entries burdensome and somewhat unspecific. We believe that addressing the perceived barriers and drawing on the facilitators will help deal with the apparent lack of engagement with the revalidation standards, and we have been able to make a set of relevant recommendations. We devised a model to explain the story of CPD behaviour. Based on the concepts of purpose, action and results, the model centres on explaining two types of CPD behaviour, one following the traditional CE pathway and the other a more genuine CPD pathway. Entries which scored higher when we applied the CPD Outcomes Framework were more likely to follow the CPD pathway in this model.

Significant to our findings is that, while participants following both models of practice took part in this study, the CPD Outcomes Framework was able to change people's CPD behaviour to make it more in line with the CPD pathway. The CPD Outcomes Framework (in defining the CPD criteria), the training pack (in teaching the basis and use of the Framework) and the process of assessment (in applying the Framework) would have interacted to improve participants' CPD through a collective process. Participants were keen to receive a curriculum against which CE-type activities, at least, could be conducted; another important observation relates to whether CE has any role to play in pharmacy professionals' revalidation. We would recommend that the CPD Outcomes Framework be used in the revalidation of pharmacy professionals in the future, provided the requirement to submit nine CPD entries per annum is re-examined and expressed more clearly in relation to what specifically participants are being asked to submit, i.e. the ratio of CE to CPD entries. We can foresee a benefit in setting more regular intervals which would act as deadlines for CPD submission in the future. On the whole, there is value in using CPD for the purpose of pharmacy professionals' revalidation in the future.

Relevance:

20.00%

Publisher:

Abstract:

The IPD Annual Index is the largest and most comprehensive real estate market index available in the UK. Such coverage, however, inevitably leads to delays in publication. In contrast, there are a number of quarterly and monthly indices which are published within days of the year end but which lack comparable coverage in terms of size and number of properties. This paper analyses these smaller but more timely indices to see whether they can be used to predict the performance of the IPD Annual Index. Using a number of measures of forecasting accuracy, it is shown that the smaller indices provide unbiased and efficient predictions of the IPD Annual Index and significantly outperform a naive no-change model, although no one index performs significantly better than the others. The more timely indices, however, do not perfectly track the IPD Annual Index, so any short-run predictions of performance will be subject to a degree of error. Nevertheless, the more timely indices, although lacking the Annual Index's coverage, provide a valuable service to investors, giving good estimates of real estate performance well before the publication of the IPD Annual Index.
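A hedged sketch of how such a comparison can be run: the fragment below regresses the annual outcome on the timely index (a Mincer-Zarnowitz-style check of unbiasedness and efficiency) and compares root-mean-square errors against a naive no-change forecast. The numbers in the example are invented placeholders, not IPD data, and the paper's full set of accuracy measures is not reproduced.

```python
import numpy as np

def evaluate_forecasts(annual, timely):
    """Compare a timely index (forecast) against the annual index (outcome):
    a Mincer-Zarnowitz-style regression for unbiasedness/efficiency and RMSE
    against a naive no-change benchmark."""
    annual = np.asarray(annual, dtype=float)
    timely = np.asarray(timely, dtype=float)

    # Regress outcome on forecast: unbiased, efficient forecasts give
    # intercept close to 0 and slope close to 1.
    slope, intercept = np.polyfit(timely, annual, 1)

    rmse_timely = np.sqrt(np.mean((annual - timely) ** 2))
    # Naive no-change forecast: this year's return equals last year's.
    rmse_naive = np.sqrt(np.mean((annual[1:] - annual[:-1]) ** 2))
    return {"intercept": intercept, "slope": slope,
            "rmse_timely": rmse_timely, "rmse_naive": rmse_naive}

# Illustrative annual total returns (%) only; not actual index values.
annual_index = np.array([9.6, 10.9, 18.3, -3.4, -22.1, 3.5, 14.5])
timely_index = np.array([9.1, 11.4, 17.6, -2.8, -21.0, 3.0, 13.8])
print(evaluate_forecasts(annual_index, timely_index))
```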

Relevance:

20.00%

Publisher:

Abstract:

We study the regularization problem for linear, constant coefficient descriptor systems Ex' = Ax+Bu, y1 = Cx, y2 = Γx' by proportional and derivative mixed output feedback. Necessary and sufficient conditions are given, which guarantee that there exist output feedbacks such that the closed-loop system is regular, has index at most one and E+BGΓ has a desired rank, i.e., there is a desired number of differential and algebraic equations. To resolve the freedom in the choice of the feedback matrices we then discuss how to obtain the desired regularizing feedback of minimum norm and show that this approach leads to useful results in the sense of robustness only if the rank of E is decreased. Numerical procedures are derived to construct the desired feedback gains. These numerical procedures are based on orthogonal matrix transformations which can be implemented in a numerically stable way.
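For illustration, the sketch below forms the closed-loop matrix E + BGΓ for a candidate derivative feedback gain and tests the standard characterisation of a regular pencil of index at most one (W^T A V nonsingular, where the columns of V and W span the right and left nullspaces of the closed-loop E). The sign convention, the toy system and the gain are assumptions, and the paper's numerically stable orthogonal-transformation procedures and minimum-norm construction are not reproduced here.

```python
import numpy as np

def closed_loop_E(E, B, Gamma, G):
    """Closed-loop descriptor matrix E + B G Gamma under derivative output
    feedback (one common sign convention; the paper's may differ)."""
    return E + B @ G @ Gamma

def regular_index_at_most_one(E_cl, A_cl, tol=1e-10):
    """Check that the pencil (E_cl, A_cl) is regular with index <= 1 via the
    standard condition: W^T A V nonsingular, with V and W orthonormal bases
    of the right and left nullspaces of E_cl."""
    n = E_cl.shape[0]
    U, s, Vt = np.linalg.svd(E_cl)
    r = int(np.sum(s > tol))
    if r == n:                      # E_cl nonsingular: ordinary ODE, index 0
        return True
    V = Vt[r:, :].T                 # right nullspace basis
    W = U[:, r:]                    # left nullspace basis
    M = W.T @ A_cl @ V
    return np.linalg.matrix_rank(M, tol=tol) == n - r

# Tiny illustrative system (not from the paper): one algebraic constraint.
E = np.diag([1.0, 1.0, 0.0])
A = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])
B = np.array([[0.0], [1.0], [0.0]])
Gamma = np.array([[1.0, 0.0, 0.0]])
G = np.array([[0.5]])               # candidate derivative feedback gain
print(regular_index_at_most_one(closed_loop_E(E, B, Gamma, G), A))
```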

Relevance:

20.00%

Publisher:

Abstract:

Open solar flux (OSF) variations can be described by the imbalance between source and loss terms. We use spacecraft and geomagnetic observations of OSF from 1868 to the present and assume the OSF source, S, varies with the observed sunspot number, R. Computing the required fractional OSF loss, χ, reveals a clear solar cycle variation, in approximate phase with R. While peak R varies significantly from cycle to cycle, χ is surprisingly constant in both amplitude and waveform. Comparisons of χ with measures of heliospheric current sheet (HCS) orientation reveal a strong correlation. The cyclic nature of χ is exploited to reconstruct OSF back to the start of sunspot records in 1610. This agrees well with the available spacecraft, geomagnetic, and cosmogenic isotope observations. Assuming S is proportional to R yields near-zero OSF throughout the Maunder Minimum. However, χ becomes negative during periods of low R, particularly the most recent solar minimum, meaning OSF production is underestimated. This is related to continued coronal mass ejection (CME) activity, and therefore OSF production, throughout solar minimum, despite R falling to zero. Correcting S for this produces a better match to the recent solar minimum OSF observations. It also results in a cycling, nonzero OSF during the Maunder Minimum, in agreement with cosmogenic isotope observations. These results suggest that during the Maunder Minimum, HCS tilt cycled as it has over recent solar cycles, and the CME rate was roughly constant at the levels measured during the most recent two solar minima.
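The source/loss balance described above can be written as a simple yearly continuity model. The sketch below steps OSF forward with a source assumed proportional to sunspot number and a prescribed cyclic fractional loss; the coefficients, initial flux and waveforms are placeholder assumptions, not the paper's calibration.

```python
import numpy as np

def reconstruct_osf(sunspot_number, chi_cycle, osf0=4.0e14, k=1.0e12):
    """Yearly continuity model: OSF(t+1) = OSF(t) + S(t) - chi(t) * OSF(t),
    with source S assumed proportional to sunspot number and fractional loss
    chi prescribed as a repeating solar-cycle waveform (illustrative only)."""
    osf = np.empty(len(sunspot_number) + 1)
    osf[0] = osf0
    for i, R in enumerate(sunspot_number):
        source = k * R                       # assumed linear source term
        chi = chi_cycle[i % len(chi_cycle)]  # cyclic fractional loss
        osf[i + 1] = osf[i] + source - chi * osf[i]
    return osf

# Placeholder inputs: a sinusoidal "sunspot" series and a loss waveform in
# approximate phase with it (the paper derives chi from observations).
years = np.arange(1868, 2014)
R = 80 * (1 + np.sin(2 * np.pi * (years - 1868) / 11.0)) / 2
chi_cycle = 0.1 + 0.1 * (1 + np.sin(2 * np.pi * np.arange(11) / 11.0)) / 2
print(reconstruct_osf(R, chi_cycle)[:5])
```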

Relevance:

20.00%

Publisher:

Abstract:

Photoperiodic flowering has been extensively studied in the annual short-day and long-day plants rice and Arabidopsis, while less is known about the control of flowering in perennials. In the perennial wild strawberry, Fragaria vesca L. (Rosaceae), short-day and perpetual-flowering long-day accessions occur. Genetic analyses showed that differences in their flowering responses are caused by a single gene, the SEASONAL FLOWERING LOCUS, which may encode the F. vesca homolog of TERMINAL FLOWER1 (FvTFL1). We show through high-resolution mapping and transgenic approaches that FvTFL1 is the basis of this change in flowering behavior and demonstrate that FvTFL1 acts as a photoperiodically regulated repressor. In short-day F. vesca, long photoperiods activate FvTFL1 mRNA expression and short days suppress it, promoting flower induction. These seasonal cycles in FvTFL1 mRNA level confer seasonal cycling of vegetative and reproductive development. Mutations in FvTFL1 prevent long-day (LD) suppression of flowering, and the early flowering that then occurs under LD is dependent on the F. vesca homolog of FLOWERING LOCUS T. This photoperiodic response mechanism differs from those described in model annual plants. We suggest that this mechanism controls flowering within the perennial growth cycle in F. vesca, and demonstrate that a change in a single gene reverses the photoperiodic requirements for flowering.

Relevance:

20.00%

Publisher:

Abstract:

We study the empirical performance of the classical minimum-variance hedging strategy, comparing several econometric models for estimating hedge ratios of crude oil, gasoline and heating oil crack spreads. Given the great variability and large jumps in both spot and futures prices, considerable care is required when processing the relevant data and accounting for the costs of maintaining and re-balancing the hedge position. We find that the variance reduction produced by all models is statistically and economically indistinguishable from the one-for-one “naïve” hedge. However, minimum-variance hedging models, especially those based on GARCH, generate much greater margin and transaction costs than the naïve hedge. Therefore we encourage hedgers to use a naïve hedging strategy on the crack spread bundles now offered by the exchange; this strategy is the cheapest and easiest to implement. Our conclusion contradicts the majority of the existing literature, which favours the implementation of GARCH-based hedging strategies.
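For reference, the classical minimum-variance hedge ratio compared here is the OLS slope of spot on futures returns, h* = Cov(s, f)/Var(f). The sketch below computes it and the in-sample variance reduction relative to the unhedged position, alongside the one-for-one naive hedge. The simulated returns are illustrative rather than crack-spread data, no GARCH model is estimated, and margin and transaction costs are ignored.

```python
import numpy as np

def minimum_variance_hedge(spot_returns, futures_returns):
    """Classical minimum-variance hedge ratio h* = Cov(s, f) / Var(f) and the
    in-sample variance reduction of the hedged position, compared with the
    one-for-one "naive" hedge."""
    s = np.asarray(spot_returns, dtype=float)
    f = np.asarray(futures_returns, dtype=float)
    h = np.cov(s, f)[0, 1] / np.var(f, ddof=1)
    hedged = s - h * f
    naive = s - f                      # one-for-one naive hedge
    return {
        "hedge_ratio": h,
        "var_reduction_mv": 1 - np.var(hedged, ddof=1) / np.var(s, ddof=1),
        "var_reduction_naive": 1 - np.var(naive, ddof=1) / np.var(s, ddof=1),
    }

# Illustrative simulated returns (not market data).
rng = np.random.default_rng(42)
f = rng.normal(0, 0.02, 1000)
s = 0.95 * f + rng.normal(0, 0.005, 1000)
print(minimum_variance_hedge(s, f))
```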

Relevance:

20.00%

Publisher:

Abstract:

The emergence of high-density wireless local area network (WLAN) deployments in recent years is a testament to the insatiable demand for wireless broadband services. The increased density of WLAN deployments brings with it the potential of increased capacity, extended coverage, and exciting new applications. However, the corresponding increase in contention and interference can significantly degrade throughputs, unless new challenges in channel assignment are effectively addressed. In this paper, a client-assisted channel assignment scheme that can provide enhanced throughput is proposed. A study on the impact of interference on throughput with multiple access points (APs) is first undertaken using a novel approach that determines the possibility of parallel transmissions. A metric with a good correlation to the throughput, i.e., the number of conflict pairs, is used in the client-assisted minimum conflict pairs (MICPA) scheme. In this scheme, measurements from clients are used to assist the AP in determining the channel with the minimum number of conflict pairs to maximize its expected throughput. Simulation results show that the client-assisted MICPA scheme can provide meaningful throughput improvements over other schemes that only utilize the AP's measurements.
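A minimal sketch of the channel-selection idea, under the assumption that clients report the interfering pairs they observe on each candidate channel: the AP simply picks the channel with the fewest distinct conflict pairs. The report format and helper names are hypothetical, not the paper's protocol or simulation setup.

```python
from collections import defaultdict

def select_channel_micpa(client_reports, channels):
    """Pick the channel with the minimum number of conflict pairs, in the
    spirit of a client-assisted MICPA-style scheme. `client_reports` maps
    each channel to the interferer pairs observed by clients on it (a
    simplified stand-in for the paper's measurement procedure)."""
    conflict_pairs = defaultdict(set)
    for channel, pairs in client_reports.items():
        for a, b in pairs:
            conflict_pairs[channel].add(frozenset((a, b)))
    # Fewer conflict pairs -> more opportunities for parallel transmissions.
    return min(channels, key=lambda ch: len(conflict_pairs[ch]))

# Hypothetical example: clients report interfering AP/client pairs per channel.
reports = {
    1: [("AP2", "C3"), ("AP3", "C1"), ("AP2", "C3")],
    6: [("AP4", "C2")],
    11: [("AP2", "C1"), ("AP4", "C5")],
}
print(select_channel_micpa(reports, channels=[1, 6, 11]))   # -> 6
```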

Relevance:

20.00%

Publisher:

Abstract:

With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or, mostly, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations: we have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the 12 hours applied most often, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.
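As a concrete example of what objective analysis means here, the sketch below implements a single-pass Cressman-type scheme: scattered observations are interpolated onto a regular grid with distance-dependent weights inside an influence radius. This is a textbook method offered only for illustration; the station positions, values and radius are invented.

```python
import numpy as np

def cressman_analysis(obs_xy, obs_values, grid_x, grid_y, radius):
    """Single-pass Cressman-type objective analysis: interpolate irregularly
    spaced observations onto a regular grid using the weights
    w = (R^2 - d^2) / (R^2 + d^2) for stations within the influence radius R."""
    analysis = np.full((grid_y.size, grid_x.size), np.nan)
    for j, gy in enumerate(grid_y):
        for i, gx in enumerate(grid_x):
            d2 = (obs_xy[:, 0] - gx) ** 2 + (obs_xy[:, 1] - gy) ** 2
            inside = d2 < radius ** 2
            if not np.any(inside):
                continue                      # no observation influences this point
            w = (radius ** 2 - d2[inside]) / (radius ** 2 + d2[inside])
            analysis[j, i] = np.sum(w * obs_values[inside]) / np.sum(w)
    return analysis

# Illustrative example: three "stations" analysed onto a 500 km mesh.
obs_xy = np.array([[100.0, 200.0], [900.0, 700.0], [400.0, 450.0]])   # km
obs_values = np.array([1012.0, 998.0, 1005.0])                        # e.g. hPa
grid_x = np.arange(0.0, 1001.0, 500.0)
grid_y = np.arange(0.0, 1001.0, 500.0)
print(cressman_analysis(obs_xy, obs_values, grid_x, grid_y, radius=600.0))
```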

Relevance:

20.00%

Publisher:

Abstract:

We develop a new sparse kernel density estimator using a forward constrained regression framework, within which the nonnegative and summing-to-unity constraints on the mixing weights can easily be satisfied. Our main contribution is to derive a recursive algorithm that selects significant kernels one at a time, based on the minimum integrated square error (MISE) criterion for both the selection of kernels and the estimation of mixing weights. The proposed approach is simple to implement and the associated computational cost is very low. Specifically, the complexity of our algorithm is of the order of the number of training data N, which is much lower than the order of N^2 required by the best existing sparse kernel density estimators. Numerical examples are employed to demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate and other existing sparse kernel density estimators.
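A much simplified sketch of the sparse-kernel idea: kernels are added greedily so that an equal-weight (hence nonnegative and summing-to-unity) mixture stays close, in integrated squared error, to the full Parzen window estimate. This stand-in does not reproduce the paper's forward constrained regression algorithm, its weight estimation, or its O(N) complexity; it only illustrates the selection-by-ISE principle.

```python
import numpy as np

def _overlap(a, b, h):
    # Integral of the product of two Gaussian kernels with bandwidth h:
    # ∫ N(x; a, h^2) N(x; b, h^2) dx = N(a; b, 2 h^2).
    d = np.subtract.outer(a, b)
    return np.exp(-d ** 2 / (4 * h ** 2)) / np.sqrt(4 * np.pi * h ** 2)

def sparse_kde_greedy(x, n_kernels, h):
    """Greedily pick kernel centres (equal weights) so the sparse mixture is
    close, in integrated squared error, to the full Parzen window estimate."""
    x = np.asarray(x, dtype=float)
    N = x.size
    cross_full = _overlap(x, x, h).mean(axis=1)   # overlap with Parzen estimate
    selected = []
    for _ in range(n_kernels):
        best, best_score = None, np.inf
        for c in range(N):
            if c in selected:
                continue
            idx = selected + [c]
            w = 1.0 / len(idx)
            quad = (w ** 2) * _overlap(x[idx], x[idx], h).sum()
            lin = 2 * w * cross_full[idx].sum()
            score = quad - lin                    # ISE up to an additive constant
            if score < best_score:
                best, best_score = c, score
        selected.append(best)
    return x[selected]

# Illustrative use on a small bimodal sample.
rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
print(sparse_kde_greedy(data, n_kernels=4, h=0.4))
```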

Relevance:

20.00%

Publisher:

Abstract:

In order to improve the quality of healthcare services, integrated large-scale medical information systems are needed that can adapt to the changing medical environment. In this paper, we propose a requirement-driven healthcare information system with a hierarchical, layered architecture. The system operates through a mapping mechanism between these layers and can thus organize its functions dynamically, adapting to users' requirements. Furthermore, we introduce organizational semiotics methods to capture and analyze users' requirements through ontology charts and norms. Based on these results, the structure of the user requirement pattern (URP) is established as the driving factor of our system. Our research contributes to the design of healthcare system architectures that can adapt to the changing medical environment.

Relevance:

20.00%

Publisher:

Abstract:

Airborne high-resolution in situ measurements of a large set of trace gases, including ozone (O3) and total water (H2O), in the upper troposphere and the lowermost stratosphere (UT/LMS) have been performed above Europe within the SPURT project. SPURT provides extensive data coverage of the UT/LMS in each season within the time period between November 2001 and July 2003. In the LMS a distinct spring maximum and autumn minimum are observed in O3, whereas its annual cycle in the UT is shifted 2–3 months later, towards the end of the year. The more variable H2O measurements reveal a maximum during summer and a minimum during autumn/winter, with no phase shift between the two atmospheric compartments. For a comprehensive insight into trace gas composition and variability in the UT/LMS, several statistical methods are applied using chemical, thermal and dynamical vertical coordinates. In particular, 2-dimensional probability distribution functions serve as a tool to transform localised aircraft data into a more comprehensive view of the probed atmospheric region. It appears that both trace gases, O3 and H2O, show the most compact arrangement and are best correlated when viewed in coordinates of potential vorticity (PV) and distance to the local tropopause, indicating an advanced mixing state on these surfaces. Thus, strong gradients of PV seem to act as a transport barrier in both the vertical and the horizontal direction. The alignment of trace gas isopleths reflects the existence of a year-round extra-tropical tropopause transition layer. The SPURT measurements reveal that this layer is mainly affected by stratospheric air during winter/spring and by tropospheric air during autumn/summer. Normalised mixing entropy values for O3 and H2O in the LMS appear to be maximal during spring and summer, respectively, indicating the highest variability of these trace gases during those seasons.
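To illustrate the 2-dimensional probability distribution functions used as a tool here, the sketch below bins a tracer against a vertical coordinate such as PV and normalises each coordinate column to unit probability. The synthetic ozone/PV relationship and the bin counts are purely illustrative assumptions, not SPURT data or its processing chain.

```python
import numpy as np

def tracer_pdf_2d(tracer, coord, tracer_bins=50, coord_bins=50):
    """Build a 2-D probability distribution function of a trace gas (e.g. O3)
    against a vertical coordinate (e.g. PV or distance to the local tropopause),
    normalised so that each coordinate bin integrates to one."""
    H, t_edges, c_edges = np.histogram2d(tracer, coord,
                                         bins=(tracer_bins, coord_bins))
    col_sums = H.sum(axis=0, keepdims=True)
    with np.errstate(invalid="ignore", divide="ignore"):
        pdf = np.where(col_sums > 0, H / col_sums, 0.0)
    return pdf, t_edges, c_edges

# Illustrative synthetic data: ozone increasing with PV, plus scatter.
rng = np.random.default_rng(7)
pv = rng.uniform(0, 8, 20000)                    # potential vorticity units
o3 = 50 + 60 * pv + rng.normal(0, 40, pv.size)   # ppbv, toy relationship
pdf, o3_edges, pv_edges = tracer_pdf_2d(o3, pv)
print(pdf.shape, pdf[:, 10].sum())               # each PV column sums to ~1
```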