30 results for Ascertainment of demand
at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Background: TORCH (Towards a Revolution in COPD Health) is an international multicentre, randomised, placebo-controlled clinical trial of inhaled fluticasone propionate/salmeterol combination treatment and its monotherapy components for maintenance treatment of moderately to severely impaired patients with chronic obstructive pulmonary disease (COPD). The primary outcome is all-cause mortality. Cause-specific mortality and deaths related to COPD are additional outcome measures, but systematic methods for ascertainment of these outcomes have not previously been described. Methods: A Clinical Endpoint Committee (CEC) was tasked with categorising the cause of death and the relationship of deaths to COPD in a systematic, unbiased and independent manner. The key elements of the operation of the committee were the use of predefined principles of operation and definitions of cause of death and COPD-relatedness; the independent review of cases by all members with development of a consensus opinion; and a substantial infrastructure to collect medical information. Results: 911 deaths were reviewed and consensus was reached in all cases. Cause-specific mortality was: cardiovascular 27%, respiratory 35%, cancer 21%, other 10% and unknown 8%. 40% of deaths were definitely or probably related to COPD. Adjudications were identical in 83% of blindly re-adjudicated cases (κ = 0.80). COPD-relatedness was reproduced 84% of the time (κ = 0.73). The CEC adjudication was equivalent to the primary cause of death recorded by the site investigator in 52% of cases. Conclusion: A CEC can provide standardised, reliable and informative adjudication of COPD mortality that provides information which frequently differs from data collected from assessment by site investigators.
Abstract:
In this paper we present an empirical analysis of the residential demand for electricity using annual aggregate data at the state level for 48 US states from 1995 to 2007. Earlier literature has examined residential energy consumption at the state level using annual or monthly data, focusing on the variation in price elasticities of demand across states or regions, but has failed to recognize or address two major issues. The first is that, when fitting dynamic panel models, the lagged consumption term on the right-hand side of the demand equation is endogenous. This has resulted in potentially inconsistent estimates of the long-run price elasticity of demand. The second is that energy price is likely mismeasured.
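The lagged-consumption endogeneity the abstract describes can be illustrated with a small simulation (all numbers invented, not the paper's data): within-groups OLS on a dynamic panel suffers Nickell bias, while a first-difference IV estimator in the Anderson–Hsiao style, a standard remedy in the dynamic-panel literature, recovers the true persistence coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, alpha = 500, 10, 0.7

# Simulate a dynamic panel: y_it = alpha * y_i,t-1 + mu_i + eps_it
mu = rng.normal(size=N)
y = np.zeros((N, T + 1))
y[:, 0] = mu / (1 - alpha) + rng.normal(size=N)   # near-stationary start
for t in range(1, T + 1):
    y[:, t] = alpha * y[:, t - 1] + mu + rng.normal(size=N)

# (1) Within (fixed-effects) OLS: demean, then regress y_t on y_{t-1}.
#     The demeaned lag is correlated with the demeaned error -> Nickell bias.
Y, L = y[:, 1:], y[:, :-1]
Yd = Y - Y.mean(axis=1, keepdims=True)
Ld = L - L.mean(axis=1, keepdims=True)
alpha_fe = (Ld * Yd).sum() / (Ld ** 2).sum()

# (2) Anderson-Hsiao IV: first-differencing removes mu_i; instrument
#     dy_{t-1} with the level y_{t-2}, uncorrelated with d(eps_t).
dy = y[:, 2:] - y[:, 1:-1]        # dy_t     for t = 2..T
dy_lag = y[:, 1:-1] - y[:, :-2]   # dy_{t-1} for t = 2..T
z = y[:, :-2]                     # y_{t-2}  for t = 2..T
alpha_iv = (z * dy).sum() / (z * dy_lag).sum()

print(f"true alpha = {alpha}, within OLS = {alpha_fe:.3f}, AH-IV = {alpha_iv:.3f}")
```

The within estimate is biased downward by roughly (1 + α)/(T − 1), which is substantial in short panels like the 13-year state panel the paper uses; the IV estimate is consistent.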
Abstract:
Dwindling fossil fuel resources and pressures to reduce greenhouse gas (GHG) emissions will result in a more diverse range of generation portfolios for future electricity systems. Irrespective of the portfolio mix, the overarching requirement for all electricity suppliers and system operators is that supply instantaneously meets demand and that robust operating standards are maintained to ensure a consistent supply of high-quality electricity to end-users. Therefore, all electricity market participants will ultimately need to use a variety of tools to balance the power system, and the role of demand side management (DSM) with energy storage will be paramount in integrating future diverse generation portfolios. Electric water heating (EWH) has been studied previously, particularly at the domestic level, to provide load control, shave peaks and benefit end-users financially with lower bills, particularly in vertically integrated monopolies. In this paper, a continuous Direct Load Control (DLC) EWH algorithm is applied in a liberalized market environment using actual historical electricity system and market data to examine the potential energy savings, cost reductions and electricity system operational improvements.
Abstract:
The decarbonisation of energy systems draws a new set of stakeholders into debates over energy generation, engages a complex set of social, political, economic and environmental processes and has impacts at a wide range of geographical scales, including local landscape changes, national energy markets and regional infrastructure investment. This paper focusses on a particular geographic scale, that of the regions/nations of the UK (Scotland, Wales, Northern Ireland), which have been operating under devolved arrangements since the late 1990s, coinciding with the mass deployment of wind energy. The devolved administrations of the UK possess an asymmetrical set of competencies over energy policy, yet also host the majority of the UK wind resource. This context provides a useful way to consider the different ways in which geographies of "territory" are reflected in energy governance, such as through techno-rational assessments of demand or infrastructure investment, but also through new spatially-defined institutions that seek to develop their own energy future using limited regulatory competencies. By focussing on the way the devolved administrations have used their responsibilities for planning over the last decade, this paper assesses the way in which the spatial politics of wind energy is giving rise to renewed forms of territorialisation of natural resources. In so doing, we aim to contribute to clarifying the questions raised by Hodson and Marvin (2013) on whether low carbon futures will reinforce or challenge dominant ways of organising relationships between the nation-state, regions, energy systems and the environment.
Abstract:
The predominant fear in capital markets is that of a price spike. Commodity markets differ in that there is a fear of both upward and downward jumps; this results in implied volatility curves displaying distinct shapes when compared to equity markets. A novel functional data analysis (FDA) approach provides a framework to produce and interpret functional objects that characterise the underlying dynamics of oil future options. We use the FDA framework to examine implied volatility, jump risk, and pricing dynamics within crude oil markets. Examining a WTI crude oil sample for the 2007–2013 period, which includes the global financial crisis and the Arab Spring, we find strong evidence of converse jump dynamics during periods of demand- and supply-side weakness. This is used as the basis for an FDA-derived Merton (1976) jump diffusion optimised delta hedging strategy, which exhibits superior portfolio management results over traditional methods.
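As background on the Merton (1976) jump diffusion model underpinning the hedging strategy, a call option can be priced as a Poisson-weighted sum of Black–Scholes prices. The sketch below uses illustrative WTI-like parameter values, not the paper's FDA-derived estimates.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def merton_call(S, K, r, sigma, T, lam, mu_j, delta_j, n_max=60):
    """Merton (1976) jump-diffusion call: a Poisson-weighted sum of
    Black-Scholes prices, conditioning on the number of jumps n.
    Jumps are lognormal with log-mean mu_j and log-sd delta_j."""
    k = math.exp(mu_j + 0.5 * delta_j**2) - 1     # expected relative jump size
    lam_p = lam * (1 + k)
    price = 0.0
    for n in range(n_max):
        sigma_n = math.sqrt(sigma**2 + n * delta_j**2 / T)
        r_n = r - lam * k + n * (mu_j + 0.5 * delta_j**2) / T
        weight = math.exp(-lam_p * T) * (lam_p * T)**n / math.factorial(n)
        price += weight * bs_call(S, K, r_n, sigma_n, T)
    return price

# Illustrative parameters: at-the-money, negative mean jump (downward fear)
c = merton_call(S=90, K=90, r=0.02, sigma=0.25, T=0.5,
                lam=0.8, mu_j=-0.1, delta_j=0.15)
print(f"Merton call price: {c:.2f}")
```

Setting the jump intensity `lam` to zero collapses the sum to plain Black–Scholes, which is a convenient sanity check.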
Abstract:
The power system of the future will have a hierarchical structure created by layers of system control from a Supergrid via regional high-voltage transmission through to medium and low-voltage distribution. Each level will have generation sources, such as large-scale offshore wind, wave, solar thermal and nuclear directly connected to this Supergrid, and high levels of embedded generation connected to the medium-voltage distribution system. It is expected that the fuel portfolio will be dominated by offshore wind in Northern Europe and PV in Southern Europe. The strategies required to manage the coordination of supply-side variability with demand-side variability will include large-scale interconnection, demand side management, load aggregation and storage in the context of the Supergrid combined with the Smart Grid. The design challenge associated with this will include not only control topology, data acquisition, analysis and communications technologies, but also the selection of the fuel portfolio at a macro level. This paper quantifies the amount of demand side management, storage and so-called 'back-up generation' needed to support an 80% renewable energy portfolio in Europe by 2050.
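The kind of balancing arithmetic the paper performs at European scale can be sketched on a toy hourly profile (all MW figures invented): renewable surplus charges a store, deficits draw it down, and whatever deficit remains unserved is the 'back-up generation' requirement.

```python
# Toy residual-load split (hourly MW, illustrative numbers only):
# surplus renewable output charges storage; deficits are met from
# storage first, and the remainder is the back-up requirement.
demand = [50, 45, 42, 44, 52, 60, 65, 62, 58, 55, 52, 50]
renew  = [55, 58, 60, 50, 40, 35, 30, 38, 52, 60, 58, 54]

store, CAP, EFF = 0.0, 30.0, 0.9   # storage state, capacity (MWh), round-trip eff.
backup = 0.0                        # unserved deficit -> back-up generation
for d, g in zip(demand, renew):
    resid = d - g
    if resid < 0:                   # surplus: charge storage (with losses)
        store = min(CAP, store - resid * EFF)
    else:                           # deficit: discharge, then call on back-up
        take = min(store, resid)
        store -= take
        backup += resid - take

print(f"back-up energy needed: {backup:.1f} MWh")
```

Scaled up to year-long wind/PV and demand traces, the same accounting yields the DSM, storage and back-up quantities the paper reports for an 80% renewable portfolio.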
Abstract:
Universities planning the provision of space for their teaching requirements need to do so in a fashion that reduces capital and maintenance costs whilst still providing a high-quality level of service. Space plans should aim to provide sufficient capacity without incurring excessive costs due to over-capacity. A simple measure used to estimate over-provision is utilisation. Essentially, the utilisation is the fraction of seats that are used in practice, or the ratio of demand to supply. However, studies usually find that utilisation is low, often only 20–40%, and this is suggestive of significant over-capacity.
Our previous work has provided methods to improve such space planning. They identify a critical level of utilisation as the highest level that can be achieved whilst still reliably satisfying the demand for places to allocate teaching events. In this paper, we extend this body of work to incorporate the notions of event-types and space-types. Teaching events have multiple ‘event-types’, such as lecture, tutorial, workshop, etc., and there are generally corresponding space-types. Matching the type of an event to a room of a corresponding space-type is generally desirable. However, realistically, allocation happens in a mixed space-type environment where teaching events of a given type are allocated to rooms of another space-type; e.g., tutorials will borrow lecture theatres or workshop rooms.
We propose a model and methodology to quantify the effects of space-type mixing and establish methods to search for better space-type profiles; where the term “space-type profile” refers to the relative numbers of each type of space. We give evidence that these methods have the potential to improve utilisation levels. Hence, the contribution of this paper is twofold. Firstly, we present informative studies of the effects of space-type mixing on utilisation, and critical utilisations. Secondly, we present straightforward though novel methods to determine better space-type profiles, and give an example in which the resulting profiles are indeed significantly improved.
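The utilisation measure defined above, the ratio of demand to supply in seat-hours, can be made concrete with a tiny example (room and event figures invented for illustration):

```python
# Hypothetical weekly room stock and teaching events.
rooms = [   # (seat capacity, available hours per week)
    (200, 40), (100, 40), (60, 40), (30, 40),
]
events = [  # (class size, timetabled hours per week)
    (180, 12), (90, 10), (55, 8), (25, 12), (12, 10),
]

supply = sum(cap * hrs for cap, hrs in rooms)      # seat-hours offered
demand = sum(size * hrs for size, hrs in events)   # seat-hours occupied
utilisation = demand / supply
print(f"utilisation = {utilisation:.1%}")
```

Even this toy stock lands in the 20–40% band the abstract cites: unused capacity comes both from empty rooms and from small classes sitting in large rooms, which is exactly the mixed space-type effect the paper models.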
Abstract:
Dwindling fossil fuel resources and pressures to reduce greenhouse gas emissions will result in a more diverse range of generation portfolios for future electricity systems. Irrespective of the portfolio mix, the overarching requirement for all electricity suppliers and system operators is to instantaneously meet demand, to operate to standards and to reduce greenhouse gas emissions. Therefore, all electricity market participants will ultimately need to use a variety of tools to balance the power system, and the role of demand side management with energy storage will be paramount in integrating future diverse generation portfolios. Electric water heating has been studied previously, particularly at the domestic level, to provide load control, shave peaks and benefit end-users financially with lower bills, particularly in vertically integrated monopolies. In this paper, a number of continuous direct load control demand response based electric water heating algorithms are modelled to test the effectiveness of wholesale electricity market signals and to study the system benefits. The results are compared and contrasted to determine which control algorithm shows the best potential for energy savings, system marginal price savings and wind integration.
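A minimal price-responsive water-heating sketch conveys the idea behind such algorithms (this is not the authors' algorithm; the single 3 kW tank, the price series and the threshold rule are all invented): heat when the wholesale price is cheap, subject to a tank-temperature comfort band.

```python
# Direct-load-control sketch for one electric water heater.
PRICES = [60, 55, 40, 30, 25, 28, 70, 90, 85, 65, 50, 45]  # EUR/MWh, hourly
T_MIN, T_MAX, temp = 55.0, 75.0, 60.0   # comfort band and start temp (deg C)
HEAT_RATE, LOSS = 8.0, 1.5              # deg C gained per heated hour; standing loss
THRESHOLD = 46                          # heat when price is below this

schedule, cost = [], 0.0
for price in PRICES:
    # Heat if cheap and there is headroom, or if comfort would be violated.
    heat = (price < THRESHOLD and temp + HEAT_RATE <= T_MAX) or temp - LOSS < T_MIN
    if heat:
        temp += HEAT_RATE
        cost += price * 0.003           # 3 kW for one hour = 0.003 MWh
    temp -= LOSS
    schedule.append(int(heat))

print("on/off schedule:", schedule)
print(f"final tank temp {temp:.1f} deg C, energy cost EUR {cost:.3f}")
```

The heating hours land in the price trough, which is the load-shifting behaviour whose system-wide benefits the paper quantifies with market data.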
Abstract:
The frequency of bad harvests and the price elasticity of demand are measured using new data on English grain yields for 1268–1480 and 1750–1850 and a revised price series. The analysis shows that major harvest shortfalls were a significant component of most historical subsistence crises, while back-to-back shortfalls underlay the worst famines. Although serious harvest shortfalls long remained an unavoidable fact of economic life, by c.1800 yields had become less variable and prices less harvest-sensitive. By the eve of the Industrial Revolution, England had become effectively famine-free.
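The link between harvest shortfalls and price spikes follows directly from the elasticity being measured: under a constant-elasticity demand curve, an inelastic staple makes even modest shortfalls expensive. The elasticity value below is illustrative, not the paper's estimate.

```python
# Constant-elasticity demand q = A * p**eps, so a quantity shortfall s
# implies a price rise of (1 - s)**(1/eps) - 1.
eps = -0.6   # illustrative own-price elasticity for a staple grain

def price_change(shortfall, eps):
    """Proportional price rise when quantity falls by `shortfall`."""
    return (1 - shortfall) ** (1 / eps) - 1

for s in (0.10, 0.25):
    print(f"{s:.0%} shortfall -> price rises {price_change(s, eps):.0%}")
```

This convexity is why back-to-back shortfalls were so devastating, and why the flattening of the price response by c.1800 signals less harvest-sensitive markets.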
Abstract:
Large samples of multiplex pedigrees will probably be needed to detect susceptibility loci for schizophrenia by linkage analysis. Standardized ascertainment of such pedigrees from culturally and ethnically homogeneous populations may improve the probability of detection and replication of linkage. The Irish Study of High-Density Schizophrenia Families (ISHDSF) was formed from standardized ascertainment of multiplex schizophrenia families in 39 psychiatric facilities covering over 90% of the population in Ireland and Northern Ireland. We here describe a phenotypic sample and a subset thereof, the linkage sample. Individuals were included in the phenotypic sample if adequate diagnostic information, based on personal interview and/or hospital record, was available. Only individuals with available DNA were included in the linkage sample. Inclusion of a pedigree into the phenotypic sample required at least two first, second, or third degree relatives with non-affective psychosis (NAP), one of whom had schizophrenia (S) or poor-outcome schizo-affective disorder (PO-SAD). Entry into the linkage sample required DNA samples on at least two individuals with NAP, of whom at least one had S or PO-SAD. Affection was defined by narrow, intermediate, and broad criteria. The phenotypic sample contained 277 pedigrees and 1,770 individuals and the linkage sample 265 pedigrees and 1,408 individuals. Using the intermediate definition of affection, the phenotypic sample contained 837 affected individuals and 526 affected sibling pairs. Parallel figures for the linkage sample were 700 and 420. Individuals with schizophrenia from these multiplex pedigrees resembled epidemiologically sampled cases with respect to age at onset, gender distribution, and most clinical symptoms, although they were more thought-disordered and had a poorer outcome.
Power analyses based on the model of linkage heterogeneity indicated that the ISHDSF should be able to detect a major locus that influences susceptibility to schizophrenia in as few as 20% of families. Compared to first-degree relatives of epidemiologically sampled schizophrenic probands, first-degree relatives of schizophrenic members from the ISHDSF had a similar risk for schizotypal personality disorder, affective illness, alcoholism, and anxiety disorder. With sufficient resources, large-scale ascertainment of multiplex schizophrenia pedigrees is feasible, especially in countries with catchmented psychiatric care and stable populations. Although somewhat more severely ill, schizophrenic members of such pedigrees appear to clinically resemble typical schizophrenic patients. Our ascertainment process for multiplex schizophrenia families did not select for excess familial risk for affective illness or alcoholism. With its large sample ascertained in a standardized manner from a relatively homogeneous population, the ISHDSF provides considerable power to detect susceptibility loci for schizophrenia.
Abstract:
Cancer registries must provide complete and reliable incidence information with the shortest possible delay for use in studies such as comparability, clustering, cancer in the elderly and adequacy of cancer surveillance. Methods of varying complexity are available to registries for monitoring completeness and timeliness. We wished to know which methods are currently in use among cancer registries, and to compare the results of our findings to those of a survey carried out in 2006.
Methods
In the framework of the EUROCOURSE project, and to prepare cancer registries for participation in the ERA-net scheme, we launched a survey on the methods used to assess completeness, and also on the timeliness and methods of dissemination of results by registries. We sent the questionnaire to all general registries (GCRs) and specialised registries (SCRs) active in Europe and within the European Network of Cancer Registries (ENCR).
Results
With a response rate of 66% among GCRs and 59% among SCRs, we obtained data for analysis from 116 registries with a population coverage of ∼280 million. The most common methods used were comparison of trends (79%) and mortality/incidence ratios (more than 60%). More complex methods were used less commonly: capture–recapture by 30%, flow method by 18% and death certificate notification (DCN) methods with the Ajiki formula by 9%.
The median latency for completion of ascertainment of incidence was 18 months. Additional time required for dissemination was of the order of 3–6 months, depending on the method: print or electronic. One fifth (21%) did not publish results for their own registry but only as a contribution to larger national or international data repositories and publications; this introduced a further delay in the availability of data.
Conclusions
Cancer registries should improve the practice of measuring their completeness regularly and should move from traditional to more quantitative methods. This could also have implications in the timeliness of data publication.
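One of the quantitative completeness methods named in the results, capture–recapture, can be sketched with two notification sources and Chapman's nearly unbiased two-source estimator (all counts below are hypothetical, not survey data):

```python
# Two-source capture-recapture for registry completeness.
# n1, n2: cases notified by each source; m: notified by both.
n1, n2, m = 820, 430, 390

# Chapman's estimator of the true number of incident cases.
N_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1

registered = 860            # cases actually recorded by the registry
completeness = registered / N_hat
print(f"estimated cases = {N_hat:.0f}, completeness = {completeness:.1%}")
```

The standard caveat applies: the two sources are assumed independent, which pathology reports and death certificates rarely are in full, so in practice such estimates bracket rather than pin down completeness.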
Abstract:
While the benefits of renewable energy are well known and used to influence government policy, a number of problems arise from having significant quantities of renewable energy on an electricity grid. The most notable problem stems from its intermittent nature, which is often out of phase with the demands of the end-users. This requires the development of either efficient energy storage systems (e.g. battery technology, compressed air storage) or demand side management units which can utilise power quickly for manufacturing operations. Herein, a system is shown that converts synthetic biogas to synthesis gas using wind power and an induction heating system. This approach demonstrates the feasibility of such techniques for stabilising the electricity grid while also providing a robust means of energy storage. The exemplar is also applicable to the production of hydrogen from the steam reforming of natural gas.
Abstract:
The future European power system will have a hierarchical structure created by layers of system control from a Supergrid via regional high-voltage transmission through to medium and low-voltage distribution. Each level will have generation sources such as large-scale offshore wind, wave, solar thermal, nuclear directly connected to this Supergrid and high levels of embedded generation, connected to the medium-voltage distribution system. It is expected that the fuel portfolio will be dominated by offshore wind in Northern Europe and PV in Southern Europe. The strategies required to manage the coordination of supply-side variability with demand-side variability will include large scale interconnection, demand side management, load aggregation and storage in the context of the Supergrid combined with the Smart Grid. The design challenge associated with this will not only include control topology, data acquisition, analysis and communications technologies, but also the selection of fuel portfolio at a macro level. This paper quantifies the amount of demand side management, storage and so-called 'back-up generation' needed to support an 80% renewable energy portfolio in Europe by 2050. © 2013 IEEE.
Abstract:
This paper compares the Random Regret Minimization and the Random Utility Maximization models for determining recreational choice. The Random Regret approach is based on the idea that, when choosing, individuals aim to minimize their regret – regret being defined as what one experiences when a non-chosen alternative in a choice set performs better than the chosen one in relation to one or more attributes. The Random Regret paradigm, recently developed in transport economics, presents a tractable, regret-based alternative to the dominant choice paradigm based on Random Utility. Using data from a travel cost study exploring factors that influence kayakers' site-choice decisions in the Republic of Ireland, we estimate both the traditional Random Utility multinomial logit model (RU-MNL) and the Random Regret multinomial logit model (RR-MNL) to gain more insight into site-choice decisions. We further explore whether choices are driven by a utility maximization or a regret minimization paradigm by running a binary logit model to examine the likelihood of the two choice paradigms, using site visits and respondents' characteristics as explanatory variables. In addition to being one of the first studies to apply the RR-MNL to an environmental good, this paper also represents the first application of the RR-MNL to compute the Logsum to test and strengthen conclusions on the welfare impacts of potential alternative policy scenarios.
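The regret definition above has a standard functional form in the RRM literature: attribute-level regret ln(1 + exp(β(x_j − x_i))) summed over non-chosen alternatives, with choice probabilities proportional to exp(−R). The sketch below uses invented site attributes and coefficients, not the paper's kayaking estimates.

```python
import math

# Hypothetical kayaking sites and illustrative taste coefficients.
beta = {"water_quality": 0.8, "travel_cost": -0.05}
sites = {
    "A": {"water_quality": 4, "travel_cost": 20},
    "B": {"water_quality": 3, "travel_cost": 5},
    "C": {"water_quality": 5, "travel_cost": 45},
}

def regret(i):
    """Regret of site i: sum over non-chosen sites j and attributes m
    of ln(1 + exp(beta_m * (x_jm - x_im)))."""
    return sum(
        math.log1p(math.exp(beta[m] * (xj[m] - sites[i][m])))
        for j, xj in sites.items() if j != i
        for m in beta
    )

# RR-MNL choice probabilities: P(i) proportional to exp(-R_i).
R = {i: regret(i) for i in sites}
denom = sum(math.exp(-r) for r in R.values())
probs = {i: math.exp(-R[i]) / denom for i in R}
print(probs)
```

Note the asymmetry that distinguishes RRM from utility maximization: being beaten on an attribute hurts more than winning on it helps, so a balanced site can out-rank one that dominates on a single attribute.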