947 results for methodology of indexation
Abstract:
OBJECTIVES: To determine the cost-effectiveness of influenza vaccination in people aged 65-74 years in the absence of co-morbidity. DESIGN: Primary research: randomised controlled trial. SETTING: Primary care. PARTICIPANTS: People without risk factors for influenza or contraindications to vaccination were identified from 20 general practitioner (GP) practices in Liverpool in September 1999 and invited to participate in the study. There were 5875/9727 (60.4%) people aged 65-74 years identified as potentially eligible and, of these, 729 (12%) were randomised. INTERVENTION: Participants were randomised to receive either influenza vaccine or placebo (ratio 3:1), with all individuals receiving pneumococcal vaccine unless administered in the previous 10 years. Of the 729 people randomised, 552 received vaccine and 177 received placebo; 726 individuals were administered pneumococcal vaccine. MAIN OUTCOME MEASURES AND METHODOLOGY OF ECONOMIC EVALUATION: GP attendance with influenza-like illness (ILI) or pneumonia (primary outcome measure); or any respiratory symptoms; hospitalisation with a respiratory illness; death; participant self-reported ILI; quality of life (QoL) measures at 2, 4 and 6 months post-study vaccination; adverse reactions 3 days after vaccination. A cost-effectiveness analysis was undertaken to identify the incremental cost associated with the avoidance of episodes of influenza in the vaccination population and an impact model was used to extrapolate the cost-effectiveness results obtained from the trial to assess their generalisability throughout the NHS. RESULTS: In England and Wales, weekly consultations for influenza and ILI remained at baseline levels (less than 50 per 100,000 population) until week 50/1999 and then increased rapidly, peaking during week 2/2000 with a rate of 231/100,000. This rate fell within the range of 'higher than expected seasonal activity' of 200-400/100,000. 
Rates then quickly declined, returning to baseline levels by week 5/2000. The predominant circulating strain during this period was influenza A (H3N2). Five (0.9%) people in the vaccine group were diagnosed by their GP with an ILI compared to two (1.1%) in the placebo group [relative risk (RR), 0.8; 95% confidence interval (CI) = 0.16 to 4.1]. No participants were diagnosed with pneumonia by their GP and there were no hospitalisations for respiratory illness in either group. Significantly fewer vaccinated individuals self-reported a single ILI (4.6% vs 8.9%, RR, 0.51; 95% CI for RR, 0.28 to 0.96). There was no significant difference in any of the QoL measurements over time between the two groups. Reported systemic side-effects showed no significant differences between groups. Local side-effects occurred with a significantly increased incidence in the vaccine group (11.3% vs 5.1%, p = 0.02). Each GP consultation avoided by vaccination was estimated from trial data to generate a net NHS cost of 174 pounds. CONCLUSIONS: No difference was seen between groups for the primary outcome measure, although the trial was underpowered to demonstrate a true difference. Vaccination had no significant effect on any of the QoL measures used, although vaccinated individuals were less likely to self-report ILI. The analysis did not suggest that influenza vaccination in healthy people aged 65-74 years would lead to lower NHS costs. Future research should look at ways to maximise vaccine uptake in people at greatest risk from influenza and also the level of vaccine protection afforded to people from different age and socio-economic populations.
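The relative risk and confidence interval reported above can be reproduced from the raw counts (5 of 552 vaccinated vs 2 of 177 placebo participants diagnosed with an ILI) using the standard log-scale normal approximation; a minimal Python sketch, not taken from the trial report itself:

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Risk ratio of group 1 vs group 2, with a Wald CI on the log scale."""
    rr = (a / n1) / (b / n2)
    # Standard error of log(RR) under the normal approximation
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# 5/552 GP-diagnosed ILI in the vaccine group vs 2/177 in the placebo group
rr, lo, hi = relative_risk(5, 552, 2, 177)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
# → RR = 0.80, 95% CI 0.16 to 4.10
```

The output matches the figures quoted in the abstract (RR 0.8; 95% CI 0.16 to 4.1), and the wide interval illustrates why the trial was described as underpowered for this outcome.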
Abstract:
Current mathematical models in building research have been limited in most studies to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that a chaos-based model is valid as a foundation and can handle the increasing complexity of building systems that have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology of linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I) reviews the current state of the chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help to better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos theory driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and the most relevant characteristics of chaos theory essential to building simulation scientists, initiates a dialogue and builds bridges between scientists and engineers, and stimulates future research about a wide range of issues on building environmental systems.
Abstract:
Current mathematical models in building research have been limited in most studies to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that a chaos-based model is valid as a foundation and can handle the increasing complexity of building systems that have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology of linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I), published in the previous issue, reviews the current state of the chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help to better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos theory driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and the most relevant characteristics of chaos theory essential to (1) building simulation scientists and designers, (2) initiating a dialogue between scientists and engineers, and (3) stimulating future research on a wide range of issues involved in designing and managing building environmental systems.
Abstract:
To investigate the perception of emotional facial expressions, researchers rely on shared sets of photos or videos, most often generated by actor portrayals. The drawback of such standardized material is a lack of flexibility and controllability, as it does not allow the systematic parametric manipulation of specific features of facial expressions on the one hand, and of more general properties of the facial identity (age, ethnicity, gender) on the other. To remedy this problem, we developed FACSGen: a novel tool that allows the creation of realistic synthetic 3D facial stimuli, both static and dynamic, based on the Facial Action Coding System. FACSGen provides researchers with total control over facial action units, and corresponding informational cues in 3D synthetic faces. We present four studies validating both the software and the general methodology of systematically generating controlled facial expression patterns for stimulus presentation.
Abstract:
Instability is a serious problem for acoustic Active Noise Cancellation (ANC) headsets, arising from large errors in estimating the transfer function of the plant. Typically this occurs when, for example, a wearer adjusts the headset. In this paper, the instability problem of adaptive ANC headsets is addressed. To ensure stability of the whole system, we propose a hybrid solution consisting of an analog feedback loop in parallel with the digital loop, and the role of the analog loop in stabilizing the headset is analyzed theoretically. Finally, the methodology of implementing such a hybrid ANC headset is described in detail. The experiments carried out on the headset prototype show that the headset is robust under considerable fluctuations of the plant transfer characteristics, and has very good noise cancellation performance for both narrow-band and wide-band disturbances.
Abstract:
Lateral epicondylitis (LE) is hypothesized to occur as a result of repetitive, strenuous and abnormal postural activities of the elbow and wrist. There is still a lack of understanding of how wrist and forearm positions contribute to this condition during common manual tasks. In this study the wrist kinematics and the wrist extensors’ musculotendon patterns were investigated during a manual task believed to elicit LE symptoms in susceptible subjects. A 42-year-old right-handed male, with no history of LE, performed a repetitive movement involving pushing and turning a spring-loaded mechanism. Motion capture data were acquired for the upper limb and an inverse kinematic and dynamic analysis was subsequently carried out. Results illustrated the presence of eccentric contractions sustained by the extensor carpi radialis longus (ECRL), together with an almost constant level of tendon strain of both extensor carpi radialis brevis (ECRB) and extensor digitorum communis lateral (EDCL) branch. It is believed that these factors may partly contribute to the onset of LE as they are both responsible for the creation of microtears at the tendons’ origins. The methodology of this study can be used to explore muscle actions during movements that might cause or exacerbate LE.
Abstract:
This paper seeks to analyse and discuss, from the perspective of the owners of agricultural land, the main changes to the Capital Gains Tax regime introduced in the Finance Act 1998 and subsequently amended in the Finance Act 2000. The replacement of indexation with a new Taper relief is examined, along with the phasing out of Retirement relief, and the interaction of Taper relief with Rollover relief. The opportunity for tax mitigation by the owners of agricultural land is critically examined.
Abstract:
Aims: Therapeutic limbal epithelial stem cells could be managed more efficiently if clinically validated batches were transported for ‘on-demand’ use. Materials & methods: In this study, corneal epithelial cell viability in calcium alginate hydrogels was examined under cell culture, ambient and chilled conditions for up to 7 days. Results: Cell viability improved as gel internal pore size increased, and was further enhanced with modification of the gel from a mass to a thin disc. Ambient storage conditions were optimal for supporting cell viability in gel discs. Cell viability in gel discs was significantly enhanced with increases in pore size mediated by hydroxyethyl cellulose. Conclusion: Our novel methodology of controlling alginate gel shape and pore size together provides a more practical and economical alternative to established corneal tissue/cell storage methods.
Abstract:
This paper seeks to analyse and discuss, from the perspective of the owners of agricultural land, the main changes to the Capital Gains Tax regime introduced in the Budget of March 1998 and contained in the Finance Act 1998. The immediate replacement of indexation with a new Taper relief is examined, along with the phasing out of Retirement relief, and the interaction of Taper relief with Rollover relief.
Abstract:
Risk and uncertainty are, to say the least, poorly considered by most individuals involved in real estate analysis, in both development and investment appraisal. Surveyors continue to express 'uncertainty' about the value (risk) of using relatively objective methods of analysis to account for these factors. These methods attempt to identify the risk elements more explicitly. Conventionally, this is done by deriving probability distributions for the uncontrolled variables in the system. A suggested 'new' way of "being able to express our uncertainty or slight vagueness about some of the qualitative judgements and not entirely certain data required in the course of the problem..." uses the application of fuzzy logic. This paper discusses and demonstrates the terminology and methodology of fuzzy analysis. In particular, it compares these procedures with those used in 'conventional' risk analysis approaches and critically investigates whether a fuzzy approach offers an alternative to the use of probability-based analysis for dealing with aspects of risk and uncertainty in real estate analysis.
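Fuzzy analysis of the kind described above typically represents an appraiser's vague judgement as a triangular fuzzy number (low, most likely, high) rather than a probability distribution. A minimal Python sketch of that representation; the class, variable names and example figures are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    """A triangular fuzzy number: membership rises linearly from `low`
    to 1 at `mode`, then falls linearly to 0 at `high`."""
    low: float    # smallest plausible value (membership 0)
    mode: float   # most plausible value (membership 1)
    high: float   # largest plausible value (membership 0)

    def __add__(self, other):
        # Fuzzy addition of triangular numbers combines endpoint-wise
        return TriFuzzy(self.low + other.low,
                        self.mode + other.mode,
                        self.high + other.high)

    def membership(self, x):
        """Degree (0..1) to which x belongs to this fuzzy value."""
        if self.low < x <= self.mode:
            return (x - self.low) / (self.mode - self.low)
        if self.mode < x < self.high:
            return (self.high - x) / (self.high - self.mode)
        return 1.0 if x == self.mode else 0.0

# An appraiser's vague judgement, e.g. annual rental growth "around 3%,
# certainly between 1% and 6%", plus an uncertain yield shift:
growth = TriFuzzy(1.0, 3.0, 6.0)
yield_shift = TriFuzzy(-0.5, 0.0, 1.0)
combined = growth + yield_shift
print(combined)                 # TriFuzzy(low=0.5, mode=3.0, high=7.0)
print(growth.membership(2.0))   # 0.5
```

Unlike a probability distribution, no likelihood is attached to any value; the membership function only grades how plausible each value is, which is the contrast the paper explores against conventional risk analysis.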
Abstract:
A set of random variables is exchangeable if its joint distribution function is invariant under permutation of the arguments. The concept of exchangeability is discussed, with a view towards potential application in evaluating ensemble forecasts. It is argued that the paradigm of ensembles being an independent draw from an underlying distribution function is probably too narrow; allowing ensemble members to be merely exchangeable might be a more versatile model. The question is discussed whether established methods of ensemble evaluation need alteration under this model, with reliability being given particular attention. It turns out that the standard methodology of rank histograms can still be applied. As a first application of the exchangeability concept, it is shown that the method of minimum spanning trees to evaluate the reliability of high dimensional ensembles is mathematically sound.
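The rank histogram methodology mentioned above counts, for each forecast case, how many ensemble members fall below the observation; if the observation is exchangeable with the members, every rank is equally likely and the histogram is flat. A minimal sketch with synthetic data (the sizes and distribution are illustrative, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_members = 5000, 10

# Synthetic verification data: observation and ensemble members drawn
# from the same distribution, so the ensemble is reliable by construction.
ens = rng.normal(size=(n_cases, n_members))
obs = rng.normal(size=n_cases)

# Rank of the observation within each ensemble (0 .. n_members):
# the number of members falling below the observed value.
ranks = (ens < obs[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=n_members + 1)

# For a reliable ensemble each of the n_members + 1 ranks is equally
# likely, so every entry should be near 1/11 ≈ 0.091.
print(hist / n_cases)
```

A U-shaped histogram would indicate an under-dispersive ensemble and a dome-shaped one over-dispersion; the abstract's point is that this diagnostic remains valid when members are merely exchangeable rather than independent draws.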
Abstract:
The development of effective environmental management plans and policies requires a sound understanding of the driving forces involved in shaping and altering the structure and function of ecosystems. However, driving forces, especially anthropogenic ones, are defined and operate at multiple administrative levels, which do not always match ecological scales. This paper presents an innovative methodology of analysing drivers of change by developing a typology of scale sensitivity of drivers that classifies and describes the way they operate across multiple administrative levels. Scale sensitivity varies considerably among drivers, which can be classified into five broad categories depending on the response of ‘evenness’ and ‘intensity change’ when moving across administrative levels. Indirect drivers tend to show low scale sensitivity, whereas direct drivers show high scale sensitivity, as they operate in a non-linear way across the administrative scale. Thus policies addressing direct drivers of change, in particular, need to take scale into consideration during their formulation. Moreover, such policies must have a strong spatial focus, which can be achieved either by encouraging local–regional policy making or by introducing high flexibility in (inter)national policies to accommodate increased differentiation at lower administrative levels. High quality data is available for several drivers, however, the availability of consistent data at all levels for non-anthropogenic drivers is a major constraint to mapping and assessing their scale sensitivity. This lack of data may hinder effective policy making for environmental management, since it restricts the ability to fully account for scale sensitivity of natural drivers in policy design.
Abstract:
The article examines the customary international law credentials of the humanitarian law rules proposed by the International Committee of the Red Cross (ICRC) in 2005. It relies on the BIICL/Chatham House analysis as a 'constructive comment' on the methodology of the ICRC study, and on the rules formed as a result of that methodology with respect to the dead and missing, as an aid to determining their customary law status. It shows that most of the rules studied have a customary international law pedigree which conforms to the conclusions reached on the rules generally in the Wilmshurst and Breau study. However, the rules with respect to return of personal effects, recording the location of graves and notification of relatives of access to gravesites do not seem to have, even on a majoritarian/deductive approach, enough volume of state practice to establish them as customary with respect to civilians.
Abstract:
As satellite technology develops, satellite rainfall estimates are likely to become ever more important in the world of food security. It is therefore vital to be able to identify the uncertainty of such estimates and for end users to be able to use this information in a meaningful way. This paper presents new developments in the methodology of simulating satellite rainfall ensembles from thermal infrared satellite data. Although the basic sequential simulation methodology has been developed in previous studies, it was not suitable for use in regions with more complex terrain and limited calibration data. Developments in this work include the creation of a multithreshold, multizone calibration procedure, plus investigations into the causes of an overestimation of low rainfall amounts and the best way to take into account clustered calibration data. A case study of the Ethiopian highlands has been used as an illustration.