14 results for Long-term Follow-up

in CentAUR: Central Archive, University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Huntington's disease (HD) is a fatal autosomal dominant neurodegenerative disease involving progressive motor, cognitive and behavioural decline, leading to death approximately 20 years after motor onset. The disease is characterised pathologically by early and progressive striatal neuronal cell loss and atrophy, which provided the rationale for the first clinical trials of neural repair using fetal striatal cell transplantation. Between 2000 and 2003, the 'NEST-UK' consortium carried out bilateral transplants of human fetal striatal tissue in five HD patients. This paper describes the long-term follow-up, over a 3-10-year postoperative period, of the grafted and non-grafted patients recruited to this cohort, using the 'Core assessment program for intracerebral transplantations-HD' assessment protocol. No significant differences were found over time between the grafted and non-grafted patients on any subscore of the Unified Huntington's Disease Rating Scale, nor on the Mini Mental State Examination. There was a trend towards a slowing of progression on some timed motor tasks in four of the five patients with transplants, but overall the trial showed no significant benefit of striatal allografts in comparison with a reference cohort of patients without grafts. Importantly, no significant adverse or placebo effects were seen. Notably, the raclopride positron emission tomography (PET) signal in individuals with transplants indicated that there was no obvious surviving striatal graft tissue. This study concludes that fetal striatal allografting in HD is safe. While no sustained functional benefit was seen, we conclude that this may relate to the small amount of tissue that was grafted in this safety study compared with other reports of more successful transplants in patients with HD.

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To clarify the role of growth monitoring in primary school children, including obesity, and to examine issues that might impact on the effectiveness and cost-effectiveness of such programmes. Data sources: Electronic databases were searched up to July 2005. Experts in the field were also consulted. Review methods: Data extraction and quality assessment were performed on studies meeting the review's inclusion criteria. The performance of growth monitoring in detecting disorders of stature and obesity was evaluated against National Screening Committee (NSC) criteria. Results: In the 31 studies included in the review, there were no controlled trials of the impact of growth monitoring and no studies of the diagnostic accuracy of different methods for growth monitoring. Analysis of the studies that presented a 'diagnostic yield' of growth monitoring suggested that one-off screening might identify between 1:545 and 1:1793 new cases of potentially treatable conditions. Economic modelling suggested that growth monitoring is associated with health improvements [incremental cost per quality-adjusted life-year (QALY) of £9500] and indicated that monitoring was cost-effective 100% of the time over the given probability distributions for a willingness-to-pay threshold of £30,000 per QALY. Studies of obesity focused on the performance of body mass index against measures of body fat. A number of issues relating to the human resources required for growth monitoring were identified, but data on attitudes to growth monitoring were extremely sparse. Preliminary findings from economic modelling suggested that primary prevention may be the most cost-effective approach to obesity management, but the model incorporated a great deal of uncertainty. Conclusions: This review has indicated the potential utility and cost-effectiveness of growth monitoring in terms of increased detection of stature-related disorders. It has also pointed strongly to the need for further research. Growth monitoring does not currently meet all NSC criteria. However, it is questionable whether some of these criteria can be meaningfully applied to growth monitoring, given that short stature is not a disease in itself but is used as a marker for a range of pathologies and as an indicator of general health status. Identification of effective interventions for the treatment of obesity is likely to be considered a prerequisite to any move from monitoring to a screening programme designed to identify individual overweight and obese children. Similarly, further long-term studies of the predictors of obesity-related co-morbidities in adulthood are warranted. A cluster randomised trial comparing growth monitoring strategies with no growth monitoring in the general population would most reliably determine the clinical effectiveness of growth monitoring. Studies of diagnostic accuracy, alongside evidence of effective treatment strategies, could provide an alternative approach. In this context, careful consideration would need to be given to target conditions and intervention thresholds. Diagnostic accuracy studies would require long-term follow-up of both short and normal-stature children to determine the sensitivity and specificity of growth monitoring.
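
As a worked illustration of the cost-effectiveness comparison above (an incremental cost of £9500 per QALY judged against a willingness-to-pay threshold of £30,000 per QALY), the following Python sketch computes the ICER and a simple probabilistic check; the cost and QALY distributions are illustrative assumptions, not the review's model.

```python
import numpy as np

rng = np.random.default_rng(42)
WTP = 30_000  # willingness-to-pay threshold, GBP per QALY (quoted in the review)

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return delta_cost / delta_qaly

# Point estimate quoted above: roughly GBP 9500 per QALY gained.
print("ICER point estimate:", icer(9_500.0, 1.0))

# Illustrative probabilistic sensitivity analysis: sample incremental costs and
# QALYs from assumed distributions and report how often monitoring falls below
# the GBP 30,000/QALY threshold (the review reported 100% over its distributions).
n = 10_000
delta_cost = rng.normal(9_500, 2_000, n)            # assumed cost uncertainty
delta_qaly = rng.normal(1.0, 0.25, n).clip(0.05)    # assumed QALY uncertainty
prob_cost_effective = np.mean(delta_cost / delta_qaly < WTP)
print(f"Probability cost-effective at GBP {WTP:,}/QALY: {prob_cost_effective:.2f}")
```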

Relevance:

100.00%

Publisher:

Abstract:

We examine how the development of three types of career capital (knowing how, knowing whom, and knowing why) during an international assignment affects the perceived marketability of organizational expatriates. Using the perceived marketability perspective and long-term follow-up data, we show that knowing how is seen as the most transferable type of career capital, while the development of other aspects of career capital has little impact on perceived marketability. We also show that career capital development is more recognized in the external market than by current employers. Our findings expand our understanding of long-term career marketability among people who have completed international assignments.

Relevance:

100.00%

Publisher:

Abstract:

The sustainability of cereal/legume intercropping was assessed by monitoring trends in grain yield, soil organic C (SOC) and soil extractable P (Olsen method) measured over 13 years in a long-term field trial on a P-deficient soil in semi-arid Kenya. Goat manure was applied annually for 13 years at 0, 5 and 10 t ha⁻¹; trends in grain yield were not identifiable because of season-to-season variation. SOC and Olsen P increased for the first seven years of manure application and then remained constant. The residual effect of manure applied for four years only lasted another seven to eight years when assessed by yield, SOC and Olsen P. Mineral fertilizers provided the same annual rates of N and P as 5 t ha⁻¹ of manure and initially gave the same yield as manure, declining after nine years to about 80%. Therefore, manure applications could be made intermittently and nutrient requirements topped up with fertilizers. Grain yields for sorghum with continuous manure were described well by correlations with rainfall and manure input only if data were excluded for seasons with over 500 mm of rainfall. A comprehensive simulation model should correctly describe crop losses caused by excess water.
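
The final point, that sorghum grain yield under continuous manuring correlates well with seasonal rainfall and manure input once seasons with over 500 mm of rainfall are excluded, can be illustrated with an ordinary least-squares fit. A minimal Python sketch follows; the season records are hypothetical placeholders, not the trial data.

```python
import numpy as np

# Hypothetical season records: seasonal rainfall (mm), manure rate (t/ha), grain yield (t/ha).
seasons = np.array([
    [210, 0, 0.4], [340, 0, 0.9], [280, 5, 1.3], [420, 5, 1.8],
    [190, 10, 1.1], [460, 10, 2.4], [620, 5, 1.5], [550, 10, 1.7],
])

# Exclude very wet seasons (>500 mm), where excess water caused crop losses that a
# rainfall-plus-manure correlation cannot capture.
kept = seasons[seasons[:, 0] <= 500]

# Ordinary least squares: yield ~ intercept + rainfall + manure rate.
X = np.column_stack([np.ones(len(kept)), kept[:, 0], kept[:, 1]])
y = kept[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, rainfall coefficient, manure coefficient:", coef)
```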

Relevance:

100.00%

Publisher:

Abstract:

Land plants have had the reputation of being problematic for DNA barcoding for two general reasons: (i) the standard DNA regions used in algae, animals and fungi have exceedingly low levels of variability, and (ii) the typically used land plant plastid phylogenetic markers (e.g. rbcL, trnL-F) appear to have too little variation. However, no one has assessed how well current phylogenetic resources might work in the context of identification (versus phylogeny reconstruction). In this paper, we make such an assessment, particularly with two of the markers commonly sequenced in land plant phylogenetic studies, plastid rbcL and the internal transcribed spacers of nuclear ribosomal DNA (ITS), and find that both of these DNA regions perform well, even though the data currently available in GenBank/EBI were not produced to be used as barcodes and BLAST searches are not an ideal tool for this purpose. These results bode well for the use of even more variable regions of plastid DNA (for example, psbA-trnH) as barcodes once they have been widely sequenced. In the short term, efforts to bring land plant barcoding up to the standards now being used in other organisms should make swift progress. There are two categories of DNA barcode users: scientists in fields other than taxonomy, and taxonomists. For the former, the use of mitochondrial and plastid DNA, the two most easily assessed genomes, is at least in the short term a useful tool that permits them to get on with studies that depend on knowing roughly which species or species groups they are dealing with. However, these same DNA regions have important drawbacks for use in taxonomic studies (i.e. studies designed to elucidate species limits): DNA markers from uniparentally (usually maternally) inherited genomes can only provide half of the story required to improve the taxonomic standards being used in DNA barcoding. In the long term, we will need to develop more sophisticated barcoding tools: multiple, low-copy nuclear markers with sufficient genetic variability and PCR reliability, which would permit the detection of hybrids and allow researchers to identify the 'genetic gaps' that are useful in assessing species limits.
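
As a concrete illustration of identification (as opposed to phylogeny reconstruction), the sketch below queries a single rbcL or ITS sequence against GenBank with a remote BLAST+ search and prints the top hits. It assumes the blastn executable is installed and that a file named query.fasta exists; both are assumptions for illustration, not part of the paper.

```python
import subprocess

# Remote megablast of one barcode sequence (e.g. rbcL or ITS) against GenBank's
# nucleotide database; tabular output: query id, subject id, % identity, e-value, title.
result = subprocess.run(
    [
        "blastn", "-task", "megablast",
        "-query", "query.fasta",          # assumed input FASTA with one sequence
        "-db", "nt", "-remote",           # query NCBI's nt database over the network
        "-outfmt", "6 qseqid sseqid pident evalue stitle",
        "-max_target_seqs", "5",
    ],
    capture_output=True, text=True, check=True,
)

# Treat the best-scoring hits as a *tentative* identification only: as noted above,
# GenBank entries were not generated as barcodes, so hits need critical checking.
for line in result.stdout.strip().splitlines():
    qseqid, sseqid, pident, evalue, stitle = line.split("\t", 4)
    print(f"{pident:>6}% identity  e={evalue}  {stitle}")
```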

Relevance:

100.00%

Publisher:

Abstract:

The suitability of cryopreservation for the secure, long-term storage of the rare and endangered species Cosmos atrosanguineus was investigated. Using encapsulation/dehydration of shoot tips in alginate strips, survival rates of up to 100% and shoot regeneration of up to 35% were achieved. Light and electron microscopy studies indicated that cellular damage to some regions of the shoot tip during the freeze/thaw procedure was high, although cell survival in and around the meristematic region allowed shoot tip regeneration. The genetic fingerprinting technique of amplified fragment length polymorphisms (AFLPs) showed that no detectable genetic variation was present between material of C. atrosanguineus at the time of initiation into tissue culture and material that had been cryopreserved, stored in liquid nitrogen for 12 months and regenerated. Weaned plantlets grown under glasshouse conditions exhibited no morphological variation from non-frozen controls.

Relevance:

100.00%

Publisher:

Abstract:

Cryopreservation using encapsulation-dehydration was developed for the long-term conservation of cocoa (Theobroma cacao L.) germplasm. Survival of individually encapsulated somatic embryos after desiccation and cryopreservation was achieved through optimization of cryoprotectants (abscisic acid (ABA) and sugar), the duration of osmotic and evaporative dehydration, and the embryo development stage. Up to 63% of early-cotyledonary somatic embryos of the genotype SPA4 survived cryopreservation following a 7-day preculture with 1 M sucrose and 4 h of silica exposure (16% bead moisture content). This optimized protocol was successfully applied to three other genotypes, EET272, IMC14 and AMAZ12, with recovery frequencies of 25, 40 and 72%, respectively (the latter two genotypes using 0.75 M sucrose). Recovered SPA4 somatic embryos converted to plants at a rate of 33%, and the regenerated plants were phenotypically comparable to non-cryopreserved somatic embryo-derived plants.

Relevance:

100.00%

Publisher:

Abstract:

Explosive volcanic eruptions cause episodic negative radiative forcing of the climate system. Using coupled atmosphere-ocean general circulation models (AOGCMs) subjected to historical forcing since the late nineteenth century, previous authors have shown that each large volcanic eruption is associated with a sudden drop in ocean heat content and sea level, from which the subsequent recovery is slow. Here we show that this effect may be an artefact of experimental design, caused by the AOGCMs not having been spun up to a steady state with volcanic forcing before the historical integrations begin. Because volcanic forcing has a long-term negative average, a cooling tendency is imposed on the ocean in the historical simulation. We recommend that an extra experiment be carried out in parallel to the historical simulation, with constant time-mean historical volcanic forcing, in order to correct for this effect and avoid misinterpretation of ocean heat content changes.
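
A minimal sketch of the recommended correction: run a parallel experiment with constant time-mean historical volcanic forcing and subtract its drift from the historical ocean heat content series. The time series below are synthetic placeholders, not AOGCM output.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1860, 2006)

# Synthetic stand-ins for annual-mean ocean heat content anomalies (arbitrary units):
# a historical run that cools because volcanic forcing was absent from the spin-up,
# and a parallel run with constant time-mean volcanic forcing, which drifts towards
# the new equilibrium and so captures that artificial cooling.
control_drift = -0.5 * (1.0 - np.exp(-(years - years[0]) / 60.0))
historical = control_drift + 0.003 * (years - years[0]) + rng.normal(0, 0.02, years.size)
control = control_drift + rng.normal(0, 0.02, years.size)

# Drift correction: subtract the (smoothed) parallel-run drift from the historical run,
# so the remaining OHC changes reflect transient forcings rather than the cold start.
window = 11
kernel = np.ones(window) / window
smooth_control = np.convolve(control, kernel, mode="same")
corrected = historical - smooth_control

print("uncorrected OHC trend (per yr):", np.polyfit(years, historical, 1)[0])
print("corrected OHC trend   (per yr):", np.polyfit(years, corrected, 1)[0])
```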

Relevance:

100.00%

Publisher:

Abstract:

The recent solar minimum was the longest and deepest of the space age, with the lowest average sunspot numbers for nearly a century. The Sun appears to be exiting a grand solar maximum (GSM) of activity which has persisted throughout the space age, and is headed into a significantly quieter period. Indeed, initial observations of solar cycle 24 (SC24) continue to show a relatively low heliospheric magnetic field strength and sunspot number (R), even though the average latitude of sunspots and the inclination of the heliospheric current sheet show that the rise to solar maximum is well underway. We extrapolate the available SC24 observations forward in time by assuming that R will continue to follow a similar form to previous cycles, despite the end of the GSM, and predict a very weak cycle 24, with R peaking at ∼65–75 around the middle to end of 2012. Similarly, we estimate that the heliospheric magnetic field strength will peak around 6 nT. We estimate that average galactic cosmic ray fluxes above 1 GV rigidity will be ∼10% higher in SC24 than in SC23 and that the probability of a large SEP event during this cycle is 0.8, compared with 0.5 for SC23. Comparison of the SC24 R estimates with previous ends of GSMs inferred from 9300 years of cosmogenic isotope data places the current evolution of the Sun and heliosphere in the lowest 5% of cases, suggesting Maunder Minimum conditions are likely within the next 40 years.
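
One way to "assume R will continue to follow a similar form to previous cycles" is to fit a standard solar-cycle shape function, such as that of Hathaway et al. (1994), to the early-cycle observations and extrapolate it. The sketch below does this with hypothetical monthly sunspot numbers and is an illustration of the idea rather than the paper's actual method.

```python
import numpy as np
from scipy.optimize import curve_fit

def cycle_shape(t, a, b, c=0.71):
    """Hathaway et al. (1994) solar-cycle shape function; t in months since minimum."""
    return a * t**3 / (np.exp((t / b) ** 2) - c)

# Hypothetical smoothed monthly sunspot numbers for the first 30 months of a cycle.
months = np.arange(0, 30, dtype=float)
rng = np.random.default_rng(1)
observed_R = cycle_shape(months, 9e-4, 55.0) + rng.normal(0, 3, months.size)

# Fit the rising phase, then extrapolate the full cycle to estimate peak R and its timing.
popt, _ = curve_fit(cycle_shape, months, observed_R, p0=[5e-4, 60.0])
full_cycle = np.arange(0, 132, dtype=float)
predicted = cycle_shape(full_cycle, *popt)
print(f"predicted peak R ~ {predicted.max():.0f} at month {int(predicted.argmax())}")
```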

Relevance:

100.00%

Publisher:

Abstract:

The UK has a target for an 80% reduction in CO2 emissions by 2050 from a 1990 base, and domestic energy use accounts for around 30% of total emissions. This paper presents a comprehensive review of existing models and modelling techniques and indicates how they might be improved by considering individual buying behaviour. Macro (top-down) and micro (bottom-up) models have been reviewed and analysed. Bottom-up models are found to be able to project technology diffusion owing to their higher resolution, but existing bottom-up models are weak at capturing individual green-technology buying behaviour. Consequently, Markov chains, neural networks and agent-based modelling are proposed as possible methods for incorporating buying behaviour within a domestic energy forecast model. Of the three methods, agent-based models are found to be the most promising, although a successful agent approach requires large amounts of input data. A prototype agent-based model has been developed and tested, demonstrating the feasibility of an agent approach and showing that agent-based modelling is a promising means of predicting the effectiveness of various policy measures.
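
A minimal sketch of the kind of agent-based buying-behaviour model discussed above: households on a grid adopt a green technology with a probability that increases with the number of adopting neighbours and with a policy incentive. The grid, parameters and adoption rule are illustrative assumptions, not the prototype model described in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
size = 30                                    # 30 x 30 grid of households
adopted = rng.random((size, size)) < 0.02    # a few initial adopters
base_p, social_w, incentive = 0.005, 0.03, 0.01   # assumed behavioural parameters

def adopting_neighbours(grid):
    """Count adopting neighbours in the von Neumann neighbourhood (edges wrap)."""
    return sum(np.roll(grid, shift, axis).astype(int)
               for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)])

# Each year a household adopts with a probability combining a baseline, a policy
# incentive, and a social term; once adopted, it stays adopted (the |= update).
for year in range(2025, 2051):
    p_adopt = base_p + incentive + social_w * adopting_neighbours(adopted)
    adopted |= rng.random((size, size)) < p_adopt
    print(year, f"uptake: {adopted.mean():.0%}")
```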

Relevance:

100.00%

Publisher:

Abstract:

More and more households are purchasing electric vehicles (EVs), and this will continue as we move towards a low carbon future. There are various projections of the rate of EV uptake, but all predict an increase over the next ten years. Charging these EVs will produce one of the biggest loads on the low voltage network. To manage the network, we must take into account not only the number of EVs taken up, but also where on the network they are charging and at what time. To simulate the impact on the network of high, medium and low EV uptake (as outlined by the UK government), we present an agent-based model. We initialise the model by assigning EVs to households either at random or according to social influence - that is, a neighbour of an EV owner is more likely to also purchase an EV. Additionally, we examine the effect on network peaks of charging during the day, at night, or a mix of both. The model is applied to a neighbourhood in south-east England using smart meter data (half-hourly electricity readings) and real-life charging patterns from an EV trial. Our results indicate that social influence can increase the peak demand at a local level (street or feeder), meaning that medium EV uptake can create higher peak demand than currently expected.
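
The sketch below illustrates the core of such a simulation: EVs are assigned to households either at random or clustered to mimic social influence, an evening charging profile is added to each EV household's load, and the worst street-level peak is compared. The load data, charger rating and clustering rule are synthetic assumptions, not the smart meter data or EV-trial charging patterns used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n_house, halfhours, street_size = 120, 48, 20       # one feeder, half-hourly, 6 streets
base_load = 0.3 + 0.5 * rng.random((n_house, halfhours))   # synthetic household load, kW

def assign_evs(n_evs, social):
    """Pick EV households uniformly at random, or clustered to mimic social influence."""
    if not social:
        return rng.choice(n_house, n_evs, replace=False)
    seed = int(rng.integers(n_house))
    return (seed + np.arange(n_evs)) % n_house       # crude clustering of adopters

def charging_profile(window):
    """3.5 kW charging for 4 hours, starting in a random half-hour within `window`."""
    start = int(rng.integers(*window))
    profile = np.zeros(halfhours)
    profile[start:start + 8] = 3.5
    return profile

def worst_street_peak(n_evs, social, window=(34, 40)):   # evening charging window
    load = base_load.copy()
    for h in assign_evs(n_evs, social):
        load[h] += charging_profile(window)
    streets = load.reshape(n_house // street_size, street_size, halfhours).sum(axis=1)
    return streets.max()                              # worst street-level half-hour peak, kW

for social in (False, True):
    label = "social-influence placement" if social else "random placement"
    print(f"{label}: worst street peak ~ {worst_street_peak(30, social):.0f} kW")
```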

Relevance:

100.00%

Publisher:

Abstract:

The long-term changes in the main tidal constituents (O1, K1, M2, N2 and S2) along the coasts of China and in the adjacent seas are investigated based on 17 tide-gauge records covering the period 1954–2012. The observed 18.61-year nodal modulations of the diurnal constituents O1 and K1 are in agreement with equilibrium tidal theory, except in the South China Sea. The observed modulations of the M2 and N2 amplitudes are smaller than theoretically predicted at the northern stations and larger at the southern stations. The discrepancies between the theoretically predicted nodal variations and the observations are discussed. The 8.85-year perigean cycle is identifiable in the N2 parameters at most stations, except those in the South China Sea. The radiational component of S2 contributes on average 16% of the observed S2, except in the Gulf of Tonkin, on the south coast, where it accounts for up to 65%. We confirm the existence of nodal modulation in S2, which is stronger on the north coast. The semidiurnal tidal parameters show significant secular trends in the Bohai and Yellow Seas, on the north coast, and in the Taiwan Strait. The largest increase is found for M2, whose amplitude increases by 4–7 mm/yr in the Yellow Sea. The potential causes of the linear trends in the tidal constants are discussed.
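
Separating the 18.61-year nodal modulation from a secular trend in a constituent's amplitude can be illustrated with a simple least-squares fit of a linear trend plus a sinusoid at the nodal period; the yearly M2 amplitudes below are synthetic, not the tide-gauge data.

```python
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1954, 2013, dtype=float)
T_NODAL = 18.61   # lunar nodal period, years

# Synthetic yearly M2 amplitude (mm): secular trend + nodal modulation + noise.
true_trend = 5.0  # mm/yr, of the order of the Yellow Sea result quoted above
amp = (1200 + true_trend * (years - years[0])
       + 20 * np.cos(2 * np.pi * (years - 1969.3) / T_NODAL)
       + rng.normal(0, 8, years.size))

# Least-squares fit of intercept, linear trend, and a sinusoid at the nodal period,
# so the 18.61-year modulation is not mistaken for part of the secular trend.
omega = 2 * np.pi / T_NODAL
X = np.column_stack([np.ones_like(years), years - years[0],
                     np.cos(omega * years), np.sin(omega * years)])
coef, *_ = np.linalg.lstsq(X, amp, rcond=None)
nodal_amplitude = np.hypot(coef[2], coef[3])
print(f"fitted trend ~ {coef[1]:.1f} mm/yr, nodal modulation ~ {nodal_amplitude:.0f} mm")
```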

Relevance:

100.00%

Publisher:

Abstract:

A one-dimensional surface energy-balance lake model, coupled to a thermodynamic model of lake ice, is used to simulate variations in the temperature of, and evaporation from, three Estonian lakes: Karujärv, Viljandi and Kirjaku. The model is driven by daily climate data, derived by cubic-spline interpolation from monthly mean data, and was run for periods of 8 years (Kirjaku) up to 30 years (Viljandi). Simulated surface water temperature is in good agreement with observations: mean differences between simulated and observed temperatures range from −0.8°C to +0.1°C. The simulated duration of snow and ice cover is comparable with that observed. However, the model generally underpredicts ice thickness and overpredicts snow depth. Sensitivity analyses suggest that the model results are robust across a wide range (0.1–2.0 m⁻¹) of lake extinction coefficient: surface temperature differs by less than 0.5°C between the extreme values of the extinction coefficient. The model results are more sensitive to the snow and ice albedos. However, changing the snow (0.2–0.9) and ice (0.15–0.55) albedos within realistic ranges does not improve the simulations of snow depth and ice thickness. The underestimation of ice thickness is correlated with the overestimation of snow cover, since a thick snow layer insulates the ice and limits ice formation. The overestimation of snow cover results from the assumption that all simulated winter precipitation falls as snow, a direct consequence of using daily climate data derived by interpolation from mean monthly data.
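
The cubic-spline interpolation of monthly mean climate data to daily values can be sketched as below with scipy; the monthly temperatures are placeholder values, and a simple interpolating spline does not exactly preserve the monthly means (a mean-preserving scheme would be needed for a faithful reproduction of the forcing).

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Placeholder monthly mean air temperatures (degC) assigned to mid-month days of year.
monthly_t = np.array([-5.5, -6.0, -2.0, 4.0, 10.5, 15.0,
                      17.5, 16.0, 11.0, 6.0, 0.5, -3.5])
mid_month = np.array([15, 45, 74, 105, 135, 166, 196, 227, 258, 288, 319, 349], dtype=float)

# Periodic cubic spline through the monthly values: repeat the first point one year on
# so the curve and its derivatives match across the year boundary.
spline = CubicSpline(np.append(mid_month, mid_month[0] + 365),
                     np.append(monthly_t, monthly_t[0]),
                     bc_type="periodic")

days = np.arange(1, 366, dtype=float)
daily_t = spline(days)   # days before mid-January are filled by periodic extrapolation
print(f"January daily temperatures span {daily_t[:31].min():.1f} to {daily_t[:31].max():.1f} degC")
```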