386 results for Tadhg Barry


Relevance:

10.00%

Publisher:

Abstract:

Carbon and nitrogen stable isotope analysis (SIA) has identified the terrestrial subsidy of freshwater food webs, but relies on differing 13C fractionation in aquatic and terrestrial primary producers. However, dissolved inorganic carbon (DIC) is partly derived from respiration of 13C-depleted terrestrial C and from 'old' C released by weathering of catchment geology. SIA thus fails to differentiate between the contributions of old and recently fixed terrestrial C. In alkaline lakes, DIC is partially derived from weathering of 14C-free carbonaceous bedrock. This yields an artificial age offset, leading samples to appear significantly older than their actual age. Because autochthonous production fixes this 14C-depleted DIC, 14C can be used as a biomarker to identify the proportion of autochthonous C in the food web. With terrestrial C inputs likely to increase, the origin and utilisation of 'old' or 'recent' allochthonous C in the food web can also be determined. Stable isotopes and 14C were measured for biota, particulate organic matter (POM), DIC and dissolved organic carbon (DOC) from Lough Erne, Northern Ireland, a humic but alkaline lake. High winter δ15N values in calanoid zooplankton (δ15N = 24‰) relative to phytoplankton and POM (δ15N = 6‰ and 12‰ respectively) may reflect several microbial trophic levels between terrestrial C and calanoids. Furthermore, winter calanoid 14C ages are consistent with DOC from inflowing rivers (87 and 75 years BP respectively) but not with phytoplankton (355 years BP). Summer calanoid δ13C, δ15N and 14C (312 years BP) indicate greater reliance on phytoplankton. There is also temporal and spatial variation in DIC, DOC and POM C isotopes.
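For context, a minimal sketch of the standard relations underlying such analyses (generic textbook formulas, not equations from the paper): the contribution of terrestrial C to a sample is typically estimated with a two-end-member mixing model, and the 'years BP' values quoted above follow from the measured fraction modern F14C via the Libby mean life.

```latex
% Two-end-member mixing: fraction of terrestrial C in a sample,
% given end-member values for terrestrial and aquatic producers.
\[
f_{\mathrm{terr}} =
  \frac{\delta^{13}\mathrm{C}_{\mathrm{sample}} - \delta^{13}\mathrm{C}_{\mathrm{aquatic}}}
       {\delta^{13}\mathrm{C}_{\mathrm{terrestrial}} - \delta^{13}\mathrm{C}_{\mathrm{aquatic}}}
\]
% Conventional radiocarbon age (years BP) from the fraction modern:
\[
t = -8033 \,\ln F^{14}\mathrm{C}
\]
```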

Relevance:

10.00%

Publisher:

Abstract:

Lake food webs were in the past viewed as being fuelled solely by in-lake primary production, i.e. by photosynthetic plants and algae. This view has changed as exports from terrestrial areas into lakes have been taken into account. Terrestrial carbon in lakes was previously thought to be buried in sediments or exported to the atmosphere; however, recent studies indicate that terrestrial carbon can supplement primary production in some lakes, or in others be the dominant source of production for the lake food web. In this study, radiocarbon has been used in conjunction with stable carbon and nitrogen isotopes to show the utilisation of terrestrial carbon in the food web. The fate of terrestrial carbon in the lake will be discussed, as well as the possible mechanisms by which terrestrial carbon is transferred into, and utilised within, the lake.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents the numerical simulation of the ultimate behaviour of 85 one-way and two-way spanning, laterally restrained concrete slabs of varying thickness, span, reinforcement ratio, strength and boundary conditions, reported in the literature by different authors. The developed numerical model is described and all of its assumptions are set out. ABAQUS, a finite element analysis (FEA) suite, was employed, using its non-linear implicit static (general) analysis method; other analysis methods, such as explicit dynamic analysis and the Riks method, are also discussed in general terms of their application. The aim is to demonstrate the ability and efficacy of FEA in simulating the ultimate load behaviour of slabs across different material properties and boundary conditions. The authors present a numerical model that provides consistent predictions of the ultimate behaviour of laterally restrained slabs and that could serve as an alternative to expensive physical testing, as well as supporting the design of new structures and the assessment of existing ones. The enhanced strength of laterally restrained slabs relative to the predictions of conventional design methods is attributed to compressive membrane action (CMA), an inherent phenomenon of laterally restrained concrete beams and slabs. The numerical predictions obtained from the developed model were in good agreement with the experimental results and with those obtained from the CMA method developed at Queen's University Belfast, UK.

Relevance:

10.00%

Publisher:

Abstract:

Background A 2014 national audit used the English General Practice Patient Survey (GPPS) to compare service users' experience of out-of-hours general practitioner (GP) services, yet there is no published evidence on the validity of these GPPS items.

Objectives To establish the construct and concurrent validity of the GPPS items evaluating service users' experience of GP out-of-hours care.

Methods Cross-sectional postal survey of service users (n=1396) of six English out-of-hours providers. Participants reported on four GPPS items evaluating out-of-hours care (three items modified following cognitive interviews with service users) and 14 evaluative items from the Out-of-hours Patient Questionnaire (OPQ). Construct validity was assessed through correlations between any reliable (Cronbach's α>0.7) scales, as suggested by a principal component analysis of the modified GPPS items, with the 'entry access' (four items) and 'consultation satisfaction' (10 items) OPQ subscales. Concurrent validity was determined by investigating whether each modified GPPS item was associated with thematically related items from the OPQ using linear regressions.

Results The modified GPPS item set formed a single scale (α=0.77), which summarised the two-component structure of the OPQ moderately well, explaining 39.7% of the variation in the 'entry access' scores (r=0.63) and 44.0% of the variation in the 'consultation satisfaction' scores (r=0.66), demonstrating acceptable construct validity. Concurrent validity was verified, as each modified GPPS item was strongly associated with a distinct set of related items from the OPQ.

Conclusions Minor modifications are required to the English GPPS items evaluating out-of-hours care to improve comprehension by service users. The modified question set was demonstrated to comprise a valid measure of service users' overall satisfaction with the out-of-hours care received. This demonstrates the potential for using as few as four items to benchmark providers and to assist services in identifying, implementing and assessing quality improvement initiatives.
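As a rough illustration of the scale-reliability check reported above (a generic sketch with hypothetical data, not the study's analysis code), Cronbach's α for a k-item scale can be computed as follows:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to four 1-5 rated items (rows = respondents).
scores = np.array([[4, 5, 4, 4],
                   [2, 2, 3, 2],
                   [5, 5, 5, 4],
                   [3, 3, 2, 3],
                   [4, 4, 4, 5]])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # scale deemed reliable if > 0.7
```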

Relevance:

10.00%

Publisher:

Abstract:

Background English National Quality Requirements mandate out-of-hours primary care services to routinely audit patient experience, but do not state how it should be done.

Objectives We explored how providers collect patient feedback data and use it to inform service provision. We also explored staff views on the utility of out-of-hours questions from the English General Practice Patient Survey (GPPS).

Methods A qualitative study was conducted with 31 staff (comprising service managers, general practitioners and administrators) from 11 out-of-hours primary care providers in England, UK. Staff responsible for patient experience audits within their service were sampled and data collected via face-to-face semistructured interviews.

Results Although most providers regularly audited their patients' experiences using patient surveys, many participants expressed a strong preference for additional qualitative feedback. Staff provided examples of small changes to service delivery resulting from patient feedback, but service-wide changes were not instigated. Perceptions that patients lacked sufficient understanding of the urgent care system in which out-of-hours primary care services operate were common, and acted as a barrier to using feedback to enable change. Participants recognised the value of using patient experience feedback to benchmark services, but perceived weaknesses in the out-of-hours items from the GPPS led them to question the validity of using these data for benchmarking in their current form.

Conclusions The lack of clarity around how out-of-hours providers should audit patient experience hinders the utility of the National Quality Requirements. Although surveys were common, patient feedback data had only a limited role in service change. Data derived from the GPPS may be used to benchmark service providers, but refinement of the out-of-hours items is needed.

Relevance:

10.00%

Publisher:

Abstract:

A 94 GHz waveguide Rotman lens is described which can be used to implement an amplitude-comparison monopulse radar. In transmit mode, adjacent dual beam ports are excited with equal amplitude and phase to form a sum radiation pattern; in receive mode, the outputs of the beam port pairs are combined using magic tees to provide a sum and a difference signal, from which an angular error estimate can be calculated for target acquisition and tracking. This approach provides an amplitude-comparison monopulse system that can be scanned in azimuth and has a low component count, with no requirement for phase-shift circuitry in the array feed lines, making it suitable for mm-wave frequencies. A 12-input (beam port), 12-output (array port) lens is designed using CST Microwave Studio, and the predicted results are presented.
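For context, the standard amplitude-comparison monopulse relations assumed here (generic radar textbook formulas, not taken from the paper): the two adjacent beam outputs A and B are combined into sum and difference channels, and the normalised monopulse ratio gives the angular error estimate.

```latex
% Sum and difference of adjacent beam outputs, and the normalised
% monopulse ratio used to estimate the target's offset from boresight.
\[
\Sigma = A + B, \qquad \Delta = A - B,
\qquad \varepsilon \;\propto\; \frac{\Delta}{\Sigma}
\]
```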

Relevance:

10.00%

Publisher:

Abstract:

In this study, we investigate an adaptive decomposition and ordering strategy that automatically divides examinations into difficult and easy sets for constructing an examination timetable. The examinations in the difficult set are considered hard to place and are therefore listed before those in the easy set during the construction process. Moreover, the examinations within each set are ordered using different strategies based on graph colouring heuristics. Initially, all examinations are placed in the easy set. During the construction process, examinations that cannot be scheduled are identified as the ones causing infeasibility and are moved forward into the difficult set to ensure earlier assignment in subsequent attempts, while the examinations that can be scheduled remain in the easy set.

Within the easy set, a new subset called the boundary set is introduced to accommodate shuffling strategies that change the given ordering of examinations. The proposed approach, which incorporates different ordering and shuffling strategies, is evaluated on the Carter benchmark problems. The empirical results show that the performance of our algorithm is broadly comparable to that of existing constructive approaches.
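A minimal sketch of the construction loop described above, assuming a simple clash-based feasibility test; the function and parameter names are illustrative, not the authors' code:

```python
import random

def construct_timetable(exams, conflicts, n_slots, order_key, max_rounds=50):
    """Adaptive decomposition for exam timetabling (a sketch).
    exams: list of exam ids; conflicts: dict id -> set of clashing ids;
    order_key: a graph-colouring heuristic, e.g. largest degree."""
    difficult, easy = [], list(exams)
    for _ in range(max_rounds):
        timetable, unplaced = {}, []
        # Difficult exams are listed before the (heuristically ordered) easy ones.
        for exam in difficult + sorted(easy, key=order_key):
            used = {timetable[e] for e in conflicts[exam] if e in timetable}
            free = [s for s in range(n_slots) if s not in used]
            if free:
                timetable[exam] = free[0]
            else:
                unplaced.append(exam)
        if not unplaced:
            return timetable
        # Promote the exams causing infeasibility for earlier assignment next time.
        for exam in unplaced:
            if exam in easy:
                easy.remove(exam)
                difficult.insert(0, exam)
        # Shuffle a boundary region of the easy set to perturb the given order.
        b = max(1, len(easy) // 4)
        head = easy[:b]
        random.shuffle(head)
        easy[:b] = head
    return None  # no feasible complete timetable found within max_rounds

# Toy instance: a triangle of conflicts plus one free exam, three slots.
conf = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}, 4: set()}
print(construct_timetable([1, 2, 3, 4], conf, 3, order_key=lambda e: -len(conf[e])))
```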

Relevance:

10.00%

Publisher:

Abstract:

Background: High-risk medications are commonly prescribed to older US patients. Currently, less is known about high-risk medication prescribing in other Western countries, including the UK. We measured trends and correlates of high-risk medication prescribing in a subset of the older UK population (community and institutionalized) to inform harm minimization efforts.

Methods: Three cross-sectional samples were drawn from primary care electronic clinical records (UK Clinical Practice Research Datalink, CPRD) in fiscal years 2003/04, 2007/08 and 2011/12, yielding a sample of 13,900 people aged 65 years or over from 504 UK general practices. High-risk medications were defined by the 2012 Beers Criteria, adapted for the UK. Using descriptive statistical methods and regression modelling, we determined the prevalence and correlates of 'any' (drugs prescribed at least once per year) and 'long-term' (drugs prescribed in all quarters of the year) high-risk medication prescribing.

Results: While polypharmacy rates have risen sharply, high-risk medication prevalence remained stable across the decade. A third of older (65+) people were exposed to high-risk medications, but only half of the total prevalence was long-term (any = 38.4 % [95 % CI: 36.3, 40.5]; long-term = 17.4 % [15.9, 19.9] in 2011/12). Long-term, but not any, high-risk medication exposure was associated with older ages (85 years or over). Women and people with a higher polypharmacy burden were at greater risk of exposure; lower socio-economic status was not associated with exposure. Ten drugs/drug classes accounted for most high-risk medication prescribing in 2011/12.

Conclusions: High-risk medication prescribing has not increased over time against a background of increasing polypharmacy in the UK. Half of the patients receiving high-risk medications do so for less than a year. Reducing or optimising the use of a limited number of drugs could dramatically reduce high-risk medication prescribing in older people. Further research is needed to investigate why the oldest old and women are at greater risk. Interventions to reduce high-risk medications may need to target short-term and long-term use separately.
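For reference, a generic prevalence calculation with a normal-approximation confidence interval, plugged with the 'any' exposure figures quoted above (the published interval is wider, reflecting the clustered survey design, so this simple formula is illustrative only):

```python
import math

def prevalence_ci(count: int, n: int, z: float = 1.96):
    """Point prevalence with a normal-approximation 95% confidence interval."""
    p = count / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, p - half_width, p + half_width

# Roughly reproducing the 'any' exposure point estimate: 38.4% of 13,900 people.
p, lo, hi = prevalence_ci(count=round(0.384 * 13900), n=13900)
print(f"{p:.1%} (95% CI: {lo:.1%}, {hi:.1%})")
```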

Relevance:

10.00%

Publisher:

Abstract:

This paper is concerned with the application of an automated hybrid approach to the university timetabling problem. The approach described is based on the nature-inspired artificial bee colony (ABC) algorithm, a biologically inspired optimization method that has been widely applied in recent years to a range of optimization problems, such as job shop scheduling and machine timetabling. Although the approach has proven robust across a range of problems, it is acknowledged in the literature that there exist inefficiencies in its exploration and exploitation abilities, which can lead to slow convergence of the search process. Hence, this paper introduces a variant of the algorithm which utilizes a global best model inspired by particle swarm optimization to enhance global exploration, while hybridizing with the great deluge (GD) algorithm to improve local exploitation. Using this approach, an effective balance between exploration and exploitation is attained. In addition, a traditional local search approach is incorporated within the GD algorithm with the aim of further enhancing the performance of the overall hybrid method. To evaluate the performance of the proposed approach, two diverse university timetabling datasets are investigated: Carter's examination timetabling and Socha's course timetabling datasets. It should be noted that the two problems have differing complexity and different solution landscapes. Experimental results demonstrate that the proposed method is capable of producing high quality solutions across both benchmark problems, showing a good degree of generality. Moreover, the proposed method produces the best results on some instances when compared with other approaches presented in the literature.
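A hedged sketch of the two ingredients named above: a gbest-guided candidate update of the kind used in PSO-inspired ABC variants, and great-deluge acceptance for a minimisation problem. The function names and parameter values are illustrative assumptions, not the paper's code:

```python
import random

def gbest_candidate(x, partner, gbest, j, phi_max=1.0, psi_max=1.5):
    """Gbest-guided ABC update of dimension j of solution x: the usual
    random-neighbour term plus a pull toward the global best (PSO-inspired)."""
    phi = random.uniform(-phi_max, phi_max)
    psi = random.uniform(0.0, psi_max)
    v = list(x)
    v[j] = x[j] + phi * (x[j] - partner[j]) + psi * (gbest[j] - x[j])
    return v

def great_deluge_accept(cost_new, level, decay):
    """Great deluge: accept any candidate not worse than the water level
    (minimisation); the level is tightened after each acceptance."""
    if cost_new <= level:
        return True, level - decay
    return False, level
```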

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Neutrophil elastase (NE) is a serine protease implicated in the pathogenesis of several respiratory diseases, including cystic fibrosis (CF). The presence of free NE in BAL is a predictor of subsequent bronchiectasis in children with CF (Sly et al, 2013, NEJM 368: 1963-1970). Furthermore, children with higher levels of sputum NE activity (NEa) tend to experience a more rapid decline in FEV1 over time, even after adjusting for age, gender and baseline FEV1 (Sagel et al, 2012, AJRCCM 186: 857-865). Its detection and quantification in biological samples is, however, confounded by a lack of robust methodologies. Standard assays using chromogenic or fluorogenic substrates are not specific when added to complex samples containing multiple proteolytic and hydrolytic enzymes, while ELISA systems measure total protein levels, which can be a mixture of latent, active and protease-inhibitor complexes. We have therefore developed a novel assay (ProteaseTag™ Active NE Immunoassay), which couples an activity-dependent NE-Tag with a specific antibody step, resulting in an assay that is both selective and specific for NEa.

Aims: To clinically validate ProteaseTag™ Active NE for the detection of free NEa in BAL from children with CF.

Methods: A total of 95 paediatric BAL samples [CF (n=76; 44M, 32F), non-CF (n=19; 12M, 7F)] collected through the Study of Host Immunity and Early Lung Disease in CF (SHIELD CF) were analysed for NEa using ProteaseTag™ Active NE (ProAxsis Ltd) and a fluorogenic substrate-based assay utilising Suc-AAPV-AMC (Sigma). IL-8 was measured by ELISA (R&D Systems). Results were analysed to compare free NEa between CF and non-CF samples, alongside correlations with a range of clinical parameters.

Results: NEa measured by ProteaseTag™ Active NE correlated significantly with age (r=0.3, p=0.01) and highly significantly with both IL-8 (r=0.4, p<0.0001) and the absolute neutrophil count (ANC) (r=0.4, p<0.0001). These correlations were not observed when NEa was measured by the substrate assay, even though a significant correlation was found between the two assays (r=0.8, p<0.0001). A trend towards significance was found between NEa in the CF and non-CF groups when measured by ProteaseTag™ Active NE (p=0.07). Highly significant differences were found in the other inflammatory parameters between the two groups (IL-8: p=0.0002; ANC: p=0.006).

Conclusion: The use of NEa as a primary efficacy endpoint in clinical trials, or as a marker of inflammation in the clinic, has been hampered by the lack of a robust and simple-to-use assay. ProteaseTag™ Active NE has been shown to be a specific and superior tool for the measurement of NEa in paediatric CF BAL samples, supported by data from previous studies using adult CF expectorated samples. The technology is currently being transferred to a lateral flow device for use at the point of care.

Acknowledgements: This work was supported by the National Children's Research Centre, Dublin (SHIELD CF) and grants from the Medical Research Council and Cystic Fibrosis Foundation Therapeutics.

Relevance:

10.00%

Publisher:

Abstract:

Generating timetables for an institution is a challenging and time-consuming task, owing to the different demands on the overall structure of the timetable. In this paper, a new hybrid method combining the great deluge and artificial bee colony algorithms (INMGD-ABC) is proposed to address the university timetabling problem. The artificial bee colony algorithm (ABC) is a population-based method introduced in recent years that has proven successful in solving various optimization problems effectively. However, as with many search-based approaches, it has weaknesses in its exploration and exploitation abilities which tend to induce slow convergence of the overall search process. Therefore, hybridization is proposed to compensate for the identified weaknesses of the ABC. In addition, inspired by imperialist competitive algorithms, an assimilation policy is implemented to improve the global exploration ability of the ABC algorithm, and the Nelder–Mead simplex search method is incorporated within the great deluge algorithm (NMGD) with the aim of enhancing the exploitation ability of the hybrid method in fine-tuning the problem search region. The proposed method is tested on two differing benchmark datasets, i.e. examination and course timetabling datasets. A statistical t-test shows that the proposed approach performs significantly better than the basic ABC algorithm. Finally, the experimental results are compared against state-of-the-art methods in the literature; the results obtained are competitive and, in certain cases, achieve some of the current best results in the literature.
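A hedged sketch of the assimilation move borrowed from imperialist competitive algorithms, which pulls a candidate solution toward a better "imperialist" solution; beta = 2 is a value commonly used in the ICA literature, and the example values are illustrative:

```python
import random

def assimilate(colony, imperialist, beta=2.0):
    """Move a colony solution toward its imperialist: each coordinate
    steps a random fraction of beta along the difference vector."""
    return [c + beta * random.random() * (i - c)
            for c, i in zip(colony, imperialist)]

# Example: pull a candidate toward the current best solution.
print(assimilate([0.0, 4.0], [1.0, 2.0]))
```

In practice, the Nelder–Mead refinement step could be delegated to an off-the-shelf routine such as scipy.optimize.minimize(f, x0, method='Nelder-Mead').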

Relevance:

10.00%

Publisher:

Abstract:

Good Laboratory Practice has been a part of non-clinical research for over 40 years. Optimization research, despite many papers discussing standards having been published over the same period, has yet to embrace standards that underpin its research. In this paper we argue the need to adopt standards in optimization research. Building on previous papers, many of which have suggested that the optimization research community should adopt certain standards, we propose a concrete set of recommendations for the community to adopt, and we discuss how these proposals could be progressed.