5 results for adventitious streams

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance: 20.00%

Abstract:

Genetic decoding is not ‘frozen’ as was earlier thought, but dynamic. One facet of this is frameshifting, which often results in synthesis of a C-terminal region encoded by a new frame. Ribosomal frameshifting is utilized for the synthesis of additional products, for regulatory purposes and for translational ‘correction’ of problem or ‘savior’ indels. Utilization for synthesis of additional products occurs prominently in the decoding of mobile chromosomal elements and viral genomes. One class of regulatory frameshifting of stable chromosomal genes governs cellular polyamine levels from yeasts to humans. In many cases of productively utilized frameshifting, the proportion of ribosomes that frameshift at a shift-prone site is enhanced by specific nascent peptide or mRNA context features. Such mRNA signals, which can be 5′ or 3′ of the shift site or both, can act by pairing with ribosomal RNA or as stem-loops or pseudoknots, even with one component being 4 kb 3′ from the shift site. Transcriptional realignment at slippage-prone sequences also generates productively utilized products encoded trans-frame with respect to the genomic sequence. This too can be enhanced by nucleic acid structure. Together with dynamic codon redefinition, frameshifting is one of the forms of recoding that enriches gene expression.
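
For readers unfamiliar with reading frames, the minimal Python sketch below shows how a −1 frameshift at a slippery site yields a trans-frame product: the N-terminus is encoded in the zero frame and the C-terminus in the −1 frame. The mRNA, the shift position and the reduced codon table are hypothetical, chosen only to keep the example self-contained.

    # Toy illustration of -1 programmed ribosomal frameshifting.
    # The mRNA, the shift position and the reduced codon table are hypothetical.
    CODON_TABLE = {"ATG": "M", "GCT": "A", "AAA": "K", "GGG": "G",
                   "CTT": "L", "AGG": "R", "TTG": "L", "TGA": "*"}

    def translate(seq, start=0):
        """Translate codon by codon from `start` until a stop codon or the end."""
        peptide = []
        for i in range(start, len(seq) - 2, 3):
            aa = CODON_TABLE[seq[i:i + 3]]
            if aa == "*":          # stop codon terminates translation
                break
            peptide.append(aa)
        return "".join(peptide)

    mrna = "ATGGCTAAAAAAGGGCTTTGA"  # contains the slippery run AAAAAA
    shift_site = 12                 # hypothetical point where the ribosome slips back 1 nt

    zero_frame = translate(mrna)                      # "MAKKGL": terminated by TGA
    trans_frame = (translate(mrna[:shift_site])       # N-terminal region, zero frame
                   + translate(mrna, shift_site - 1)) # C-terminal region, -1 frame
    print(zero_frame, trans_frame)                    # MAKKGL MAKKRAL

The two products share an N-terminal region but diverge after the shift site, which is the sense in which the C-terminal region is "encoded by a new frame".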

Relevance: 10.00%

Abstract:

Aim: Diabetes is an important barometer of health system performance. This chronic condition is a source of significant morbidity and premature mortality, and a major contributor to health care costs. There is an increasing focus internationally, and more recently nationally, on system-, practice- and professional-level initiatives to promote the quality of care. The aim of this thesis was to investigate the ‘quality chasm’ around the organisation and delivery of diabetes care in general practice, to explore GPs’ attitudes to engaging in quality improvement activities, and to examine efforts to improve the quality of diabetes care in Ireland from practice to policy.

Methods: Quantitative and qualitative methods were used. As part of a mixed-methods sequential design, a postal survey of 600 GPs was conducted to assess the organisation of care. This was followed by an in-depth qualitative study using semi-structured interviews with a purposive sample of 31 GPs from urban and rural areas. The same qualitative methodology was used to examine GPs’ attitudes to engaging in quality improvement. Data were analysed using a Framework approach. A second observational study was used to assess the quality of care in 63 practices with a special interest in diabetes. Data on 3,010 adults with Type 2 diabetes from three primary care initiatives were analysed and the results were benchmarked against national guidelines and standards of care in the UK. The final study was an instrumental case study of policy formulation. Semi-structured interviews were conducted with 15 members of the Expert Advisory Group (EAG) for Diabetes. Thematic analysis was applied to the data using three theories of the policy process as analytical tools.

Results: The survey response rate was 44% (n=262). Results suggested care delivery was largely unstructured: 45% of GPs had a diabetes register (n=157), 53% reported using guidelines (n=140), 30% had a formal call-recall system (n=78) and 24% had none of these organisational features (n=62). Only 10% of GPs had a formal shared protocol with the local hospital specialist diabetes team (n=26). The lack of coordination between settings was identified as a major barrier to providing optimal care, leading to long waiting times, overburdened hospitals and avoidable duplication. The lack of remuneration for chronic disease management had a ripple effect, creating costs for patients and apathy among GPs. There was also a sense of inertia around quality improvement activities, particularly at a national level. This attitude was strongly influenced by previous experiences of change in the health system. In contrast, GPs spoke positively about change at a local level, which was facilitated by practice ethos, leadership and a special interest in diabetes. The second quantitative study found that practices with a special interest in diabetes achieved a standard of care comparable to the UK in terms of the recording of clinical processes of care and the achievement of clinical targets; 35% of patients reached the HbA1c target of <6.5%, compared to 26% in England and Wales. With regard to diabetes policy formulation, the evolving process of action and inaction was best described by Multiple Streams Theory. Within the EAG, the formulation of recommendations was facilitated by overarching agreement on the “obvious” priorities, while the details of proposals were influenced by personal preferences and local capacity. In contrast, the national decision-making process was protracted and ambiguous. The lack of impetus from senior management, coupled with the lack of power conferred on the EAG, impeded progress.

Conclusions: The findings highlight the inconsistency of diabetes care in Ireland. The main barriers to optimal diabetes management centre on the organisation and coordination of care at the systems level, with consequences for practice, providers and patients. Quality improvement initiatives need to stimulate a sense of ownership and interest among frontline service providers to address the local sense of inertia towards national change. To date, quality improvement in diabetes care has been largely dependent on the “special interest” of professionals. The challenge for the Irish health system is to embed this activity as part of routine practice, professional responsibility and the underlying health care culture.

Relevance: 10.00%

Abstract:

Anaerobic digestion (AD) of biodegradable waste is an environmentally and economically sustainable solution which combines waste treatment with energy recovery. The organic fraction of municipal solid waste (OFMSW), which consists mostly of food waste, is highly degradable under anaerobic conditions. Biogas produced from OFMSW, when upgraded to biomethane, is recognised as one of the most sustainable renewable biofuels and can also be one of the cheapest sources of biomethane if a gate fee is associated with the substrate. OFMSW is a complex and heterogeneous material whose characteristics may differ widely depending on the source of origin and the collection system used. The research presented in this thesis investigates the potential energy resource in a wide range of organic waste streams through field and laboratory research on real-world samples. OFMSW samples collected from a range of sources generated methane yields ranging from 75 to 160 m3 per tonne. Higher methane yields are associated with source-segregated food waste from commercial catering premises as opposed to domestic sources. The inclusion of garden waste reduces the specific methane yield of household organic waste. In continuous AD trials it was found that a conventional continuously stirred tank reactor (CSTR) gave the highest specific methane yields at a moderate organic loading rate of 2 kg volatile solids (VS) m-3 digester day-1 and a hydraulic retention time of 30 days. The average specific methane yield obtained at this loading rate in continuous digestion was 560 ± 29 L CH4 kg-1 VS, which exceeded the biomethane potential test result by 5%. The low carbon-to-nitrogen ratio (C:N < 14:1) associated with canteen food waste led to increasing concentrations of volatile fatty acids, in line with high concentrations of ammonia nitrogen, at higher organic loading rates. At an organic loading rate of 4 kg VS m-3 day-1 the specific methane yield dropped considerably (381 L CH4 kg-1 VS), the pH rose to 8.1 and free ammonia (NH3) concentrations reached toxic levels towards the end of the trial (ca. 950 mg L-1). A novel two-phase AD reactor configuration, consisting of a series of sequentially fed leach bed reactors connected to an upflow anaerobic sludge blanket (UASB), demonstrated a high rate of organic matter decay but resulted in lower specific methane yields (384 L CH4 kg-1 VS) than the conventional CSTR system.
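
The free-ammonia figure above is a function of total ammonia nitrogen (TAN), pH and temperature. A minimal sketch follows, assuming the widely used NH4+/NH3 equilibrium expression of Hansen et al. (1998) and a hypothetical TAN concentration (the abstract does not state one); it shows how a pH of 8.1 pushes the unionised fraction toward inhibitory levels.

    # Estimate free ammonia (NH3) from total ammonia nitrogen (TAN), pH and
    # temperature. Equilibrium expression after Hansen et al. (1998); the TAN
    # value is hypothetical, chosen to land near the figure in the abstract.
    def free_ammonia(tan_mg_l, ph, temp_c):
        """Unionised NH3 (mg/L) as a pH- and temperature-dependent fraction of TAN."""
        temp_k = temp_c + 273.15
        pka = 0.09018 + 2729.92 / temp_k            # temperature-dependent pKa of NH4+
        fraction = 1.0 / (1.0 + 10 ** (pka - ph))   # unionised (free) fraction
        return tan_mg_l * fraction

    # Mesophilic digestion at 37 C and pH 8.1 with a hypothetical TAN of 6,800 mg/L:
    print(round(free_ammonia(6800, 8.1, 37)))       # ~945 mg/L, near the ca. 950 mg/L reported

At pH 7.5 the same TAN would give under a third as much free ammonia, which is why the pH rise observed at the higher loading rate matters for toxicity.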

Relevance: 10.00%

Abstract:

Recent years have witnessed rapid growth in the demand for streaming video over the Internet, exposing challenges in coping with heterogeneous device capabilities and varying network throughput. When we couple this rise in streaming with the growing number of portable devices (smartphones, tablets, laptops), we see an ever-increasing demand for high-definition video online while on the move. Wireless networks are inherently characterised by restricted shared bandwidth and relatively high error and loss rates, presenting a challenge for the efficient delivery of high-quality video. Additionally, mobile devices can support, and demand, a range of video resolutions and qualities. This demand for mobile streaming highlights the need for adaptive video streaming schemes that can adjust to the available bandwidth and to device heterogeneity, providing graceful changes in video quality while preserving viewer satisfaction. In this context the use of well-known scalable media streaming techniques, commonly known as scalable coding, is an attractive solution and the focus of this thesis.

In this thesis we investigate the transmission of existing scalable video models over a lossy network and determine how the variation in viewable quality is affected by packet loss. This work focuses on leveraging the benefits of scalable media while reducing the effects of data loss on achievable video quality. The overall approach centres on the strategic packetisation of the underlying scalable video and on how best to utilise error resiliency to maximise viewable quality. In particular, we examine the manner in which scalable video is packetised for transmission over lossy networks and propose new techniques that reduce the impact of packet loss on scalable video by selectively choosing how to packetise the data and which data to transmit. We also exploit redundancy techniques, such as error resiliency, to enhance stream quality by ensuring smooth play-out with fewer changes in achievable video quality.

The contributions of this thesis are new segmentation and encapsulation techniques which increase the viewable quality of existing scalable models by fragmenting and re-allocating the video sub-streams based on user requirements, available bandwidth and variations in loss rates. We offer new packetisation techniques which reduce the effects of packet loss on viewable quality by leveraging an increase in the number of frames per group of pictures (GOP) and by providing equality of data in every packet transmitted per GOP. These provide novel mechanisms for packetisation and error resiliency, as well as new applications for existing techniques such as interleaving and Priority Encoded Transmission. We also introduce three new scalable coding models, which offer a balance between transmission cost and the consistency of viewable quality.
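
To make "equality of data in every packet per GOP" concrete, here is a minimal sketch, not the thesis's actual algorithm: each frame in a GOP is split into n equal slices, and slice i of every frame goes into packet i, so one lost packet costs every frame a small, equal share rather than costing the GOP a whole frame. Frame contents and sizes are placeholders.

    # Illustrative GOP packetisation: every packet carries an equal share of
    # every frame, so one lost packet degrades all frames slightly and evenly.
    # The GOP contents below are placeholders, not real encoded video.
    def packetise_gop(frames, n_packets):
        packets = [[] for _ in range(n_packets)]
        for frame_id, frame in enumerate(frames):
            chunk = -(-len(frame) // n_packets)         # ceiling division: slice size
            for i in range(n_packets):
                part = frame[i * chunk:(i + 1) * chunk]
                packets[i].append((frame_id, i, part))  # (frame, slice index, payload)
        return packets

    gop = [b"I-FRAME-DATA----", b"P1-DATA-", b"P2-DATA-", b"B-DATA--"]
    for payload in packetise_gop(gop, 4):
        print(payload)
    # Losing any one of the four packets removes ~25% of each frame's data,
    # instead of removing one whole frame from the GOP.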

Relevance: 10.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner.

Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?

The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions.

To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
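
As a concrete illustration of the workflow just described, here is a minimal sketch, with hypothetical class, function and field names rather than the platform's actual API, of an analysis step that records its provenance (a hash of the raw input, the technique applied and a timestamp) so that an independent party can re-run the same technique on the same data and check the result.

    # Minimal provenance-recording wrapper: each analysis run logs what it
    # consumed and produced. All names here are hypothetical, for illustration.
    import hashlib
    import json
    import time

    def fingerprint(data: bytes) -> str:
        """Content hash of the raw input, so the exact data can be re-identified."""
        return hashlib.sha256(data).hexdigest()

    class ProvenanceLog:
        def __init__(self):
            self.records = []

        def run(self, technique: str, func, data: bytes):
            """Apply an analysis technique to raw data and record its provenance."""
            result = func(data)
            self.records.append({
                "technique": technique,
                "input_sha256": fingerprint(data),
                "result": repr(result),
                "timestamp": time.time(),
            })
            return result

    log = ProvenanceLog()
    raw = b"eeg-ch1: 12 15 11 9 14"                  # placeholder raw stream sample
    log.run("sample-count", lambda d: len(d.split()) - 1, raw)
    log.run("byte-length", len, raw)                 # a second, independent technique
    print(json.dumps(log.records, indent=2))         # auditable record for third parties

Because each record binds a result to a content hash of the untouched raw data, different techniques can be applied to the same input without one analysis silently contaminating another, which is the transparency property the dissertation argues for.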