91 results for College degree merchandise

in CORA - Cork Open Research Archive - University College Cork - Ireland


Abstract:

In this PhD study, mathematical modelling and optimisation of granola production were carried out. Granola is an aggregated food product used in breakfast cereals and cereal bars. It is a baked, crispy food product typically incorporating oats, other cereals and nuts bound together with a binder, such as honey, water and oil, to form a structured unit aggregate. In this work, the design and operation of two parallel processes to produce aggregate granola products were investigated: i) a high shear mixing granulation stage (in a designated granulator) followed by drying/toasting in an oven; ii) a continuous fluidised bed granulation stage followed by drying/toasting in an oven. In addition, the particle breakage during pneumatic conveying of granola produced by both the high shear granulator (HSG) and the fluidised bed granulator (FBG) processes was examined. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig with two 45° bends and a rig with one 90° bend. The least breakage occurred in the straight pipe, while the most breakage occurred in the 90° bend pipe; breakage in the two 45° bend configuration was lower than in the 90° bend configuration. In general, increasing the impact angle increases the degree of breakage. Additionally, for the granules produced in the HSG, those produced at 300 rpm had the lowest breakage rates while those produced at 150 rpm had the highest, clearly demonstrating the importance of shear history (during granule production) on breakage rates during subsequent processing. For the FBG, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. A population balance model was developed to analyse the particle breakage occurring during pneumatic conveying. The population balance equations governing this breakage process were solved by discretisation, using the Markov chain method. The study found that increasing the air velocity (by increasing the air pressure to the rig) results in increased breakage among granola aggregates. Furthermore, the analysis shows that a greater degree of breakage of granola aggregates occurs as the bend angle increases.
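The discretised population balance described above can be pictured with a small numerical sketch. The snippet below is illustrative only: it divides the granule population into a handful of size classes and treats one conveying pass as a Markov transition, with invented selection probabilities and a uniform daughter distribution rather than the parameters fitted in the thesis.

```python
import numpy as np

# Minimal sketch of a discretised breakage population balance treated as a
# Markov chain over size classes (illustrative numbers, not thesis data).
# 'selection' is the probability that a granule in a class breaks during one
# conveying pass; 'daughters[i, j]' is the fraction of broken mass from class
# i that lands in smaller class j.

n_classes = 5                                          # 0 = finest, 4 = coarsest
selection = np.array([0.0, 0.05, 0.10, 0.20, 0.35])    # breakage probability per pass

daughters = np.zeros((n_classes, n_classes))
for i in range(1, n_classes):
    daughters[i, :i] = 1.0 / i                         # uniform daughter distribution

# One-pass transition matrix: survive intact with probability (1 - S_i),
# otherwise redistribute mass to the daughter classes.
P = np.diag(1.0 - selection) + np.diag(selection) @ daughters

mass = np.array([0.0, 0.0, 0.1, 0.3, 0.6])             # initial mass fractions by class

for _ in range(3):                                     # e.g. three passes through the rig
    mass = mass @ P

print("mass fractions after 3 passes:", np.round(mass, 3))
print("total mass conserved:", np.isclose(mass.sum(), 1.0))
```

Because each row of the transition matrix sums to one, total mass is conserved while repeated application shifts mass from coarse to fine classes, mimicking breakage over successive passes.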

Abstract:

The overall aim of this study was to investigate the extent to which and ways in which Irish relief and development nongovernmental organisations (NGOs) were linked with the concepts of legitimacy and accountability in Irish Times newspaper coverage between 1994 and 2009. This research was based on a quantitative content analysis of 215 Irish Times articles and the results were analysed using statistical methods. Key findings of the research included that NGO accountability received significantly more coverage than NGO legitimacy, "principal-agent" approaches to NGO accountability received significantly more coverage than "stakeholder" approaches to NGO accountability, and questioning of NGOs based on either their accountability or legitimacy was very limited. It is suggested that these findings may indicate both a failure by Irish NGOs to promote "development literacy" and global solidarity among the Irish public, and a limited degree of "development literacy" and global solidarity among the Irish public.

Abstract:

Much work has been done on learning from failure in search to boost the solving of combinatorial problems, such as clause learning and clause weighting in Boolean satisfiability (SAT), nogood and explanation-based learning, and constraint weighting in constraint satisfaction problems (CSPs). Many of the top SAT solvers use clause learning to good effect. A similar approach (nogood learning) has not had as large an impact in CSPs. Constraint weighting is a less fine-grained approach where the information learnt gives an approximation as to which variables may be the sources of greatest contention. In this work we present two methods for learning from search using restarts, in order to identify these critical variables prior to solving. Both methods are based on the conflict-directed heuristic (weighted-degree heuristic) introduced by Boussemart et al. and are aimed at producing a better-informed version of the heuristic by gathering information through restarting and probing of the search space prior to solving, while minimizing the overhead of these restarts. We further examine the impact of different sampling strategies and different measurements of contention, and assess different restarting strategies for the heuristic. Finally, two applications for constraint weighting are considered in detail: dynamic constraint satisfaction problems and unary resource scheduling problems.
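A toy sketch of the restart-based probing idea is given below. It is not the thesis's solver: constraint weights start at one and are bumped whenever a constraint rejects a value during backtracking (a simplification of weighting on domain wipeout), a few short, randomly ordered probing runs with a node cutoff accumulate the weights, and the final search then orders variables by the dom/wdeg ratio. The toy CSP and all names are invented for illustration.

```python
import random

def solve(domains, constraints, weights, cutoff=None):
    """Backtracking search that bumps a constraint's weight whenever it
    rejects a value. Returns an assignment, None, or "cutoff"."""
    assignment, nodes = {}, [0]

    def consistent(var, val):
        for (x, y), pred in constraints.items():
            if var == x and y in assignment and not pred(val, assignment[y]):
                weights[(x, y)] += 1          # learn: this constraint caused a failure
                return False
            if var == y and x in assignment and not pred(assignment[x], val):
                weights[(x, y)] += 1
                return False
        return True

    def wdeg(var):
        return sum(w for (x, y), w in weights.items() if var in (x, y))

    def backtrack():
        nodes[0] += 1
        if cutoff is not None and nodes[0] > cutoff:
            return "cutoff"
        if len(assignment) == len(domains):
            return dict(assignment)
        # dom/wdeg: smallest ratio of remaining domain size to weighted degree
        var = min((v for v in domains if v not in assignment),
                  key=lambda v: len(domains[v]) / max(wdeg(v), 1))
        for val in domains[var]:
            if consistent(var, val):
                assignment[var] = val
                result = backtrack()
                if result is not None:
                    return result
                del assignment[var]
        return None

    return backtrack()

def probe_then_solve(domains, constraints, n_probes=5, cutoff=50):
    """Short, randomly ordered probing runs gather constraint weights before
    the final full search uses them."""
    weights = {c: 1 for c in constraints}
    for _ in range(n_probes):
        shuffled = {v: random.sample(d, len(d)) for v, d in domains.items()}
        solve(shuffled, constraints, weights, cutoff=cutoff)   # restart: keep only weights
    return solve(domains, constraints, weights)                # final run, no cutoff

# Toy problem: three variables, pairwise not-equal constraints.
doms = {"a": [1, 2], "b": [1, 2], "c": [1, 2, 3]}
cons = {("a", "b"): lambda u, v: u != v,
        ("b", "c"): lambda u, v: u != v,
        ("a", "c"): lambda u, v: u != v}
print(probe_then_solve(doms, cons))
```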

Abstract:

There is much common ground between the areas of coding theory and systems theory. Fitzpatrick has shown that a Gröbner basis approach leads to efficient algorithms in the decoding of Reed-Solomon codes and in scalar interpolation and partial realization. This thesis simultaneously generalizes and simplifies that approach and presents applications to discrete-time modeling, multivariable interpolation and list decoding. Gröbner basis theory has come into its own in the context of software and algorithm development. By generalizing the concept of polynomial degree, term orders are provided for multivariable polynomial rings and free modules over polynomial rings. The orders are not, in general, unique and this adds, in no small way, to the power and flexibility of the technique. As well as being generating sets for ideals or modules, Gröbner bases always contain an element which is minimal with respect to the corresponding term order. Central to this thesis is a general algorithm, valid for any term order, that produces a Gröbner basis for the solution module (or ideal) of elements satisfying a sequence of generalized congruences. These congruences, based on shifts and homomorphisms, are applicable to a wide variety of problems, including key equations and interpolations. At the core of the algorithm is an incremental step. Iterating this step lends a recursive/iterative character to the algorithm. As a consequence, not all of the input to the algorithm need be available from the start and different "paths" can be taken to reach the final solution. The existence of a suitable chain of modules satisfying the criteria of the incremental step is a prerequisite for applying the algorithm.
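To make the role of term orders concrete, the short SymPy snippet below computes Gröbner bases of the same toy ideal under three standard term orders; the polynomials are arbitrary examples, not taken from the thesis, and SymPy's routine is not the incremental algorithm developed here.

```python
from sympy import groebner, symbols

# The same ideal under three term orders: the basis (and the leading terms
# that drive any subsequent division) change with the order chosen.
x, y = symbols("x y")
polys = [x**2 + y**2 - 1, x*y - 2]

for order in ("lex", "grlex", "grevlex"):
    gb = groebner(polys, x, y, order=order)
    print(order, gb.exprs)
```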

Abstract:

An important component of this Ph.D. thesis was to determine European consumers’ views on processed meats and bioactive compounds. A survey gathered information from over 500 respondents and explored their perceptions of the healthiness and purchasability of both traditional and functional processed meats. This study found that consumers were distrustful of processed meat, especially its high salt and fat content. Consumers were found to be very positive about bioactive compounds in yogurt-style products but unsure about their inclusion in meat-based products, likely due to a lack of familiarity with such products. The work in this thesis also centred on identifying acceptable reductions of salt and fat through consumer sensory analysis. The products chosen ranged in degree of comminution, from a coarse beef patty to finer emulsion-style breakfast sausages and frankfurters. A full factorial design was implemented, which saw the production of twenty beef patty formulations with varying concentrations of fat (30%, 40%, 50%, 60% w/w) and salt (0.5%, 0.75%, 1.0%, 1.25%, 1.5% w/w). Twenty-eight sausage formulations were also produced with varying concentrations of fat (22.5%, 27.5%, 32.5%, 37.5% w/w) and salt (0.8%, 1%, 1.2%, 1.4%, 1.6%, 2%, 2.4% w/w). Finally, twenty different frankfurter formulations were produced with varying concentrations of fat (10%, 15%, 20%, 25% w/w) and salt (1%, 1.5%, 2%, 2.5%, 3% w/w). From these products it was found that the most consumer-acceptable beef patty was that containing 40% fat with a salt level of 1%. This is a 20% decrease in fat and a 50% decrease in salt when compared to commercial patties available in Ireland and the UK. For sausages, salt-reduced products were rated by consumers as paler in colour, more tender and with greater meat flavour than higher-salt products. The sausages containing 1.4% and 1.0% salt were found to be significantly (P<0.01) more acceptable to consumers than other salt levels. Frankfurter salt levels below 1.5% were shown to have a negative effect on consumer acceptability, with the 2.5% salt concentration being the most accepted (P<0.001) by consumers. Samples containing less fat and salt were found to be tougher and less juicy, and had greater cooking losses. Thus salt perception is very important for consumer acceptability, but fat levels can potentially be reduced without significantly affecting overall acceptability. Overall, it can be summarised that the consumer acceptability of salt- and fat-reduced processed meats depends very much on the product, and generalisations cannot be assumed. The study of bioactives in processed meat products found that the reduced salt/fat beef patties fortified with CoQ10 were rated as more acceptable than commercially available patties. The reduced-fat and -salt, as well as the CoQ10-fortified, sausages compared quite well to their commercial counterparts for overall acceptability, whereas commercial frankfurters were found to be more favoured than the reduced-fat and CoQ10-fortified frankfurters.
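As a brief aside on the experimental layout, a full factorial design simply crosses every level of one factor with every level of the other. The toy enumeration below reproduces the 4 x 5 beef patty grid from the fat and salt levels quoted above; the variable names are mine, not the thesis's.

```python
from itertools import product

patty_fat = [30, 40, 50, 60]                # % w/w
patty_salt = [0.5, 0.75, 1.0, 1.25, 1.5]    # % w/w

patty_formulations = list(product(patty_fat, patty_salt))
print(len(patty_formulations), "beef patty formulations")   # 4 x 5 = 20
for fat, salt in patty_formulations[:3]:
    print(f"fat {fat}% w/w, salt {salt}% w/w")
```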

Abstract:

Buried heat sources can be investigated by examining thermal infrared images and comparing these with the results of theoretical models which predict the thermal anomaly a given heat source may generate. Key factors influencing surface temperature include the geometry and temperature of the heat source, the surface meteorological environment, and the thermal conductivity and anisotropy of the rock. In general, a geothermal heat flux of greater than 2% of solar insolation is required to produce a detectable thermal anomaly in a thermal infrared image. A heat source of, for example, 2-300 K greater than the average surface temperature must be at a depth shallower than 50 m for the anomaly to be detectable in a thermal infrared image, for typical terrestrial conditions. Atmospheric factors are of critical importance. While the mean atmospheric temperature has little significance, convection is a dominant factor and can act to swamp the thermal signature entirely. Given a steady-state heat source that produces a detectable thermal anomaly, it is possible to loosely constrain the physical properties of the heat source and surrounding rock, using the surface thermal anomaly as a basis. The success of this technique is highly dependent on the degree to which the physical properties of the host rock are known. Important parameters include the surface thermal properties and thermal conductivity of the rock. Modelling of transient thermal situations was carried out to assess the effect of time-dependent thermal fluxes. One-dimensional finite element models can be readily and accurately applied to the investigation of diurnal heat flow, as with thermal inertia models. Diurnal thermal models of environments on Earth, the Moon and Mars were constructed using finite elements and found to be consistent with published measurements. The heat flow from an injection of hot lava into a near-surface lava tube was also considered. While this approach is useful for study and for long-term monitoring in inhospitable areas, it was found to have little hazard-warning utility, as the time taken for the thermal energy to propagate to the surface in dry rock (several months) is very long. The resolution of the thermal infrared imaging system is an important factor. Presently available satellite-based systems such as Landsat (resolution of 120 m) are inadequate for detailed study of geothermal anomalies. Airborne systems, such as TIMS (variable resolution of 3-6 m), are much more useful for discriminating small buried heat sources. Planned improvements in the resolution of satellite-based systems will broaden the potential for application of the techniques developed in this thesis. It is important to note, however, that adequate spatial resolution is a necessary but not sufficient condition for successful application of these techniques.
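The diurnal heat-flow modelling mentioned above can be mimicked with a very small one-dimensional model. The sketch below uses an explicit finite-difference scheme rather than finite elements, and the diffusivity, boundary temperatures and forcing amplitude are illustrative values rather than the thesis's parameters; it simply shows how a sinusoidal surface temperature cycle is damped with depth.

```python
import numpy as np

kappa = 1.0e-6                       # thermal diffusivity of rock, m^2/s (typical order)
depth, nz = 1.0, 101                 # 1 m column, 101 nodes
dz = depth / (nz - 1)
dt = 0.4 * dz**2 / kappa             # satisfies the explicit stability limit dt <= dz^2 / (2*kappa)
period = 86400.0                     # one day, s

T = np.full(nz, 280.0)               # initial temperature, K
T_mean, T_amp = 280.0, 10.0          # mean and amplitude of the diurnal surface cycle

t = 0.0
while t < 2 * period:                # run for two days
    T[0] = T_mean + T_amp * np.sin(2 * np.pi * t / period)   # surface boundary condition
    T[-1] = T_mean                                           # fixed temperature at depth
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    t += dt

# The amplitude of the diurnal wave decays rapidly with depth (skin-depth behaviour).
for i in (0, 10, 25, 50):
    print(f"z = {i * dz:4.2f} m   T = {T[i]:6.2f} K")
```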

Abstract:

Cream liqueurs manufactured by a one-step process, where alcohol was added before homogenisation, were more stable than those processed by a two-step process, which involved addition of alcohol after homogenisation. Using the one-step process, it was possible to produce creaming-stable liqueurs with one pass through a homogeniser (27.6 MPa) equipped with "liquid whirl" valves. Test procedures to characterise cream liqueurs and to predict shelf life were studied in detail. A turbidity test proved simple, rapid and sensitive for characterising particle size and homogenisation efficiency. Prediction of age thickening/gelation in cream liqueurs during incubation at 45 °C depended on the age of the sample when incubated. Samples that gelled at 45 °C may not do so at ambient temperature. Commercial cream liqueurs were similar in gross chemical composition and, unlike experimentally produced liqueurs, did not exhibit age-gelation at either ambient or elevated temperatures. Solutions of commercial sodium caseinates from different sources varied in their calcium sensitivity. When incorporated into cream liqueurs, caseinates influenced the rate of viscosity increase, coalescence and, possibly, gelation during incubated storage. Mild heat and alcohol treatment modified the properties of caseinate used to stabilise non-alcoholic emulsions, while the presence of alcohol in emulsions was important in preventing clustering of globules. The response to added trisodium citrate varied. In many cases, addition of the recommended level (0.18%) did not prevent gelation. Addition of small amounts of NaOH with 0.18% trisodium citrate before homogenisation was beneficial. The stage at which citrate was added during processing was critical to the degree of viscosity increase (as opposed to gelation) in the product during 45 °C incubation. The component responsible for age-gelation was present in the milk solids non-fat portion of the cream, and variations in the creams used were important in the age-gelation phenomenon. Results indicated that, in addition possibly to Ca2+, the micellar casein portion of the serum may play a role in gelation. The role of the low molecular weight surfactants, sodium stearoyl lactylate and monodiglycerides, in preventing gelation was influenced by the presence of trisodium citrate. Clustering of fat globules and age-gelation were inhibited when 0.18% citrate was included. Inclusion of sodium stearoyl lactylate, but not monodiglycerides, reduced the extent of viscosity increase at 45 °C in citrate-containing liqueurs.

Abstract:

The concept of police accountability is not susceptible to a universal or concise definition. In the context of this thesis it is treated as embracing two fundamental components. First, it entails an arrangement whereby an individual, a minority and the whole community have the opportunity to participate meaningfully in the formulation of the principles and policies governing police operations. Second, it presupposes that those who have suffered as victims of unacceptable police behaviour should have an effective remedy. These ingredients, however, cannot operate in a vacuum. They must find an accommodation with the equally vital requirement that the burden of accountability should not be so demanding that the delivery of an effective police service is fatally impaired. While much of the current debate on police accountability in Britain and the USA revolves around the issue of where the balance should be struck in this accommodation, Ireland lacks the very foundation for such a debate, as it suffers from a serious deficit in research and writing on the police generally. This thesis aims to fill that gap by laying the foundations for an informed debate on police accountability and related aspects of policing in Ireland. Broadly speaking, the thesis contains three major interrelated components. The first is concerned with the concept of the police in Ireland and the legal, constitutional and political context in which it operates. This reveals that although the Garda Siochana is established as a national force, the legal prescriptions concerning its role and governance are very vague. Although a similar legislative format in Britain, and elsewhere, has been interpreted as conferring operational autonomy on the police, it has not stopped successive Irish governments from exercising close control over the police. The second component analyses the structure and operation of the traditional police accountability mechanisms in Ireland, namely the law and the democratic process. It concludes that some basic aspects of the peculiar legal, constitutional and political structures of policing seriously undermine their capacity to deliver effective police accountability. In the case of the law, for example, the status of, and the broad discretion vested in, each individual member of the force ensure that the traditional legal actions cannot always provide redress where individuals or collective groups feel victimised. In the case of the democratic process, the integration of the police into the excessively centralised system of executive government, coupled with the refusal of the Minister for Justice to accept responsibility for operational matters, projects a barrier between the police and their accountability to the public. The third component details proposals on how the current structures of police accountability in Ireland can be strengthened without interfering with the fundamentals of the law, the democratic process or the legal and constitutional status of the police. The key elements in these proposals are the establishment of an independent administrative procedure for handling citizen complaints against the police and the establishment of a network of local police-community liaison councils throughout the country, coupled with a centralised parliamentary committee on the police. While these proposals are analysed from the perspective of maximising the degree of police accountability to the public, they also take into account the need to ensure that the police capacity to deliver an effective police service is not unduly impaired as a result.

Abstract:

This thesis is concerned with an investigation of the anodic behaviour of ruthenium and iridium in aqueous solution and particularly of oxygen evolution on these metals. The latter process is of major interest in the large-scale production of hydrogen gas by the electrolysis of water. The presence of low levels of ruthenium trichloride, ca. 10⁻⁴ mol dm⁻³, in acid solution gives a considerable increase in the rate of oxygen evolution from platinum and gold, but not graphite, anodes. The mechanism of this catalytic effect was investigated using potential step and a.c. impedance techniques. Earlier suggestions that the effect is due to catalysis by metal ions in solution were proved to be incorrect and it was shown that ruthenium species were incorporated into the surface oxide film. Changes in the oxidation state of these ruthenium species are probably responsible for the lowering of the oxygen overvoltage. Both the theoretical and practical aspects of the reaction were complicated by the fact that, at constant potential, the rates of both the catalysed and the uncatalysed oxygen evolution processes exhibit an appreciable, continuous decrease with either time or degree of oxidation of the substrate. The anodic behaviour of iridium in the oxide layer region was investigated using conventional electrochemical techniques such as cyclic voltammetry. Applying a triangular voltage sweep at 10 Hz between 0.01 and 1.50 V increases the amount of electric charge which the surface can store in the oxide region. This activation effect and the mechanism of charge storage are discussed in terms of both an expanded lattice theory for oxide growth on noble metals and a more recent theory of irreversible oxide formation with subsequent stoichiometry changes. The lack of hysteresis between the anodic and cathodic peaks at ca. 0.9 V suggests that the process involved here is proton migration in a relatively thick surface layer, i.e. that the reaction involved is some type of oxide-hydroxide transition. Lack of chloride ion inhibition in the anodic region also supports the irreversible oxide formation theory; however, to account for the hydrogen region of the potential sweep, a compromise theory involving partial reduction of the outer regions of the iridium oxide film is proposed. The loss of charge storage capacity when the activated iridium surface is anodized for a short time above ca. 1.60 V is attributed to loss by corrosion of the outer active layer from the metal surface. The behaviour of iridium at higher anodic potentials in acid solution was also investigated. Current-time curves at constant potential and Tafel plots suggested that a change in the mechanism of the oxygen evolution reaction occurs at ca. 1.8 V. Above this potential, corrosion of the metal occurred, giving rise to an absorbance in the visible spectrum of the electrolyte (λmax = 455 nm). It is suggested that the species involved was Ir(O2)2+. A similar investigation in alkaline electrolyte gave no evidence for a change in mechanism at 1.8 V, and corrosion of the iridium was not observed. Oxygen evolution overpotentials were much lower for iridium than for platinum in both acidic and alkaline solutions.
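As a brief aside on the Tafel analysis referred to above, a Tafel plot is a fit of overpotential against the logarithm of current density, and a change in its slope with potential is the kind of evidence cited for a change in the oxygen evolution mechanism. The snippet below fits a slope to synthetic data generated with an assumed 120 mV per decade slope and an assumed exchange current density; none of the numbers are measurements from the thesis.

```python
import numpy as np

eta = np.linspace(0.25, 0.45, 9)      # overpotential, V (assumed Tafel region)
b_true, i0 = 0.120, 1e-6              # assumed slope (V/decade) and exchange current density (A/cm^2)
i = i0 * 10 ** (eta / b_true)         # ideal Tafel behaviour: eta = b * log10(i / i0)

slope, intercept = np.polyfit(np.log10(i), eta, 1)
print(f"fitted Tafel slope: {slope * 1000:.0f} mV/decade")
print(f"fitted exchange current density: {10 ** (-intercept / slope):.1e} A/cm^2")
```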

Abstract:

This work examines the origins and early history of the Queen's College, Cork. Designedly there is as much stress on the origins as on the early history, for it is the contention of the work that the College was something more than a legislative mushroom. It was very much in the tradition of the civic universities which added an exciting new dimension to academic life in these islands in the nineteenth century. The first chapter surveys university practice and thinking at the opening of the century, relying exclusively on published sources. The second chapter is devoted specifically to the state of learning in Cork during the period, and makes extensive use of hitherto unpublished manuscript material in relation to the Royal Cork Institution. The third chapter deals with the highly significant evidence on education embodied in the Report of the Select Committee on Irish Education of 1838. This material has not previously been published. In chapter four an extended study is made of relevant letters in the manuscript correspondence of Sir Robert Peel - even the most recent authoritative biography has ignored this material. The remaining three chapters are devoted more specifically to the College, both in the formulation of policy and in its practical working. In chapter six there is an extended survey of early College life based exclusively on hitherto unpublished manuscript material in the College Archives. All of these sources, together with incidental published material, are set out at the end of each chapter.

Abstract:

Acute myeloid leukaemia (AML) is the most common form of acute leukaemia in adults. Its treatment has remained largely unchanged for the past 30 years. Chronic myeloid leukaemia (CML) represents a tremendous success story in the era of targeted therapy, but significant challenges remain, including the development of drug resistance and disease persistence due to the presence of CML stem cells. The Aurora family of kinases is essential for cell cycle regulation, and their aberrant expression in cancer prompted the development of small molecules that selectively inhibit their activity. Chapter 2 of this thesis outlines the efficacy and mechanism of action of alisertib, a novel inhibitor of Aurora A kinase, in preclinical models of CML. Alisertib possessed equipotent activity against CML cells expressing unmutated and mutated forms of BCR-ABL. Notably, this agent retained high activity against the T315I and E255K BCR-ABL mutations, which confer the greatest degree of resistance to standard CML therapy. Chapter 3 explores the activity of alisertib in preclinical models of AML. Alisertib disrupted cell viability, diminished clonogenic survival, induced expression of the forkhead box O3 (FOXO3a) targets p27 and BCL-2 interacting mediator (BIM), and triggered apoptosis. A link between Aurora A expression and sensitivity to ara-C was established. Chapter 4 outlines the role of the proto-oncogene serine/threonine-protein (PIM) kinases in resistance to ara-C in AML. We report that the novel small molecule PIM kinase inhibitor SGI-1776 disrupted cell viability and induced apoptosis in AML. We establish a link between ara-C resistance and PIM over-expression. Finally, chapter 5 explores how the preclinical work outlined in this thesis may be translated into clinical studies that may lead to novel therapeutic approaches for patients with refractory myeloid leukaemia.

Abstract:

This research asks the question: “What are the relational dynamics in Masters (MA) supervision?” It does so by focusing on the supervisory relationship itself, dialoguing with the voices of both MA supervisors and supervisees in the Humanities within a Cultural Historical Activity Theory (CHAT) framework. In so doing, this research argues for a re-evaluation of how MA supervision is conceptualised and proposes a new theoretical framework for conceptualising MA supervision as a relational phenomenon. The research design was derived from an Activity Theory-influenced methodology. Data collection procedures included the administration of Activity Theory Logs, individual semi-structured interviews with both supervisors and supervisees and the completion of reflective journals. Grounded Theory was used to analyse the data. The sample for the study consisted of three supervisor-supervisee dyads from three disciplines in the Humanities. Data was collected over the course of one academic year, 2010-2011. This research found that both individual and shared relational dynamics play an important role in MA supervision. Individual dynamics, such as supervisors’ iterative negotiation of ambiguity/clarity and supervisees’ boundary work, revealed that both parties attempt to negotiate a separation between their professional-academic identities and personal identities. However, an inherent paradox emerged when the shared relational dynamics of MA supervision were investigated. It was found that the shared space created by the supervisory relationship did not only exist in a physical setting, but was also psychoactive in nature and held strong emotional resonances for both parties involved. This served to undermine the separation between professional-academic and personal identities. As a result, this research argues that the interaction between the individual and shared relational dynamics in MA supervision enables, for both supervisors and supervisees, a disciplined improvisation of academic identity.

Abstract:

The central research question that this thesis addresses is whether there is a significant gap between fishery stakeholder values and the principles and policy goals implicit in an Ecosystem Approach to Fisheries Management (EAFM). The implications of such a gap for fisheries governance are explored. Furthermore, an assessment is made of what may be practically achievable in the implementation of an EAFM in fisheries in general and in a case study fishery in particular. The research was mainly focused on a particular case study, the Celtic Sea Herring fishery and its management committee, the Celtic Sea Herring Management Advisory Committee (CSHMAC). The Celtic Sea Herring fishery exhibits many aspects of an EAFM and the fish stock has successfully recovered to healthy levels in the past 5 years. However, there are increasing levels of governance-related conflict within the fishery which threaten the future sustainability of the stock. Previous research on EAFM governance has tended to focus either on higher levels of EAFM governance or on individual behaviour, but very little research has attempted to link the two spheres or explore the relationship between them. Two main themes within this study aimed to address this gap. The first was what role governance could play in facilitating EAFM implementation. The second theme concerned the degree of convergence between high-level EAFM goals and stakeholder values. The first method applied was governance benchmarking to analyse systemic risks to EAFM implementation. This found that there are no real EU or national level policies which provide stakeholders or managers with clear targets for EAFM implementation. The second method applied was the use of cognitive mapping to explore stakeholders' understandings of the main ecological, economic and institutional driving forces in the Celtic Sea Herring fishery. The main finding from this was that a long-term outlook can be, and has been, incentivised through a combination of policy drivers and participatory management. However, the fundamental principle of EAFM, accounting for ecosystem linkages rather than target stocks, was not reflected in stakeholders' cognitive maps. This was confirmed in a prioritisation of stakeholders' management priorities using the Analytic Hierarchy Process, which found that the overriding concern is for protection of target stock status but that wider ecosystem health was not a priority for most management participants. The conclusion reached is that moving to sustainable fisheries may be a more complex process than envisioned in much of the literature and may consist of two phases. The first phase is a transition to a long-term but still target stock focused approach. This achievable transition is mainly a strategic change, which can be incentivised by policies and supported by stakeholders. In the Celtic Sea Herring fishery, and an increasing number of global and European fisheries, such transitions have contributed to successful stock recoveries. The second phase, however, implementation of an ecosystem approach, may present a greater challenge in terms of governability, as this research highlights some fundamental conflicts between stakeholder perceptions and values and those inherent in an EAFM. This phase may involve the setting aside of fish for non-valued ecosystem elements and will require either a pronounced mind-set and value change or some strong top-down policy incentives in order to succeed. Fisheries governance frameworks will need to carefully explore the most effective balance between such endogenous and exogenous solutions. This finding of low prioritisation of wider ecosystem elements has implications for rights-based management within an ecosystem approach, regardless of whether those rights are individual or collective.
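For readers unfamiliar with the Analytic Hierarchy Process step mentioned above, the sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks consistency. The three criteria and the comparison values are invented for illustration and are not the study's elicited judgements.

```python
import numpy as np

criteria = ["target stock status", "economic viability", "wider ecosystem health"]
# A[i, j] = how strongly criterion i is preferred over criterion j (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # priority vector

lam_max = eigvals.real[k]
ci = (lam_max - len(A)) / (len(A) - 1)          # consistency index
cr = ci / 0.58                                  # random index for a 3x3 matrix is 0.58

for name, w in zip(criteria, weights):
    print(f"{name:25s} {w:.3f}")
print(f"consistency ratio: {cr:.3f} (acceptable if below 0.1)")
```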

Abstract:

Transition Year (TY) has been a feature of the Irish education landscape for 39 years. Work experience (WE) has become a key component of TY. WE is defined as a module of between five and fifteen days' duration where students engage in a work placement in the broader community. It places a major emphasis on building relationships between schools and their external communities and, concomitantly, between students and their potential future employers. Yet the idea that participation in a TY work experience programme could facilitate an increased awareness of potential careers has drawn little attention from the research community. This research examines the influence WE has on the subsequent subject choices made by students, along with the effects of that experience on the students’ identities and emerging vocational identities. Socio-cultural Learning Theory and Occupational Choice Theory frame the overall study. A mixed methods approach to data collection was adopted through the administration of 323 quantitative questionnaires and 32 individual semi-structured interviews in three secondary schools. The analysis of the data was conducted using a grounded theory approach. The findings from the research show that WE makes a significant contribution to the students’ sense of agency in their own lives. It facilitates the otherwise complex process of subject choice, motivates students to work harder in their senior cycle, introduces them to the concepts of active, experience-based and self-directed learning, while boosting their self-confidence and nurturing the emergence of their personal and vocational identities. This research is a gateway to further study in this field. It also has wide-reaching implications for students, teachers, school authorities, parents and policy makers regarding teaching and learning in our schools and the value of learning beyond the walls of the classroom.

Abstract:

Aim: Diabetes is an important barometer of health system performance. This chronic condition is a source of significant morbidity and premature mortality and a major contributor to health care costs. There is an increasing focus internationally, and more recently nationally, on system, practice and professional-level initiatives to promote the quality of care. The aim of this thesis was to investigate the ‘quality chasm’ around the organisation and delivery of diabetes care in general practice, to explore GPs’ attitudes to engaging in quality improvement activities and to examine efforts to improve the quality of diabetes care in Ireland from practice to policy. Methods: Quantitative and qualitative methods were used. As part of a mixed methods sequential design, a postal survey of 600 GPs was conducted to assess the organisation of care. This was followed by an in-depth qualitative study using semi-structured interviews with a purposive sample of 31 GPs from urban and rural areas. The qualitative methodology was also used to examine GPs’ attitudes to engaging in quality improvement. Data were analysed using a Framework approach. A second, observational study was used to assess the quality of care in 63 practices with a special interest in diabetes. Data on 3010 adults with Type 2 diabetes from three primary care initiatives were analysed and the results were benchmarked against national guidelines and standards of care in the UK. The final study was an instrumental case study of policy formulation. Semi-structured interviews were conducted with 15 members of the Expert Advisory Group (EAG) for Diabetes. Thematic analysis was applied to the data using three theories of the policy process as analytical tools. Results: The survey response rate was 44% (n=262). Results suggested care delivery was largely unstructured; 45% of GPs had a diabetes register (n=157), 53% reported using guidelines (n=140), 30% had a formal call-recall system (n=78) and 24% had none of these organisational features (n=62). Only 10% of GPs had a formal shared protocol with the local hospital specialist diabetes team (n=26). The lack of coordination between settings was identified as a major barrier to providing optimal care, leading to waiting times, overburdened hospitals and avoidable duplication. The lack of remuneration for chronic disease management had a ripple effect, also creating costs for patients and apathy among GPs. There was also a sense of inertia around quality improvement activities, particularly at a national level. This attitude was strongly influenced by previous experiences of change in the health system. In contrast, GPs spoke positively about change at a local level, which was facilitated by a practice ethos, leadership and a special interest in diabetes. The second quantitative study found that practices with a special interest in diabetes achieved a standard of care comparable to the UK in terms of the recording of clinical processes of care and the achievement of clinical targets; 35% of patients reached the HbA1c target of <6.5% compared to 26% in England and Wales. With regard to diabetes policy formulation, the evolving process of action and inaction was best described by the Multiple Streams Theory. Within the EAG, the formulation of recommendations was facilitated by overarching agreement on the “obvious” priorities, while the details of proposals were influenced by personal preferences and local capacity. In contrast, the national decision-making process was protracted and ambiguous. The lack of impetus from senior management, coupled with the lack of power conferred on the EAG, impeded progress. Conclusions: The findings highlight the inconsistency of diabetes care in Ireland. The main barriers to optimal diabetes management centre on the organisation and coordination of care at the systems level, with consequences for practice, providers and patients. Quality improvement initiatives need to stimulate a sense of ownership and interest among frontline service providers to address the local sense of inertia towards national change. To date, quality improvement in diabetes care has been largely dependent on the “special interest” of professionals. The challenge for the Irish health system is to embed this activity as part of routine practice, professional responsibility and the underlying health care culture.