81 results for leave to deliver interrogatories


Relevance: 100.00%

Abstract:

Objective The colonic microbiota ferment dietary fibres, producing short chain fatty acids. Recent evidence suggests that the short chain fatty acid propionate may play an important role in appetite regulation. We hypothesised that colonic delivery of propionate would increase peptide YY (PYY) and glucagon-like peptide-1 (GLP-1) secretion in humans, and reduce energy intake and weight gain in overweight adults. Design To investigate whether propionate promotes PYY and GLP-1 secretion, a primary cultured human colonic cell model was developed. To deliver propionate specifically to the colon, we developed a novel inulin-propionate ester. An acute randomised, controlled cross-over study was used to assess the effects of this inulin-propionate ester on energy intake and plasma PYY and GLP-1 concentrations. The long-term effects of inulin-propionate ester on weight gain were subsequently assessed in a randomised, controlled 24-week study involving 60 overweight adults. Results Propionate significantly stimulated the release of PYY and GLP-1 from human colonic cells. Acute ingestion of 10 g inulin-propionate ester significantly increased postprandial plasma PYY and GLP-1 and reduced energy intake. Over 24 weeks, 10 g/day inulin-propionate ester supplementation significantly reduced weight gain, intra-abdominal adipose tissue distribution and intrahepatocellular lipid content, and prevented the deterioration in insulin sensitivity observed in the inulin-control group. Conclusions These data demonstrate for the first time that increasing colonic propionate prevents weight gain in overweight adult humans.

Relevance: 100.00%

Abstract:

Wind generation's contribution to supporting peak electricity demand is one of the key questions in wind integration studies. Unlike conventional units, the available outputs of different wind farms cannot be approximated as being statistically independent, and hence near-zero wind output is possible across an entire power system. This paper reviews the risk model structures currently used to assess wind's capacity value, along with a discussion of the resulting data requirements. A central theme is the benefit of statistical estimation of the joint distribution for demand and available wind capacity, focusing attention on the uncertainties due to limited histories of wind and demand data; examination of Great Britain data from the last 25 years shows that the data requirements are greater than generally thought. A discussion is therefore presented of how analysis of the types of weather system which have historically driven extreme electricity demands can help to deliver robust insights into wind's contribution to supporting demand, even in the face of such data limitations. The role of the form of the probability distribution for available conventional capacity in driving wind capacity credit results is also discussed.
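
To make the risk-model structure concrete, the following is a minimal sketch in Python (a toy fleet and synthetic demand/wind history, not the paper's model or the Great Britain data). It estimates loss-of-load expectation from paired hourly demand and available-wind samples, so their joint distribution is respected, and then finds wind's effective load-carrying capability, one common definition of capacity value, by bisection.

    # Minimal sketch (illustrative numbers, not the paper's model): capacity value
    # of wind as effective load-carrying capability (ELCC), estimated from PAIRED
    # hourly demand and available-wind samples so the joint distribution is kept.
    import numpy as np

    rng = np.random.default_rng(0)
    installed_wind = 10_000.0  # MW, assumed

    # Toy joint history (MW); a real study would use measured, time-aligned data.
    demand = rng.normal(45_000, 6_000, size=8760).clip(min=20_000)
    wind = rng.gamma(shape=1.5, scale=2_500, size=8760).clip(max=installed_wind)

    # Conventional fleet: independent two-state units (capacity MW, availability).
    units_cap = np.array([600.0] * 70 + [400.0] * 50)
    units_avail = np.full(units_cap.shape, 0.90)

    # One Monte Carlo draw of available conventional capacity per hour,
    # reused (common random numbers) for every risk evaluation below.
    conv = (rng.random((demand.size, units_cap.size)) < units_avail).astype(float) @ units_cap

    def lole(net_demand):
        """Hours per year in which available conventional capacity falls short."""
        return int(np.sum(conv < net_demand))

    base_risk = lole(demand)  # risk of the system without wind

    # ELCC: the constant extra demand the system can serve once wind is added,
    # while keeping the same risk as the no-wind base case (found by bisection).
    lo, hi = 0.0, 3.0 * float(wind.mean())
    while hi - lo > 10.0:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if lole(demand + mid - wind) <= base_risk else (lo, mid)

    print(f"ELCC ~ {lo:.0f} MW ({lo / installed_wind:.0%} of installed wind capacity)")

The data limitation discussed in the abstract enters exactly here: with only a few years of paired observations, the tail of the joint demand-wind distribution, and hence the estimated capacity value, is poorly constrained.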

Relevance: 100.00%

Abstract:

A university degree is effectively a prerequisite for entering the archaeological workforce in the UK. Archaeological employers consider that new entrants to the profession are insufficiently skilled, and hold university training to blame. But university archaeology departments do not consider it their responsibility to deliver fully formed archaeological professionals; rather, their role is to provide an education that can then be applied in different workplaces, within and outside archaeology. The number of individuals studying archaeology at university exceeds the total number working in professional practice, with many more new graduates emerging than archaeological jobs advertised annually. Over-supply of practitioners is also a contributing factor to low pay in archaeology. Steps are being taken to provide opportunities for vocational training, both within and outside the university system, but archaeological training and education within the universities, and subsequently the archaeological labour market, may be adversely affected by the introduction of variable top-up student fees.

Relevance: 100.00%

Abstract:

More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data- and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06)—since emulated internationally—pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening of access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.

Relevance: 100.00%

Abstract:

Climate science is coming under increasing pressure to deliver projections of future climate change at spatial scales as small as a few kilometres for use in impacts studies. But is our understanding and modelling of the climate system advanced enough to offer such predictions? Here we focus on the Atlantic–European sector, and on the effects of greenhouse gas forcing on the atmospheric and, to a lesser extent, oceanic circulations. We review the dynamical processes which shape European climate and then consider how each of these leads to uncertainty in the future climate. European climate is unique in many regards, and as such it poses a unique challenge for climate prediction. Future European climate must be considered particularly uncertain because (i) the spread between the predictions of current climate models is still considerable and (ii) Europe is particularly strongly affected by several processes which are known to be poorly represented in current models.

Relevance: 100.00%

Abstract:

Background and Purpose-Clinical research into the treatment of acute stroke is complicated, is costly, and has often been unsuccessful. Developments in imaging technology based on computed tomography and magnetic resonance imaging scans offer opportunities for screening experimental therapies during phase II testing so as to deliver only the most promising interventions to phase III. We discuss the design and the appropriate sample size for phase II studies in stroke based on lesion volume. Methods-Determination of the relation between analyses of lesion volumes and of neurologic outcomes is illustrated using data from placebo trial patients from the Virtual International Stroke Trials Archive. The size of an effect on lesion volume that would lead to a clinically relevant treatment effect in terms of a measure, such as modified Rankin score (mRS), is found. The sample size to detect that magnitude of effect on lesion volume is then calculated. Simulation is used to evaluate different criteria for proceeding from phase II to phase III. Results-The odds ratios for mRS correspond roughly to the square root of odds ratios for lesion volume, implying that for equivalent power specifications, sample sizes based on lesion volumes should be about one fourth of those based on mRS. Relaxation of power requirements, appropriate for phase II, leads to further sample size reductions. For example, a phase III trial comparing a novel treatment with placebo, with a total sample size of 1518 patients, might be motivated from a phase II trial of 126 patients comparing the same 2 treatment arms. Discussion-Definitive phase III trials in stroke should aim to demonstrate significant effects of treatment on clinical outcomes. However, more direct outcomes such as lesion volume can be useful in phase II for determining whether such phase III trials should be undertaken in the first place. (Stroke. 2009;40:1347-1352.)
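
As a minimal sketch of the arithmetic behind the "about one fourth" claim (illustrative constants, not the paper's calculation): under the usual normal approximation the required sample size scales as one over the squared log odds ratio, so squaring the odds ratio, i.e. doubling its logarithm, cuts the required sample size by a factor of about four; the further drop to 126 patients reflects the relaxed error rates appropriate for phase II.

    # Minimal sketch (illustrative assumptions, not the paper's calculation):
    # required sample size scales as 1/(log OR)^2, so if OR_volume ~ OR_mRS^2
    # (log odds ratio doubled), a lesion-volume endpoint needs ~1/4 the patients.
    from math import log
    from statistics import NormalDist

    def n_total(log_or, alpha=0.05, power=0.90, var_factor=12.0):
        # Generic two-arm form: n ~ (z_{1-alpha/2} + z_power)^2 * variance / (log OR)^2.
        # var_factor is a placeholder for the endpoint-specific variance term.
        z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
        return var_factor * z ** 2 / log_or ** 2

    or_mrs = 1.3          # hypothetical clinically relevant effect on mRS
    or_vol = or_mrs ** 2  # implied effect on lesion volume (square-root relation)

    print(round(n_total(log(or_vol)) / n_total(log(or_mrs)), 2))  # 0.25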

Relevance: 100.00%

Abstract:

A new approach employing metal particles in micelles for the hydrogenation of organic molecules, in the presence of a fluorinated surfactant and water in supercritical carbon dioxide, has recently been introduced. This approach is claimed to deliver many advantages for carrying out catalysis, including the use of supercritical carbon dioxide (scCO2) as a greener solvent. Following this preliminary account, the present work aims to provide direct visual evidence of the formation of metal microemulsions and to investigate whether metal located in the soft micellar assemblies could affect reaction selectivity. Synthesis of Pd nanoparticles in perfluorohydrocarboxylate anionic micelles in scCO2 is therefore carried out in a stainless steel batch reactor at 40 °C in a 150 bar CO2/H2 mixture. Homogeneous dispersion of the microemulsion containing Pd nanoparticles in scCO2 is observed through a sapphire-window reactor at W0 ratios (molar water-to-surfactant ratios) ranging from 2 to 30. It is also shown that the use of micelle assemblies as new metal catalyst nanocarriers can indeed exert a great influence on product selectivity. The hydrogenation of citral, a molecule that contains three reducible groups (an aldehyde and double bonds at the 2,3- and 6,7-positions), is studied. An unusually high selectivity toward citronellal (a high regioselectivity toward reduction of the 2,3-unsaturation) is observed in supercritical carbon dioxide. On the other hand, when the catalysis is carried out in the conventional liquid or vapor phase over the same reaction time, total hydrogenation of the two double bonds is achieved. It is thought that the high kinetic reluctance for hydrogenation of the double bond at the hydrophobic end of the citral molecule (the 6,7-position) is due to the unique micelle environment in close proximity to the metal surface in supercritical carbon dioxide, which guides a head-on attack of the molecule toward the core metal particle.

Relevance: 100.00%

Abstract:

The main objectives of this paper are: firstly, to identify key issues related to sustainable intelligent buildings (environmental, social, economic and technological factors) and to develop a conceptual model for the selection of appropriate KPIs; secondly, to critically test stakeholders' perceptions and values of selected KPIs for intelligent buildings; and thirdly, to develop a new model for measuring the level of sustainability of intelligent buildings. This paper uses a consensus-based model (Sustainable Built Environment Tool, SuBETool), which is analysed using the analytic hierarchy process (AHP) for multi-criteria decision-making. The use of the multi-attribute model for priority setting in the sustainability assessment of intelligent buildings is introduced. The paper commences by reviewing the literature on sustainable intelligent buildings and presents a pilot study investigating the problems of complexity and subjectivity. This study is based upon a survey of perceptions held by selected stakeholders and the value they attribute to selected KPIs. It is argued that the benefit of the newly proposed model (SuBETool) is as a ‘tool’ for ‘comparative’ rather than absolute measurement. It has the potential to provide useful lessons from current sustainability assessment methods for the strategic future of sustainable intelligent buildings, in order to improve a building's performance and to deliver objective outcomes. The findings of this survey enrich the field of intelligent buildings in two ways. Firstly, they give a detailed insight into the selection of sustainable building indicators, as well as their degree of importance. Secondly, they critically test stakeholders' perceptions and values of selected KPIs for intelligent buildings. It is concluded that the priority levels for the selected criteria are largely dependent on the integrated design team, which includes the client, architects, engineers and facilities managers.
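
Since the abstract does not reproduce the SuBETool weighting itself, the following is a hedged sketch of the standard AHP priority-setting step that such a multi-attribute model relies on; the criteria names and pairwise judgements are illustrative, not the paper's survey data.

    # Hedged sketch of AHP priority setting (illustrative judgements, not
    # SuBETool's data): priorities from the principal eigenvector of a pairwise
    # comparison matrix, plus Saaty's consistency ratio.
    import numpy as np

    def ahp_weights(pairwise):
        """Return (priority weights, consistency ratio) for a reciprocal AHP matrix."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = int(np.argmax(eigvals.real))
        w = np.abs(eigvecs[:, k].real)
        w = w / w.sum()                               # normalised priorities
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)          # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)  # Saaty's random index
        return w, ci / ri                             # CR below ~0.1 is usually acceptable

    # Four top-level KPI groups compared pairwise on Saaty's 1-9 scale (assumed values).
    criteria = ["environmental", "social", "economic", "technological"]
    judgements = [[1,   3,   2,   4],
                  [1/3, 1,   1/2, 2],
                  [1/2, 2,   1,   3],
                  [1/4, 1/2, 1/3, 1]]

    weights, cr = ahp_weights(judgements)
    print(dict(zip(criteria, np.round(weights, 3))), "CR =", round(cr, 3))

In a stakeholder survey of the kind described, each stakeholder group would supply its own judgement matrix, and the resulting weights would feed a comparative rather than absolute score.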

Relevance: 100.00%

Abstract:

The built environment in which health and social care is delivered can have an impact on the efficiency and outcomes of care processes. The health-care estate is large and growing and is expensive to build, adapt and maintain. The design of these buildings is a complex, difficult and political process. Better use of care pathways as an input to the design and use of the built environment has the potential to deliver significant benefits. A number of variations on the idea of care pathways are already used in designing health-care buildings but this is under-researched. This paper provides a framework for thinking about care pathways and the health-care built environment. The framework distinguishes between five different pathway ‘types’ defined for the purpose of understanding the relationship between pathways and infrastructure. The five types are: ‘care pathways’, ‘integrated care pathways’, ‘patient pathways’, ‘patient journeys’ and ‘patient flows’. The built environment implications of each type are discussed and recommendations made for those involved in either building development or care pathway projects.

Relevance: 100.00%

Abstract:

Geological carbon dioxide storage (CCS) has the potential to make a significant contribution to the decarbonisation of the UK. Amid concerns over maintaining security, and hence diversity, of supply, CCS could allow the continued use of coal, oil and gas whilst avoiding the CO2 emissions currently associated with fossil fuel use. This project has explored some of the geological, environmental, technical, economic and social implications of this technology. The UK is well placed to exploit CCS with a large offshore storage capacity, both in disused oil and gas fields and saline aquifers. This capacity should be sufficient to store CO2 from the power sector (at current levels) for at least one century, using well-understood, and therefore likely lower-risk, depleted hydrocarbon fields and contained parts of aquifers. It is very difficult to produce reliable estimates of the (potentially much larger) storage capacity of the less well understood geological reservoirs such as non-confined parts of aquifers. With the majority of its large coal-fired power stations due to be retired during the next 15 to 20 years, the UK is at a natural decision point with respect to the future of power generation from coal; the existence of both national reserves and the infrastructure for receiving imported coal makes clean coal technology a realistic option. The notion of CCS as a ‘bridging’ or ‘stop-gap’ technology (i.e. whilst we develop ‘genuinely’ sustainable renewable energy technologies) needs to be examined somewhat critically, especially given the scale of global coal reserves. If CCS plant is built, then it is likely that technological innovation will bring down the costs of CO2 capture, such that it could become increasingly attractive. As with any capital-intensive option, there is a danger of becoming ‘locked-in’ to a CCS system. The costs of CCS in our model for UK power stations in the East Midlands and Yorkshire to reservoirs in the North Sea are between £25 and £60 per tonne of CO2 captured, transported and stored. This is between about 2 and 4 times the current traded price of a tonne of CO2 in the EU Emissions Trading Scheme. In addition to the technical and economic requirements of the CCS technology, it should also be socially and environmentally acceptable. Our research has shown that, given an acceptance of the severity and urgency of addressing climate change, CCS is viewed favourably by members of the public, provided it is adopted within a portfolio of other measures. The most commonly voiced concern from the public is that of leakage, and this remains perhaps the greatest uncertainty with CCS. It is not possible to make general statements concerning storage security; assessments must be site specific. The impacts of any potential leakage are also somewhat uncertain but should be balanced against the deleterious effects of increased acidification in the oceans due to uptake of elevated atmospheric CO2 that have already been observed. Provided adequate long-term monitoring can be ensured, any leakage of CO2 from a storage site is likely to have minimal localised impacts as long as leaks are rapidly repaired. A regulatory framework for CCS will need to include risk assessment of potential environmental and health and safety impacts, accounting, monitoring and long-term liability.
In summary, although there remain uncertainties to be resolved through research and demonstration projects, our assessment demonstrates that CCS holds great potential for significant cuts in CO2 emissions as we develop long term alternatives to fossil fuel use. CCS can contribute to reducing emissions of CO2 into the atmosphere in the near term (i.e. peak-shaving the future atmospheric concentration of CO2), with the potential to continue to deliver significant CO2 reductions over the long term.

Relevance: 100.00%

Abstract:

An overtly critical perspective on 're-engineering construction' is presented. It is contended that re-engineering is impossible to define in terms of its substantive content and is best understood as a rhetorical label. In recent years, the language of re-engineering has heavily shaped the construction research agenda. The declared goals are to lower costs and improve value for the customer. The discourse is persuasive because it reflects the ideology of the 'enterprise culture' and the associated rhetoric of customer responsiveness. Re-engineering is especially attractive to the construction industry because it reflects and reinforces the existing dominant way of thinking. The overriding tendency is to reduce organizational complexities to a mechanistic quest for efficiency. Labour is treated as a commodity. Within this context, the objectives of re-engineering become 'common sense'. Knowledge becomes subordinate to the dominant ideology of neo-liberalism. The accepted research agenda for re-engineering construction exacerbates the industry's problems and directly contributes to the casualization of the workforce. The continued adherence to machine metaphors by the construction industry's top management has directly contributed to the 'bad attitudes' and 'adversarial culture' that they repeatedly decry. Supposedly neutral topics such as pre-assembly, partnering, supply chain management and lean thinking serve only to justify the shift towards bogus labour-only subcontracting and the associated reduction of employment rights. The continued casualization of the workforce raises real questions about the industry's future capacity to deliver high-quality construction. In order to appear 'relevant' to the needs of industry, it seems that the research community is doomed to perpetuate this regressive cycle.

Relevance: 100.00%

Abstract:

The inaugural meeting of the International Scientific Association for Probiotics and Prebiotics (ISAPP) was held May 3 to May 5, 2002 in London, Ontario, Canada. A group of 63 academic and industrial scientists from around the world convened to discuss current issues in the science of probiotics and prebiotics. ISAPP is a non-profit organization composed of international scientists whose intent is to strongly support and improve the levels of scientific integrity and due diligence associated with the study, use, and application of probiotics and prebiotics. In addition, ISAPP values its role in facilitating communication with the public and healthcare providers and among scientists in related fields on all topics pertinent to probiotics and prebiotics. It is anticipated that such efforts will lead to development of approaches and products that are optimally designed for the improvement of human and animal health and well being. This article is a summary of the discussions, conclusions, and recommendations made by 8 working groups convened during the first ISAPP workshop focusing on the topics of: definitions, intestinal flora, extra-intestinal sites, immune function, intestinal disease, cancer, genetics and genomics, and second generation prebiotics. Humans have evolved in symbiosis with an estimated 10^14 resident microorganisms. However, as medicine has widely defined and explored the perpetrators of disease, including those of microbial origin, it has paid relatively little attention to the microbial cells that constitute the most abundant life forms associated with our body. Microbial metabolism in humans and animals constitutes an intense biochemical activity in the body, with profound repercussions for health and disease. As understanding of the human genome constantly expands, an important opportunity will arise to better determine the relationship between microbial populations within the body and host factors (including gender, genetic background, and nutrition) and the concomitant implications for health and improved quality of life. Combined human and microbial genetic studies will determine how such interactions can affect human health and longevity, which communication systems are used, and how they can be influenced to benefit the host. Probiotics are defined as live microorganisms which, when administered in adequate amounts, confer a health benefit on the host.1 The probiotic concept dates back over 100 years, but only in recent times have the scientific knowledge and tools become available to properly evaluate their effects on normal health and well being, and their potential in preventing and treating disease. A similar situation exists for prebiotics, defined by this group as non-digestible substances that provide a beneficial physiological effect on the host by selectively stimulating the favorable growth or activity of a limited number of indigenous bacteria. Prebiotics function complementarily to, and possibly synergistically with, probiotics. Numerous studies are providing insights into the growth and metabolic influence of these microbial nutrients on health. Today, the science behind the function of probiotics and prebiotics still requires more stringent deciphering both scientifically and mechanistically. The explosion of publications and interest in probiotics and prebiotics has resulted in a body of collective research that points toward great promise.
However, this research is spread among such a diversity of organisms, delivery vehicles (foods, pills, and supplements), and potential health targets that general conclusions cannot easily be made. Nevertheless, this situation is rapidly changing on a number of important fronts. With progress over the past decade on the genetics of lactic acid bacteria and the recent 2,3 and pending 4 release of complete genome sequences for major probiotic species, the field is now armed with detailed information and sophisticated microbiological and bioinformatic tools. Similarly, advances in biotechnology could yield new probiotics and prebiotics designed for enhanced or expanded functionality. The incorporation of genetic tools within a multidisciplinary scientific platform is expected to reveal the contributions of commensals, probiotics, and prebiotics to general health and well being and explicitly identify the mechanisms and corresponding host responses that provide the basis for their positive roles and associated claims. In terms of human suffering, the need for effective new approaches to prevent and treat disease is paramount. The need exists not only to alleviate the significant mortality and morbidity caused by intestinal diseases worldwide (especially diarrheal diseases in children), but also for infections at non-intestinal sites. This is especially worthy of pursuit in developing nations where mortality is too often the outcome of food- and water-borne infection. Inasmuch as probiotics and prebiotics are able to influence the populations or activities of commensal microflora, there is evidence that they can also play a role in mitigating some diseases.5,6 Preliminary support is emerging that probiotics and prebiotics may be useful as interventions in conditions including inflammatory bowel disease, irritable bowel syndrome, allergy, cancer (especially colorectal cancer, 75% of which is associated with diet), vaginal and urinary tract infections in women, kidney stone disease, mineral absorption, and infections caused by Helicobacter pylori. Some metabolites of microbes in the gut may also impact systemic conditions ranging from coronary heart disease to cognitive function, suggesting the possibility that exogenously applied microbes in the form of probiotics, or alteration of gut microecology with prebiotics, may be useful interventions even in these apparently disparate conditions. Beyond these direct intervention targets, probiotic cultures can also serve in expanded roles as live vehicles to deliver biologic agents (vaccines, enzymes, and proteins) to targeted locations within the body. The economic impact of these disease conditions in terms of diagnosis, treatment, doctor and hospital visits, and time off work exceeds several hundred billion dollars. The quality of life impact is also of major concern. Probiotics and prebiotics offer plausible opportunities to reduce the morbidity associated with these conditions. The following addresses issues that emerged from the 8 workshops (Definitions, Intestinal Flora, Extra-Intestinal Sites, Immune Function, Intestinal Disease, Cancer, Genomics, and Second Generation Prebiotics), reflecting the current scientific state of probiotics and prebiotics.
This is not a comprehensive review; rather, it emphasizes pivotal knowledge gaps, and recommendations are made as to the underlying scientific and multidisciplinary studies that will be required to advance our understanding of the roles and impact of prebiotics, probiotics, and the commensal microflora upon health and disease management.

Relevance: 100.00%

Abstract:

Excessive exposure to UV light initiates melanoma in the skin. Tumour-specific enzymes are hijacked to deliver anticancer drugs.

Relevance: 100.00%

Abstract:

Using liposomes to deliver drugs to and through human skin is controversial, as their function varies with type and composition. Thus they may act as drug carriers, controlling release of the medicinal agent. Alternatively, they may provide a localized depot in the skin, thereby minimizing systemic effects, or they can be used to target delivery to skin appendages (hair follicles and sweat glands). Liposomes may also enhance transdermal drug delivery, increasing systemic drug concentrations. With such a multiplicity of functions, it is not surprising that the mechanisms of liposomal delivery of therapeutic agents to and through the skin are unclear. Accordingly, this article provides an overview of the modes and mechanisms of action of different vesicles as drug delivery vectors in human skin. Our conclusion is that vesicles, depending on the composition and method of preparation, can vary with respect to size, lamellarity, charge, membrane fluidity or elasticity and drug entrapment. This variability allows for multiple functions ranging from local to transdermal effects. Application to dissimilar skins (animal or human) via diverse protocols may reveal different mechanisms of action, with vesicle skin penetration possibly reaching different depths, from surface assimilation to (rarely) the viable tissue and subsequent systemic absorption.