17 results for Explosion
in CentAUR: Central Archive University of Reading - UK
Abstract:
AEA Technology has provided an assessment of the probability of α-mode containment failure for the Sizewell B PWR. After a preliminary review of the available methodologies it was decided to use the probabilistic approach described in the paper, based on an extension of the methodology developed by Theofanous et al. (Nucl. Sci. Eng. 97 (1987) 259–325). The input to the assessment comprises 12 probability distributions; the bases for the quantification of these distributions are discussed. The α-mode assessment performed for the Sizewell B PWR has demonstrated the practicality of the event-tree method with input data represented by probability distributions. The assessment itself has drawn attention to a number of topics, which may be plant- and sequence-dependent, and has indicated the importance of melt relocation scenarios. The α-mode failure probability following an accident that leads to core melt relocation to the lower head for the Sizewell B PWR has been assessed as a few parts in 10 000, on the basis of current information. This assessment has been the first to consider elevated pressures (6 MPa and 15 MPa) besides atmospheric pressure, but the results suggest only a modest sensitivity to system pressure.
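The core of such an assessment is Monte Carlo propagation of uncertain branch probabilities through an event tree. The following is a toy sketch of that idea only, not the Sizewell B model: the two branch events, their beta distributions, and all parameters are hypothetical (the real assessment used 12 input distributions).

```python
import random

random.seed(0)

def sample_failure_probability(n_trials=100_000):
    """Propagate uncertain branch probabilities through a toy
    two-branch event tree by Monte Carlo sampling."""
    outcomes = []
    for _ in range(n_trials):
        # Hypothetical input distributions for two conditional events:
        p_energetic_interaction = random.betavariate(2, 50)   # assumed
        p_vessel_failure = random.betavariate(1, 200)         # assumed
        # Overall failure requires both branches in sequence:
        outcomes.append(p_energetic_interaction * p_vessel_failure)
    outcomes.sort()
    mean = sum(outcomes) / n_trials
    p95 = outcomes[int(0.95 * n_trials)]
    return mean, p95

mean, p95 = sample_failure_probability()
print(f"mean {mean:.2e}, 95th percentile {p95:.2e}")
```

The output is a distribution over the end-state probability rather than a point value, which is what allows statements such as "a few parts in 10 000" to be accompanied by an uncertainty band.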
Abstract:
Various methods of assessment have been applied to the One Dimensional Time to Explosion (ODTX) apparatus and experiments with the aim of allowing an estimate of the comparative violence of the explosion event to be made. The non-mechanical methods used were a simple visual inspection, measuring the increase in the void volume of the anvils following an explosion, and measuring the velocity of the sound produced by the explosion over 1 metre. The mechanical methods used included monitoring piezo-electric devices inserted in the frame of the machine and measuring the rotational velocity of a rotating bar placed on top of the anvils after it had been displaced by the shock wave. This last method, which resembles the original Hopkinson Bar experiments, seemed the easiest to apply and analyse, giving relative rankings of violence and the possibility of calculating a "detonation" pressure.
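A pressure estimate from a spun-up bar can be sketched with elementary rigid-body mechanics. This is a hedged illustration only, not the paper's method: the bar dimensions, contact area, pulse duration, and the assumption of a tangential impulse at the bar surface are all invented for the example.

```python
def pressure_from_rotation(omega, mass, radius, contact_area, pulse_time):
    """Solid cylinder spun about its axis by a tangential impulse J at
    its surface: I * omega = J * radius, with J = P * A * t, so
    P = I * omega / (radius * A * t). All quantities in SI units."""
    moment_of_inertia = 0.5 * mass * radius**2    # solid cylinder about axis
    impulse = moment_of_inertia * omega / radius  # N*s delivered by the shock
    return impulse / (contact_area * pulse_time)  # Pa

# Hypothetical numbers: a 1 kg bar of 10 mm radius, 1 cm^2 contact area,
# a 10-microsecond pulse, and a measured 50 rad/s after the event.
p = pressure_from_rotation(omega=50.0, mass=1.0, radius=0.01,
                           contact_area=1e-4, pulse_time=1e-5)
print(f"{p / 1e6:.0f} MPa")
```

The attraction of the method in the abstract is visible here: the only measured quantity is the rotational velocity, and everything else is fixed geometry.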
Abstract:
A One-Dimensional Time to Explosion (ODTX) apparatus has been used to study the times to explosion of a number of compositions based on RDX and HMX over a range of contact temperatures. The times to explosion at any given temperature tend to increase from RDX to HMX and with the proportion of HMX in the composition. Thermal ignition theory has been applied to the time to explosion data to calculate kinetic parameters. The apparent activation energy for all of the compositions lay between 127 kJ mol−1 and 146 kJ mol−1. There were, however, large differences in the pre-exponential factor, and it was this factor, rather than the activation energy, that controlled the time to explosion.
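The conclusion that the pre-exponential factor dominates can be illustrated with a simple Arrhenius-type induction time, t ∝ (1/A)·exp(E/RT). This is a proportionality sketch under assumed values, not the paper's fitted kinetics; the full thermal-ignition prefactors are omitted and the two "compositions" below are hypothetical.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def time_to_explosion(A, E, T):
    """Arrhenius-type induction time, t ~ (1/A) * exp(E / (R*T)).
    A in s^-1, E in J/mol, T in K; relative values only."""
    return (1.0 / A) * math.exp(E / (R * T))

T = 500.0  # K, an assumed contact temperature
# Hypothetical compositions with activation energies inside the reported
# 127-146 kJ/mol band but pre-exponential factors differing by 100x:
t_fast = time_to_explosion(A=1e16, E=130e3, T=T)
t_slow = time_to_explosion(A=1e14, E=135e3, T=T)
print(t_slow / t_fast)
```

With E nearly fixed across the band, the exponential terms differ only modestly, so the ratio of times is dominated by the ratio of pre-exponential factors, consistent with the abstract's conclusion.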
Abstract:
The storage and processing capacity realised by computing has led to an explosion of data retention. We have now reached the point of information overload and must begin to use computers to process more complex information. In particular, the proposition of the Semantic Web has given structure to this problem, but has yet to be realised in practice. The largest of its problems is that of ontology construction; without a suitable automatic method, most ontologies will have to be encoded by hand. In this paper we discuss the current methods for semi- and fully-automatic construction and their current shortcomings. In particular, we pay attention to the application of ontologies to products and the practical application of those ontologies.
Abstract:
The recent explosion of interest in the archaeology of warfare is examined, and some possible reasons behind this trend are explored. Characteristics in the archaeology of warfare are identified in relation to prehistoric and historical archaeology and their contrasting sources of evidence. The androcentric tendency of the archaeology of warfare is discussed, and the major themes of the volume are introduced, including memorial landscapes, commemorative monuments and their conflicting meanings, and the social context of warfare.
Abstract:
Fundamental nutrition seeks to describe the complex biochemical reactions involved in assimilation and processing of nutrients by various tissues and organs, and to quantify nutrient movement (flux) through those processes. Over the last 25 yr, considerable progress has been made in increasing our understanding of metabolism in dairy cattle. Major advances have been made at all levels of biological organization, including the whole animal, organ systems, tissues, cells, and molecules. At the whole-animal level, progress has been made in delineating metabolism during late pregnancy and the transition to lactation, as well as in whole-body use of energy-yielding substrates and amino acids for growth in young calves. An explosion of research using multicatheterization techniques has led to better quantitative descriptions of nutrient use by tissues of the portal-drained viscera (digestive tract, pancreas, and associated adipose tissues) and liver. Isolated tissue preparations have provided important information on the interrelationships among glucose, fatty acid, and amino acid metabolism in liver, adipose tissue, and mammary gland, as well as the regulation of these pathways during different physiological states. Finally, the last 25 yr has witnessed the birth of "molecular biology" approaches to understanding fundamental nutrition. Although measurements of mRNA abundance for proteins of interest already have provided new insights into regulation of metabolism, the next 25 yr will likely see remarkable advances as these techniques continue to be applied to problems of dairy cattle biology. Integration of the "omics" technologies (functional genomics, proteomics, and metabolomics) with measurements of tissue metabolism obtained by other methods is a particularly exciting prospect for the future. 
The result should be improved animal health and well being, more efficient dairy production, and better models to predict nutritional requirements and provide rations to meet those requirements.
Abstract:
We are experiencing an explosion of knowledge with relevance to conserving biodiversity and protecting the environment necessary to sustain life on earth. Many science disciplines are involved in generating this new knowledge, and real progress can be made when scientists collaborate across disciplines to generate both macro- and micro-environmental knowledge and then communicate and interact with specialists in sociology, economics and public policy. An important requirement is that the often complex scientific concepts and their voluminous supporting data are managed in such ways as to make them accessible across the many specializations involved. Horticultural science has much to contribute to the knowledge base for environmental conservation. While it seems that production horticulture has been slow to embrace knowledge and concepts that would reduce the heavy reliance on agricultural chemicals and the use of peat as a growing medium, and lead to more sustainable use of water and other resources, environmental horticulture is providing valuable opportunities to rescue or protect endangered species, educate the public about plants and plant science, and demonstrate environmental stewardship and sustainable production practices. Likewise, social horticulture is drawing attention to the many contributions of horticultural foods and parks and gardens to human health and welfare. Overall, horticulture has a vital role to play in integrating knowledge from other scientific, social, economic and political disciplines.
Abstract:
Nutrigenetics and personalised nutrition are components of the concept that in the future genotyping will be used as a means of defining dietary recommendations to suit the individual. Over the last two decades there has been an explosion of research in this area, with often conflicting findings reported in the literature. Reviews of the literature in the area of apoE genotype and cardiovascular health, apoA5 genotype and postprandial lipaemia and perilipin and adiposity are used to demonstrate the complexities of genotype-phenotype associations and the aetiology of apparent between-study inconsistencies in the significance and size of effects. Furthermore, genetic research currently often takes a very reductionist approach, examining the interactions between individual genotypes and individual disease biomarkers and how they are modified by isolated dietary components or foods. Each individual possesses potentially hundreds of 'at-risk' gene variants and consumes a highly-complex diet. In order for nutrigenetics to become a useful public health tool, there is a great need to use mathematical and bioinformatic tools to develop strategies to examine the combined impact of multiple gene variants on a range of health outcomes and establish how these associations can be modified using combined dietary strategies.
Abstract:
The inaugural meeting of the International Scientific Association for Probiotics and Prebiotics (ISAPP) was held May 3 to May 5, 2002 in London, Ontario, Canada. A group of 63 academic and industrial scientists from around the world convened to discuss current issues in the science of probiotics and prebiotics. ISAPP is a non-profit organization comprised of international scientists whose intent is to strongly support and improve the levels of scientific integrity and due diligence associated with the study, use, and application of probiotics and prebiotics. In addition, ISAPP values its role in facilitating communication with the public and healthcare providers and among scientists in related fields on all topics pertinent to probiotics and prebiotics. It is anticipated that such efforts will lead to development of approaches and products that are optimally designed for the improvement of human and animal health and well being. This article is a summary of the discussions, conclusions, and recommendations made by 8 working groups convened during the first ISAPP workshop focusing on the topics of: definitions, intestinal flora, extra-intestinal sites, immune function, intestinal disease, cancer, genetics and genomics, and second generation prebiotics. Humans have evolved in symbiosis with an estimated 10^14 resident microorganisms. However, as medicine has widely defined and explored the perpetrators of disease, including those of microbial origin, it has paid relatively little attention to the microbial cells that constitute the most abundant life forms associated with our body. Microbial metabolism in humans and animals constitutes an intense biochemical activity in the body, with profound repercussions for health and disease.
As understanding of the human genome constantly expands, an important opportunity will arise to better determine the relationship between microbial populations within the body and host factors (including gender, genetic background, and nutrition) and the concomitant implications for health and improved quality of life. Combined human and microbial genetic studies will determine how such interactions can affect human health and longevity, which communication systems are used, and how they can be influenced to benefit the host. Probiotics are defined as live microorganisms which, when administered in adequate amounts, confer a health benefit on the host [1]. The probiotic concept dates back over 100 years, but only in recent times have the scientific knowledge and tools become available to properly evaluate their effects on normal health and well being, and their potential in preventing and treating disease. A similar situation exists for prebiotics, defined by this group as non-digestible substances that provide a beneficial physiological effect on the host by selectively stimulating the favorable growth or activity of a limited number of indigenous bacteria. Prebiotics function complementarily to, and possibly synergistically with, probiotics. Numerous studies are providing insights into the growth and metabolic influence of these microbial nutrients on health. Today, the science behind the function of probiotics and prebiotics still requires more stringent deciphering, both scientifically and mechanistically. The explosion of publications and interest in probiotics and prebiotics has resulted in a body of collective research that points toward great promise. However, this research is spread among such a diversity of organisms, delivery vehicles (foods, pills, and supplements), and potential health targets that general conclusions cannot easily be made. Nevertheless, this situation is rapidly changing on a number of important fronts.
With progress over the past decade on the genetics of lactic acid bacteria and the recent [2,3] and pending [4] release of complete genome sequences for major probiotic species, the field is now armed with detailed information and sophisticated microbiological and bioinformatic tools. Similarly, advances in biotechnology could yield new probiotics and prebiotics designed for enhanced or expanded functionality. The incorporation of genetic tools within a multidisciplinary scientific platform is expected to reveal the contributions of commensals, probiotics, and prebiotics to general health and well being and explicitly identify the mechanisms and corresponding host responses that provide the basis for their positive roles and associated claims. In terms of human suffering, the need for effective new approaches to prevent and treat disease is paramount. The need exists not only to alleviate the significant mortality and morbidity caused by intestinal diseases worldwide (especially diarrheal diseases in children), but also for infections at non-intestinal sites. This is especially worthy of pursuit in developing nations where mortality is too often the outcome of food- and water-borne infection. Inasmuch as probiotics and prebiotics are able to influence the populations or activities of the commensal microflora, there is evidence that they can also play a role in mitigating some diseases [5,6]. Preliminary support is emerging that probiotics and prebiotics may be useful as interventions in conditions including inflammatory bowel disease, irritable bowel syndrome, allergy, cancer (especially colorectal cancer, of which 75% of cases are associated with diet), vaginal and urinary tract infections in women, kidney stone disease, mineral absorption, and infections caused by Helicobacter pylori.
Some metabolites of microbes in the gut may also impact systemic conditions ranging from coronary heart disease to cognitive function, suggesting the possibility that exogenously applied microbes in the form of probiotics, or alteration of gut microecology with prebiotics, may be useful interventions even in these apparently disparate conditions. Beyond these direct intervention targets, probiotic cultures can also serve in expanded roles as live vehicles to deliver biologic agents (vaccines, enzymes, and proteins) to targeted locations within the body. The economic impact of these disease conditions in terms of diagnosis, treatment, doctor and hospital visits, and time off work exceeds several hundred billion dollars. The quality of life impact is also of major concern. Probiotics and prebiotics offer plausible opportunities to reduce the morbidity associated with these conditions. The following addresses issues that emerged from 8 workshops (Definitions, Intestinal Flora, Extra-Intestinal Sites, Immune Function, Intestinal Disease, Cancer, Genomics, and Second Generation Prebiotics), reflecting the current scientific state of probiotics and prebiotics. This is not a comprehensive review; rather, it emphasizes pivotal knowledge gaps, and recommendations are made as to the underlying scientific and multidisciplinary studies that will be required to advance our understanding of the roles and impact of prebiotics, probiotics, and the commensal microflora upon health and disease management.
Abstract:
There are three key driving forces behind the development of Internet Content Management Systems (CMS) - a desire to manage the explosion of content, a desire to provide structure and meaning to content in order to make it accessible, and a desire to work collaboratively to manipulate content in some meaningful way. Yet the traditional CMS has been unable to meet the last of these requirements, often failing to provide sufficient tools for collaboration in a distributed context. Peer-to-Peer (P2P) systems are networks in which every node is an equal participant (whether transmitting data, exchanging content, or invoking services) and there is an absence of any centralised administrative or coordinating authorities. P2P systems are inherently more scalable than equivalent client-server implementations as they tend to use resources at the edge of the network much more effectively. This paper details the rationale and design of a P2P middleware for collaborative content management.
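The equal-participant property can be sketched in a few lines: every node can both hold and serve content, and a query reaches content through peers rather than a central server. This is a minimal illustration of the P2P idea only; the class and method names are hypothetical and not taken from the paper's middleware.

```python
class Peer:
    """A symmetric peer: stores content locally and answers queries,
    with no central coordinator anywhere in the network."""

    def __init__(self, name):
        self.name = name
        self.store = {}        # this peer's local content fragment
        self.neighbours = []   # other equal peers

    def connect(self, other):
        self.neighbours.append(other)
        other.neighbours.append(self)

    def publish(self, key, value):
        self.store[key] = value

    def fetch(self, key, visited=None):
        """Flood the query to neighbours; any peer may answer."""
        visited = visited or set()
        visited.add(self.name)
        if key in self.store:
            return self.store[key]
        for peer in self.neighbours:
            if peer.name not in visited:
                found = peer.fetch(key, visited)
                if found is not None:
                    return found
        return None

a, b, c = Peer("a"), Peer("b"), Peer("c")
a.connect(b); b.connect(c)
c.publish("doc1", "draft text")
print(a.fetch("doc1"))  # content found via equal peers, no server involved
```

Real P2P middleware replaces the naive flooding here with structured routing, but the scalability argument in the abstract rests on exactly this symmetry: storage and serving load live at the edge, on the peers themselves.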
Abstract:
Overseas trained teachers (OTTs) have grown in numbers during the past decade, particularly in London and the South East of England. In this recruitment explosion many OTTs have experienced difficulties. In professional literature, as well as in press coverage, OTTs often become part of a deficit discourse. A small-scale pilot investigation of OTT experience has begun to suggest why OTTs have been successful, as well as the principal challenges they have faced. An important factor in their success was felt to be the quality of support in school from others on the staff. Major challenges included the complexity of the primary curriculum. The argument that globalisation leads to brain drain may be exaggerated. Suggestions for further research are made, which might indicate the positive benefits OTTs can bring to a school.
Abstract:
High-resolution satellite radar observations of erupting volcanoes can yield valuable information on rapidly changing deposits and geomorphology. Using the TerraSAR-X (TSX) radar, with a spatial resolution of about 2 m and a repeat interval of 11 days, we show how a variety of techniques were used to record some of the eruptive history of the Soufriere Hills Volcano, Montserrat, between July 2008 and February 2010. After a 15-month pause in lava dome growth, a vulcanian explosion occurred on 28 July 2008 whose vent was hidden by dense cloud. Using TSX change-difference images, we were able to show the civil authorities that this explosion had not disrupted the dome sufficiently to warrant continued evacuation. Change-difference images also proved to be valuable in mapping new pyroclastic flow deposits: the valley-occupying block-and-ash component tending to increase backscatter and the marginal surge deposits reducing it, with the pattern reversing after the event. By comparing east- and west-looking images acquired 12 hours apart, the deposition of some individual pyroclastic flows can be inferred from change differences. Some of the narrow upper sections of valleys draining the volcano received many tens of metres of rockfall and pyroclastic flow deposits over periods of a few weeks. By measuring the changing shadows cast by these valleys in TSX images, the changing depth of infill by deposits could be estimated. In addition to using the amplitude data from the radar images, we also used their phase information within the InSAR technique to calculate the topography during a period of no surface activity. This enabled areas of transient topography, crucial for directing future flows, to be captured.
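The shadow-to-depth conversion can be sketched with simple plane-wave geometry: a rim of height h above the valley floor casts a radar shadow of ground-range length h·tan(θ), where θ is the incidence angle from vertical, so depth follows from shadow length. This is a hedged first-order sketch assuming a flat rim and steep walls; the numbers below are hypothetical, not Montserrat measurements.

```python
import math

def depth_from_shadow(shadow_length_m, incidence_deg):
    """Depth of a steep-walled valley from the radar shadow its rim
    casts on the floor: depth = shadow_length / tan(incidence angle),
    with the incidence angle measured from vertical."""
    return shadow_length_m / math.tan(math.radians(incidence_deg))

# If deposits shorten the shadow between two acquisitions, the infill
# depth follows from the change in shadow length (hypothetical values):
d_before = depth_from_shadow(shadow_length_m=120.0, incidence_deg=35.0)
d_after = depth_from_shadow(shadow_length_m=90.0, incidence_deg=35.0)
print(f"estimated infill: {d_before - d_after:.1f} m")
```

Because only the shadow length changes between images, the method needs no absolute topographic reference, which is what makes it usable on rapidly filling valleys between InSAR acquisitions.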
Abstract:
It has been known for decades that the metabolic rate of animals scales with body mass with an exponent that is almost always <1, >2/3, and often very close to 3/4. The 3/4 exponent emerges naturally from two models of resource distribution networks, radial explosion and hierarchically branched, which incorporate a minimum of specific details. Both models show that the exponent is 2/3 if velocity of flow remains constant, but can attain a maximum value of 3/4 if velocity scales with its maximum exponent, 1/12. Quarter-power scaling can arise even when there is no underlying fractality. The canonical “fourth dimension” in biological scaling relations can result from matching the velocity of flow through the network to the linear dimension of the terminal “service volume” where resources are consumed. These models have broad applicability for the optimal design of biological and engineered systems where energy, materials, or information are distributed from a single source.
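The exponent arithmetic stated above is simply b = 2/3 + v, where v is the velocity-scaling exponent: v = 0 (constant velocity) gives 2/3, and the maximum v = 1/12 gives 3/4. A minimal check of that relationship, using exact rational arithmetic:

```python
from fractions import Fraction

def metabolic_exponent(v):
    """Metabolic scaling exponent b = 2/3 + v, where v is the exponent
    with which flow velocity scales with body mass (a sketch of the
    relationship stated in the abstract)."""
    return Fraction(2, 3) + v

b_constant_velocity = metabolic_exponent(Fraction(0))   # velocity fixed
b_max = metabolic_exponent(Fraction(1, 12))             # maximal velocity scaling
print(b_constant_velocity, b_max)  # 2/3 3/4
```

Any intermediate velocity scaling 0 < v < 1/12 yields an exponent strictly between 2/3 and 3/4, matching the empirical range the abstract describes.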
Abstract:
With the increasing awareness of protein folding disorders, the explosion of genomic information, and the need for efficient ways to predict protein structure, protein folding and unfolding have become a central issue in molecular sciences research. Molecular dynamics computer simulations are increasingly employed to understand the folding and unfolding of proteins. Running protein unfolding simulations is computationally expensive, and finding ways to enhance performance is a grid issue in its own right. However, more and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. This paper describes efforts to provide a grid-enabled data warehouse for protein unfolding data. We outline the challenges and present first results in the design and implementation of the data warehouse.