91 results for "Problems of consumption"


Relevance: 90.00%

Abstract:

Iron is essential to virtually all organisms, but poses problems of toxicity and poor solubility. Bacteria have evolved various mechanisms to counter the problems imposed by their iron dependence, allowing them to achieve effective iron homeostasis under a range of iron regimes. Highly efficient iron acquisition systems are used to scavenge iron from the environment under iron-restricted conditions. In many cases, this involves the secretion and internalisation of extracellular ferric chelators called siderophores. Ferrous iron can also be directly imported by the G protein-like transporter, FeoB. For pathogens, host iron complexes (transferrin, lactoferrin, haem, haemoglobin) are used directly as iron sources. Bacterial iron storage proteins (ferritin, bacterioferritin) provide intracellular iron reserves for use when external supplies are restricted, and iron detoxification proteins (Dps) are employed to protect the chromosome from iron-induced free radical damage. There is evidence that bacteria control their iron requirements in response to iron availability by downregulating the expression of iron proteins during iron-restricted growth. Finally, the expression of the iron homeostatic machinery is subject to iron-dependent global control, ensuring that iron acquisition, storage and consumption are geared to iron availability and that intracellular levels of free iron do not reach toxic levels. (C) 2003 Federation of European Microbiological Societies. Published by Elsevier Science B.V. All rights reserved.

Relevance: 90.00%

Abstract:

While only about 100-200 species are used intensively in commercial floriculture (e.g. carnations, chrysanthemums, gerbera, narcissus, orchids, tulips, lilies, roses, pansies and violas, saintpaulias, etc.) and 400-500 as house plants, several thousand species of herbs, shrubs and trees are traded commercially by nurseries and garden centres as ornamentals or amenity species. Most of these have been introduced from the wild with little selection or breeding. In Europe alone, 12 000 species are found in cultivation in general garden collections (i.e. excluding specialist collections and botanic gardens). In addition, specialist collections (often very large) of many other species and/or cultivars of groups such as orchids, bromeliads, cacti and succulents, primulas, rhododendrons, conifers and cycads are maintained in several centres such as botanic gardens and specialist nurseries, as are 'national collections' of cultivated species and cultivars in some countries. Specialist growers, both professional and amateur, also maintain collections of plants for cultivation, including, increasingly, native plants. The trade in ornamental and amenity horticulture cannot be fully estimated but runs into many billions of dollars annually, and there is considerable potential for further development and the introduction of many new species into the trade. Despite this, most of the collections are ad hoc, no co-ordinated efforts have been made to ensure that adequate germplasm samples of these species are maintained for conservation purposes, and few of them are at all adequately represented in seed banks. Few countries have paid much attention to the germplasm needs of ornamentals; the Ornamental Plant Germplasm Center at The Ohio State University, operated in conjunction with the USDA National Plant Germplasm System, is an exception. Generally there is a serious gap in national and international germplasm strategies, which have tended to focus primarily on food plants and some forage and industrial crops. Adequate arrangements need to be put in place to ensure the long- and medium-term conservation of representative samples of the genetic diversity of ornamental species. The problems of achieving this will be discussed. In addition, a policy for the conservation of old cultivars or 'heritage' varieties of ornamentals needs to be formulated. The considerable potential for introduction of new ornamental species needs to be assessed. Consideration needs to be given to setting up a co-ordinating structure with overall responsibility for the conservation of germplasm of ornamental and amenity plants.

Relevance: 90.00%

Abstract:

We report the first systematic study of the photocatalytic oxidation of humic acid (HA) in artificial seawater (ASW). TiO2 (Degussa P25) dispersions were used as the catalyst, with irradiation from a medium-pressure mercury lamp. The optimum quantity of catalyst was found to be between 2 and 2.5 g/l; the decomposition was fastest at low pH values (pH 4.5 in the range examined), and the optimum air flow, using an immersion-well reactor with a capacity of 400 ml, was 850 ml/min. Reactivity increased with air flow up to this figure, above which foaming prevented operation of the reactor. Using pure oxygen, an optimal flow rate was observed at 300 ml/min, above which reactivity remained essentially constant. Following treatment for 1 h, low-salinity water (2700 mg/l) was completely mineralised, whereas ASW (46000 mg/l) had traces of HA remaining. These effects are interpreted and kinetic data are presented. To avoid problems of precipitation due to changes in ionic strength, humic substances were prepared directly in ASW, and the effects of ASW on catalyst suspension and precipitation have been taken into account. The Langmuir-Hinshelwood kinetic model has been shown to be followed only approximately for the catalytic oxidation of HA in ASW. The activation energy for the reaction, derived from an Arrhenius treatment, was 17 (±0.6) kJ/mol. (C) 2003 Elsevier Science Ltd. All rights reserved.
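As a point of reference, the Langmuir-Hinshelwood rate law mentioned above has a simple closed form. The sketch below shows the saturation behaviour that makes the model only approximately applicable; the rate constant k and adsorption constant K are purely illustrative values, not fitted parameters from the paper.

```python
import numpy as np

def langmuir_hinshelwood_rate(C, k, K):
    """Langmuir-Hinshelwood rate r = k*K*C / (1 + K*C):
    ~first order in C when K*C << 1, saturating towards k when K*C >> 1."""
    return k * K * C / (1.0 + K * C)

# Hypothetical parameters, for illustration only (not values from the paper):
C = np.linspace(1.0, 50.0, 6)   # HA concentration, mg/l
print(langmuir_hinshelwood_rate(C, k=0.8, K=0.05))
```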

Relevance: 90.00%

Abstract:

Time-resolved studies of silylene, SiH2, and dimethylsilylene, SiMe2, generated by the 193 nm laser flash photolysis of appropriate precursor molecules, have been carried out to obtain rate constants for their bimolecular reactions with dimethylgermane, Me2GeH2, in the gas phase. SiMe2 + Me2GeH2 was studied at five temperatures in the range 299-555 K. Problems of substrate UV absorption at 193 nm at temperatures above 400 K meant that only three temperatures could be used reliably for rate constant measurement. These rate constants gave the Arrhenius parameters log(A/cm³ molecule⁻¹ s⁻¹) = -13.25 ± 0.16 and Ea = -(5.01 ± 1.01) kJ/mol. Only room-temperature studies of SiH2 were carried out. These gave values of (4.05 ± 0.06) × 10⁻¹⁰ cm³ molecule⁻¹ s⁻¹ (SiH2 + Me2GeH2 at 295 K) and (4.41 ± 0.07) × 10⁻¹⁰ cm³ molecule⁻¹ s⁻¹ (SiH2 + MeGeH3 at 296 K). Rate constant comparisons show the surprising result that SiMe2 reacts 12.5 times more slowly with Me2GeH2 than with Me2SiH2. Quantum chemical calculations (G2(MP2,SVP)//B3LYP level) of the model Si-H and Ge-H insertion processes of SiMe2 with SiH4/MeSiH3 and GeH4/MeGeH3 support these findings and show that the lower reactivity of SiMe2 with Ge-H bonds is caused by a higher secondary barrier for rearrangement of the initially formed complexes. Full details of the structures of intermediate complexes and a discussion of their stabilities are given in the paper. Other, related comparisons of silylene reactivity are also presented.
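For readers who want to turn the quoted Arrhenius parameters into rate constants, a minimal sketch follows. The parameter values are those given in the abstract; the negative activation energy means the rate constant falls as temperature rises, consistent with the intermediate-complex mechanism discussed.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius_k(log10_A, Ea_J_per_mol, T):
    """k(T) = A * exp(-Ea/RT), with A in cm^3 molecule^-1 s^-1."""
    return 10.0 ** log10_A * math.exp(-Ea_J_per_mol / (R * T))

# SiMe2 + Me2GeH2: log A = -13.25, Ea = -5.01 kJ/mol (from the abstract).
for T in (300.0, 400.0, 500.0):
    print(f"{T:.0f} K: k = {arrhenius_k(-13.25, -5.01e3, T):.2e} cm3 molecule-1 s-1")
```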

Relevance: 90.00%

Abstract:

The assumption that negligible work is involved in the formation of new surfaces in the machining of ductile metals is re-examined in the light of both current Finite Element Method (FEM) simulations of cutting and modern ductile fracture mechanics. The work associated with separation criteria in FEM models is shown to be in the kJ/m² range, rather than the few J/m² of the surface energy (surface tension) employed by Shaw in his pioneering study of 1954, following which consideration of surface work has been omitted from analyses of metal cutting. The much greater values of specific surface work are not surprising in terms of ductile fracture mechanics, where kJ/m² values of fracture toughness are typical of the ductile metals involved in machining studies. This paper shows that when even the simple Ernst–Merchant analysis is generalised to include significant surface work, many of the experimental observations for which traditional 'plasticity and friction only' analyses seem to have no quantitative explanation are given meaning. In particular, the primary shear plane angle φ becomes material-dependent. The experimental increase of φ up to a saturated level, as the uncut chip thickness is increased, is predicted. The positive intercepts found in plots of cutting force vs. depth of cut, and in plots of force resolved along the primary shear plane vs. area of shear plane, are shown to be measures of the specific surface work. It is demonstrated that neglect of these intercepts in cutting analyses is the reason why anomalously high values of shear yield stress are derived at those very small uncut chip thicknesses at which the so-called size effect becomes evident. The material toughness/strength ratio, combined with the depth of cut to form a non-dimensional parameter, is shown to control ductile cutting mechanics. The toughness/strength ratio of a given material will change with rate, temperature and thermomechanical treatment, and the influence of such changes, together with changes in depth of cut, on the character of machining is discussed. Strength or hardness alone is insufficient to describe machining. The failure of the Ernst–Merchant theory seems less to do with problems of uniqueness and the validity of minimum work, and more to do with the problem not being properly posed. The new analysis compares favourably and consistently with the wide body of experimental results available in the literature. Why considerable progress in the understanding of metal cutting has been achieved without reference to significant surface work is also discussed.
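To make the role of the toughness/strength ratio concrete, here is a small sketch contrasting the classic Ernst–Merchant shear-angle prediction (which contains no material parameter) with a non-dimensional group of the kind the paper argues controls ductile cutting. The numerical values are hypothetical illustrations, not data from the paper.

```python
def merchant_shear_angle_deg(beta_deg, alpha_deg):
    """Classic Ernst-Merchant result (no surface work):
    phi = 45 - (beta - alpha)/2, from minimising cutting work;
    beta = friction angle, alpha = rake angle."""
    return 45.0 - (beta_deg - alpha_deg) / 2.0

def toughness_strength_number(R, k, t):
    """Non-dimensional group Z = R/(k*t):
    R = specific surface work / fracture toughness (J/m^2),
    k = shear yield stress (Pa), t = uncut chip thickness (m).
    Large Z (small t) is where the 'size effect' appears."""
    return R / (k * t)

# Hypothetical values: beta = 30 deg, alpha = 6 deg,
# R = 20 kJ/m^2, k = 400 MPa, t = 0.1 mm.
print(merchant_shear_angle_deg(30.0, 6.0))           # 33 deg, material-independent
print(toughness_strength_number(20e3, 400e6, 1e-4))  # 0.5, material-dependent
```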

Relevance: 90.00%

Abstract:

Climate change is one of the major challenges facing economic systems at the start of the 21st century. Reducing greenhouse gas emissions will require both restructuring the energy supply system (production) and addressing the efficiency and sufficiency of the social uses of energy (consumption). The energy production system is a complicated supply network of interlinked sectors, with 'knock-on' effects throughout the economy. End-use energy consumption is governed by complex sets of interdependent cultural, social, psychological and economic variables, driven by shifts in consumer preference and technological development trajectories. To date, few models have been developed for exploring alternative joint energy production-consumption systems. The aim of this work is to propose one such model. This is achieved in a methodologically coherent manner through integration of qualitative input-output models of production with Bayesian belief network models of consumption at the point of final demand. The resulting integrated framework can be applied either (relatively) quickly and qualitatively to explore alternative energy scenarios, or as a fully developed quantitative model to derive or assess specific energy policy options. The qualitative applications are explored here.
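The integration described can be illustrated at toy scale: a belief network over consumption variables fixes the final-demand vector, which an input-output (Leontief) model then propagates through interlinked sectors. Everything below is a hypothetical two-sector, two-node illustration of that coupling, not the paper's model.

```python
import numpy as np

# Toy belief network over consumption: P(high demand) depends on whether
# consumers adopt efficient behaviour (hypothetical probabilities).
p_efficient = 0.4
p_high_given = {True: 0.2, False: 0.7}
p_high = (p_efficient * p_high_given[True]
          + (1 - p_efficient) * p_high_given[False])

# Toy two-sector input-output model: gross output x solves x = A x + d,
# i.e. x = (I - A)^-1 d, capturing 'knock-on' effects between sectors.
A = np.array([[0.1, 0.3],    # energy sector's input requirements
              [0.2, 0.1]])   # rest-of-economy input requirements
d = np.array([100.0 * p_high + 40.0, 50.0])  # final demand under the scenario
x = np.linalg.solve(np.eye(2) - A, d)        # gross output by sector
print(f"P(high demand) = {p_high:.2f}, gross outputs = {x}")
```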

Relevance: 90.00%

Abstract:

This article aims to create intellectual space in which issues of social inequality and education can be analyzed and discussed in relation to the multifaceted and multi-levelled complexities of the modern world. It is divided into three sections. Section One locates the concept of social class in the context of the modern nation state during the period after the Second World War. Focusing particularly on the impact of 'Fordism' on social organization and cultural relations, it revisits the articulation of social justice issues in the United Kingdom, and the structures put into place at the time to alleviate educational and social inequalities. Section Two problematizes the traditional concept of social class in relation to economic, technological and sociocultural changes that have taken place around the world since the mid-1980s. In particular, it charts some of the changes to the international labour market and global patterns of consumption, and their collective impact on the re-constitution of class boundaries in 'developed countries'. This is juxtaposed with some of the major social effects of neo-classical economic policies in recent years on the sociocultural base in developing countries. It discusses some of the ways these inequalities are reflected in education. Section Three explores tensions between the educational ideals of the 'knowledge economy' and the discursive range of social inequalities that are emerging within and beyond the nation state. Drawing on key motifs identified throughout, the article concludes with a reassessment of the concept of social class within the global cultural economy. This is discussed in relation to some of the major equity and human rights issues in education today.

Relevance: 90.00%

Abstract:

Background: Social phobia aggregates in families. The genetic contribution to intergenerational transmission is modest, and parenting is considered important. Research on the effects of social phobia on parenting has been subject to problems of small sample size, heterogeneity of samples and lack of specificity of observational frameworks. We addressed these problems in the current study.

Methods: We assessed mothers with social phobia (N = 84) and control mothers (N = 89) at 10 weeks in face-to-face interactions with their infants, and during a social challenge, namely, engaging with a stranger. We also assessed mothers with generalised anxiety disorder (GAD) (N = 50). We examined the contribution to infant social responsiveness of early infant characteristics (neonatal irritability), as well as maternal behaviour.

Results: Mothers with social phobia were no less sensitive to their infants during face-to-face interactions than control mothers, but when interacting with the stranger they appeared more anxious, engaged less with the stranger themselves, and were less encouraging of the infant's interaction with the stranger; infants of index mothers also showed reduced social responsiveness to the stranger. These differences did not apply to mothers with GAD and their infants. Regression analyses showed that the reduction in social responsiveness in infants of mothers with social phobia was predicted by neonatal irritability and the degree to which the mother encouraged the infant to interact with the stranger.

Conclusions: Mothers with social phobia show specific parenting difficulties, and their infants show early signs of reduced social responsiveness that are related to both individual infant differences and a lack of maternal encouragement to engage in social interactions.

Relevance: 90.00%

Abstract:

Aim: To review the current literature on the development of convergence and accommodation. The accommodation and vergence systems provide the foundation upon which bifoveal binocular single vision develops. Deviations from their normal development are not only implicated in the aetiology of convergence anomalies, accommodative anomalies and strabismus, but may also be implicated in failure of the emmetropisation process.

Method: This review considers the problems of researching the development of accommodation and vergence in infants, and how infant research has had to differ from adult methods. It then reviews and discusses the implications of current research into the development of both systems and their linkages.

Results: Vergence and accommodation develop rapidly in the first months of life, with accommodation changing from a relatively fixed myopic focus in the neonatal period to adult-like responses by 4 months of age. Vergence develops gradually and becomes more accurate after 4 months of age, but has been demonstrated in infants well before the age at which binocular disparity detection mechanisms are thought to develop. Hypotheses for this early vergence mechanism are discussed. The relationship between accommodation and vergence shows much more variability in infants than the adult literature has found, but this apparent adult/infant difference may be partly attributable to methodological differences rather than maturational change alone.

Conclusions: Variability and flexibility characterise infant responses. This variability may enable infants to develop a flexible and robust binocular system for later life. Studies of infant visual cue use may give clues to the aetiology of strabismus and refractive error.

Relevance: 90.00%

Abstract:

Patients want and need comprehensive and accurate information about their medicines so that they can participate in decisions about their healthcare. In particular, they require information about the likely risks and benefits associated with the different treatment options. However, providing this information in a form that people can readily understand and use is a considerable challenge for healthcare professionals. One recent attempt to standardise the language of risk has been to produce sets of verbal descriptors that correspond to specific probability ranges, such as those outlined in the European Commission (EC) Pharmaceutical Committee guidelines in 1998 for describing the incidence of adverse effects. This paper provides an overview of a number of studies, involving members of the general public, patients and hospital doctors, that evaluated the utility of the EC guideline descriptors (very common, common, uncommon, rare, very rare). In all studies it was found that people significantly over-estimated the likelihood of adverse effects occurring, given specific verbal descriptors. This in turn resulted in significantly higher ratings of the perceived risks to health and significantly lower ratings of the likelihood of taking the medicine. Such problems of interpretation are not restricted to the EC guideline descriptors. Similar levels of misinterpretation have also been demonstrated with two other recently advocated risk scales (Calman's verbal descriptor scale and Barclay, Costigan and Davies' lottery scale). In conclusion, the challenge for risk communicators and for future research will be to produce a language of risk that is sufficiently flexible to take into account different perspectives, as well as changing circumstances and contexts of illness and its treatments. In the meantime, we urge the EC and other legislative bodies to stop recommending the use of specific verbal labels or phrases until there is a stronger evidence base to support their use.
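For concreteness, the five EC descriptors are usually quoted against incidence bands of: very common ≥ 1/10, common 1/100 to 1/10, uncommon 1/1000 to 1/100, rare 1/10000 to 1/1000, very rare < 1/10000. A minimal sketch of that mapping follows; the band boundaries are as commonly quoted in the risk-communication literature, so verify them against the guideline text before relying on them.

```python
def ec_descriptor(incidence):
    """Map an adverse-effect incidence (proportion in [0, 1]) to the
    EC (1998) verbal descriptor bands as commonly quoted."""
    if incidence >= 0.1:
        return "very common"
    if incidence >= 0.01:
        return "common"
    if incidence >= 0.001:
        return "uncommon"
    if incidence >= 0.0001:
        return "rare"
    return "very rare"

print(ec_descriptor(0.04))    # common
print(ec_descriptor(0.0005))  # rare
```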

Relevance: 90.00%

Abstract:

In this paper we consider bilinear forms of matrix polynomials and show that these polynomials can be used to construct solutions for the problems of solving systems of linear algebraic equations, matrix inversion and finding extremal eigenvalues. An almost optimal Monte Carlo (MAO) algorithm for computing bilinear forms of matrix polynomials is presented. Results for the computational cost of a balanced algorithm for computing the bilinear form of a matrix power are presented, i.e., an algorithm for which the probability and systematic errors are of the same order, and this cost is compared with that of a corresponding deterministic method.
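To give a flavour of the approach, below is a generic random-walk estimator for the bilinear form v^T A^k u, written in the spirit of almost-optimal Monte Carlo schemes but not reproducing the paper's algorithm. Start states are sampled proportionally to |v| and transitions proportionally to the row magnitudes of |A|, with importance weights correcting the bias; the estimate is checked against the deterministic value.

```python
import numpy as np

def mc_bilinear_power(A, u, v, k, n_samples=20000, seed=0):
    """Monte Carlo estimate of v^T A^k u via weighted random walks.
    Start states ~ |v|; transitions ~ |A| row magnitudes (an
    importance-sampling choice in the spirit of MAO schemes)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)
    total = 0.0
    for _ in range(n_samples):
        i = rng.choice(n, p=p0)
        w = v[i] / p0[i]                # weight for the start state
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            w *= A[i, j] / P[i, j]      # weight update along the walk
            i = j
        total += w * u[i]
    return total / n_samples

A = np.array([[0.5, 0.2], [0.1, 0.4]])
u = np.array([1.0, 2.0]); v = np.array([0.3, 0.7])
print(mc_bilinear_power(A, u, v, k=3))       # stochastic estimate
print(v @ np.linalg.matrix_power(A, 3) @ u)  # deterministic check
```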

Relevance: 90.00%

Abstract:

The evaluation of EU policy in the area of rural land use management often encounters problems of multiple and poorly articulated objectives. Agri-environmental policy has a range of aims, including natural resource protection, biodiversity conservation and the protection and enhancement of landscape quality. Forestry policy, in addition to production and environmental objectives, increasingly has social aims, including enhancement of human health and wellbeing, lifelong learning, and the cultural and amenity value of the landscape. Many of these aims are intangible, making them hard to define and quantify. This article describes two approaches for dealing with such situations, both of which rely on substantial participation by stakeholders. The first is the Agri-Environment Footprint Index, a form of multi-criteria participatory approach. The other, applied here to forestry, has been the development of ‘multi-purpose’ approaches to evaluation, which respond to the diverse needs of stakeholders through the use of mixed methods and a broad suite of indicators, selected through a participatory process. Each makes use of case studies and involves stakeholders in the evaluation process, thereby enhancing their commitment to the programmes and increasing their sustainability. Both also demonstrate more ‘holistic’ approaches to evaluation than the formal methods prescribed in the EU Common Monitoring and Evaluation Framework.

Relevance: 90.00%

Abstract:

This paper summarizes the theory of simple cumulative risks (for example, the risk of food poisoning from the consumption of a series of portions of tainted food). Problems concerning such risks are extraordinarily difficult for naïve individuals, and the paper explains the reasons for this difficulty. It describes how naïve individuals usually attempt to estimate cumulative risks, and it outlines a computer program that models these methods. This account predicts that estimates can be improved if problems of cumulative risk are framed so that individuals can focus on the appropriate subset of cases. The paper reports two experiments that corroborated this prediction. They also showed that whether problems are stated in terms of frequencies (80 out of 100 people got food poisoning) or in terms of percentages (80% of people got food poisoning) did not reliably affect accuracy.
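The normative answer to such problems follows from independence: the chance of at least one occurrence over n exposures is 1 - (1 - p)^n. A minimal sketch of this standard formula follows; it is not the paper's process model of how naïve reasoners actually estimate.

```python
def cumulative_risk(p_single, n):
    """P(at least one event in n independent exposures) = 1 - (1 - p)^n.
    Naive estimates often add risks linearly instead."""
    return 1.0 - (1.0 - p_single) ** n

# A 10% risk per portion over 5 portions is ~41%, not 50%:
print(cumulative_risk(0.10, 5))
```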

Relevance: 90.00%

Abstract:

In the emerging digital economy, the management of information in aerospace and construction organisations faces a particular challenge due to the ever-increasing volume of information and the extensive use of information and communication technologies (ICTs). This paper addresses the problems of information overload and the value of information in both industries by providing some cross-disciplinary insights. In particular, it identifies major issues and challenges in current information evaluation practice in these two industries. Interviews were conducted to obtain a spectrum of industrial perspectives (director/strategic, project management and ICT/document management) on these issues, in particular on information storage and retrieval strategies and on the contrasting knowledge and information management approaches of personalisation and codification. Industry feedback was collected in a follow-up workshop to strengthen the findings of the research. An information-handling agenda is outlined for the development of a future Information Evaluation Methodology (IEM), which could facilitate the codification of high-value information in order to support through-life knowledge and information management (K&IM) practice.

Relevance: 90.00%

Abstract:

The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture such as associative memory networks, a common problem in non-linear modelling is 'the curse of dimensionality'. A series of complementary data-based constructive identification schemes, mainly based on, but not limited to, operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture-of-experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious piecewise local linear modelling concept based on a Delaunay input-space partition; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on basis functions that are Bézier-Bernstein polynomial functions and the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.
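Of the schemes listed, forward orthogonal least squares is the most easily sketched: regressors are chosen greedily by how much residual energy they explain, after orthogonalisation against those already selected. The following is a minimal generic sketch of that idea, not the paper's full algorithm (which couples selection to optimal experimental design).

```python
import numpy as np

def forward_ols(Phi, y, n_terms):
    """Greedy forward selection of columns of Phi by error-reduction ratio."""
    selected, Q = [], []                 # chosen indices; orthonormal directions
    residual = y.astype(float).copy()
    candidates = list(range(Phi.shape[1]))
    for _ in range(n_terms):
        best_j, best_score, best_q = None, -1.0, None
        for j in candidates:
            w = Phi[:, j].copy()
            for q in Q:                  # orthogonalise against chosen terms
                w -= (w @ q) * q
            norm2 = w @ w
            if norm2 < 1e-12:            # candidate is linearly dependent
                continue
            score = (residual @ w) ** 2 / norm2  # residual energy explained
            if score > best_score:
                best_j, best_score, best_q = j, score, w / np.sqrt(norm2)
        candidates.remove(best_j)
        selected.append(best_j)
        Q.append(best_q)
        residual -= (residual @ best_q) * best_q
    return selected

rng = np.random.default_rng(1)
Phi = rng.standard_normal((200, 10))     # candidate basis-function outputs
y = 2.0 * Phi[:, 3] - 1.5 * Phi[:, 7] + 0.1 * rng.standard_normal(200)
print(forward_ols(Phi, y, 2))            # typically recovers [3, 7]
```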