Abstract:
Four hull-less barley samples were milled on a Buhler MLU 202 laboratory mill, and individual and combined milling fractions were characterized. The best milling performance was obtained when the samples were conditioned to 14.3% moisture. Yields were 37-48% for straight-run flour, 47-56% for shorts, and 5-8% for bran. The beta-glucan contents of the straight-run white flours were 1.6-2.1%, of which approximately 49% was water-extractable. The arabinoxylan contents were 1.2-1.5%, of which approximately 17% was water-extractable. Shorts and bran fractions contained more beta-glucan (4.2-5.8% and 3.0-4.7%, respectively) and arabinoxylan (6.1-7.7% and 8.1-11.8%, respectively) than the white flours. For those fractions, beta-glucan extractability was high (58.5% and 52.3%, respectively), whereas arabinoxylan extractability was very low (approximately 6.5% and 2.0%, respectively). The straight-run white flours had low alpha-amylase, beta-glucanase, and endoxylanase activities. The highest alpha-amylase activity was found in the shorts fractions, and the highest beta-glucanase and endoxylanase activities were generally found in the bran fractions. Endoxylanase inhibitor activities were low in the white flours and highest in the shorts fractions. High flavonoid, tocopherol, and tocotrienol contents were found in the bran and shorts fractions.
Abstract:
The early eighties saw the introduction of liposomes as skin drug delivery systems, initially promoted primarily for localised effects with minimal systemic delivery. Subsequently, a novel ultradeformable vesicular system (termed "Transfersomes" by the inventors) was reported for transdermal delivery with an efficiency similar to subcutaneous injection. Further research illustrated that the mechanisms of liposome action depended on the application regime and the vesicle composition and morphology. Ethical, health and supply problems with human skin have encouraged researchers to use skin models. Traditional models involved polymer membranes and animal tissue, but whilst of value for release studies, such models are not always good mimics for the complex human skin barrier, particularly with respect to the stratum corneum intercellular lipid domains. These lipids have a multiply bilayered organization, with a composition and arrangement somewhat similar to those of liposomes. Consequently, researchers have used vesicles as skin model membranes. Early work employed phospholipid liposomes and tested their interactions with skin penetration enhancers, typically using thermal analysis and spectroscopic analyses. Another approach probed how incorporation of compounds into liposomes led to the loss of entrapped markers, analogous to "fluidization" of stratum corneum lipids on treatment with a penetration enhancer. Subsequently, scientists employed liposomes formulated with skin lipids in these types of studies. Following a brief description of the nature of the skin barrier to transdermal drug delivery and the use of liposomes in drug delivery through skin, this article critically reviews the relevance of using different types of vesicles as a model for human skin in permeation enhancement studies, concentrating primarily on liposomes after briefly surveying older models. The validity of different types of liposome is considered, and traditional skin models are compared to vesicular model membranes for their precision and accuracy as skin membrane mimics. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
A signalling procedure is described involving a connection, via the Internet, between the nervous system of an able-bodied individual and a robotic prosthesis, and between the nervous systems of two able-bodied human subjects. Neural implant technology is used to directly interface each nervous system with a computer. Neural motor unit and sensory receptor recordings are processed in real time and used as the basis for communication. This is seen as a first step towards thought communication, in which the neural implants would be positioned in the central nervous systems of two individuals.
Abstract:
A poor representation of cloud structure in a general circulation model (GCM) is widely recognised as a potential source of error in the radiation budget. Here, we develop a new way of representing both horizontal and vertical cloud structure in a radiation scheme. This combines the ‘Tripleclouds’ parametrization, which introduces inhomogeneity by using two cloudy regions in each layer as opposed to one, each with different water content values, with ‘exponential-random’ overlap, in which clouds in adjacent layers are not overlapped maximally, but according to a vertical decorrelation scale. This paper, Part I of two, aims to parametrize the two effects such that they can be used in a GCM. To achieve this, we first review a number of studies for a globally applicable value of fractional standard deviation of water content for use in Tripleclouds. We obtain a value of 0.75 ± 0.18 from a variety of different types of observations, with no apparent dependence on cloud type or gridbox size. Then, through a second short review, we create a parametrization of decorrelation scale for use in exponential-random overlap, which varies the scale linearly with latitude from 2.9 km at the Equator to 0.4 km at the poles. When applied to radar data, both components are found to have radiative impacts capable of offsetting biases caused by cloud misrepresentation. Part II of this paper implements Tripleclouds and exponential-random overlap into a radiation code and examines both their individual and combined impacts on the global radiation budget using re-analysis data.
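The latitude dependence of the decorrelation scale is simple enough to sketch. The snippet below is an illustrative reading of the parametrization, not the authors' code: it interpolates linearly in absolute latitude between the quoted 2.9 km equatorial and 0.4 km polar values, and then applies the standard exponential-random overlap relation α = exp(−Δz/z₀) (assumed here from the general formulation rather than quoted from the paper) to estimate the degree of overlap between cloudy layers a distance Δz apart.

```python
import numpy as np

def decorrelation_scale_km(lat_deg):
    """Decorrelation scale z0 (km): linear in absolute latitude, from
    2.9 km at the Equator to 0.4 km at the poles (values from the
    abstract above)."""
    return 2.9 + (0.4 - 2.9) * np.abs(lat_deg) / 90.0

def overlap_parameter(dz_km, lat_deg):
    """Exponential-random overlap: alpha = 1 recovers maximum overlap,
    alpha = 0 random overlap; alpha decays with layer separation."""
    return np.exp(-dz_km / decorrelation_scale_km(lat_deg))

# Two cloudy layers 1 km apart are far closer to maximally overlapped
# in the tropics than at high latitudes:
print(overlap_parameter(1.0, lat_deg=0.0))   # ~0.71
print(overlap_parameter(1.0, lat_deg=80.0))  # ~0.23
```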
Abstract:
The M protein of coronavirus plays a central role in virus assembly, turning cellular membranes into workshops where virus and host factors come together to make new virus particles. We investigated how M structure and organization are related to virus shape and size using cryo-electron microscopy, tomography and statistical analysis. We present evidence suggesting that M can adopt two conformations and that membrane curvature is regulated by one M conformer. Elongated M protein is associated with rigidity, clusters of spikes and a relatively narrow range of membrane curvature. In contrast, compact M protein is associated with flexibility and low spike density. Analysis of several types of virus-like particles and virions revealed that S protein, N protein and genomic RNA each help to regulate virion size and variation, presumably through interactions with M. These findings provide insight into how M protein functions to promote virus assembly.
Abstract:
A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information as a result of epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6–113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ; the mDP values of extractable tannins were lower by 4 to 29 units. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.
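For context, mDP in a thiolysis assay is conventionally calculated from the molar ratio of all flavan-3-ol units to terminal units (terminal units are released as free flavan-3-ols, extension units as benzyl thioether adducts); this is standard practice for the technique, not a formula stated in the abstract:

```latex
\mathrm{mDP} = \frac{\text{terminal units} + \text{extension units}}{\text{terminal units}}
```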
Abstract:
As healthcare costs rise and an aging population makes increased demands on services, new techniques must be introduced to promote an individual's independence and provide these services. Robots can now be designed so that they can alter their dynamic properties, changing from stiff to flaccid, or from giving no resistance to movement to damping any large and sudden movements. This has strong implications in health care, in particular for rehabilitation, where a robot must work in conjunction with an individual, and might guide or assist a person's arm movements, or might be commanded to perform some set of autonomous actions. This paper presents the state of the art of rehabilitation robots, with examples from prosthetics, aids for daily living and physiotherapy. In all these situations there is the potential for the interaction to be non-passive, with a resulting potential for the human/machine/environment combination to become unstable. To understand this instability we must develop better models of the human motor system and fit these models with realistic parameters. This paper concludes with a discussion of this problem and overviews some human models that can be used to facilitate the design of human/machine interfaces.
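The "stiff to flaccid" range of behaviours described above is what impedance control provides. The sketch below is a generic textbook illustration, not code from the paper: the robot applies a virtual spring-damper force whose gains K and B are retuned to make it stiff, free-moving, or purely damping.

```python
def impedance_force(x, xd, x0, K, B):
    """Force the robot applies, opposing deviation from the target x0:
    F = -K*(x - x0) - B*xd (virtual spring-damper).
    Large K: stiff guidance toward x0.  K = B = 0: no resistance
    ("flaccid").  K = 0 with large B: damps large, sudden movements."""
    return -K * (x - x0) - B * xd

# The same motion state under stiff guidance vs. pure damping:
print(impedance_force(x=0.1, xd=0.5, x0=0.0, K=500.0, B=20.0))  # -60.0
print(impedance_force(x=0.1, xd=0.5, x0=0.0, K=0.0, B=80.0))    # -40.0
```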
Abstract:
Altruism and selfishness are 30–50% heritable in man in both Western and non-Western populations. This genetically based variation in altruism and selfishness requires explanation. In non-human animals, altruism is generally directed towards relatives and satisfies the condition known as Hamilton's rule. This nepotistic altruism evolves under natural selection only if the ratio of the benefit of receiving help to the cost of giving it exceeds a value that depends on the relatedness of the individuals involved. Standard analyses assume that the benefit provided by each individual is the same, but it is plausible that in some cases, as more individuals contribute, help is subject to diminishing returns. We analyse this situation using a single-locus two-allele model of selection in a diploid population, with the altruistic allele dominant to the selfish allele. The analysis requires calculation of the relationship between the fitnesses of the genotypes and the frequencies of the genes. The fitnesses vary not only with the genotype of the individual but also with the distribution of phenotypes amongst the individual's sibs, which in turn depends on the genotypes of its parents. These calculations are not possible by direct fitness or ESS methods but are possible using population genetics. Our analysis shows that diminishing returns change the operation of natural selection, and the outcome can now be a stable equilibrium between altruistic and selfish alleles rather than the elimination of one allele or the other. We thus provide a plausible genetic model of kin selection that leads to the stable coexistence in the same population of both altruistic and selfish individuals. This may explain the reported genetic variation in altruism in man.
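The condition paraphrased above is Hamilton's rule; in the standard notation (which the abstract does not spell out), with benefit b to the recipient, cost c to the actor, and relatedness r between them:

```latex
r\,b > c \quad\Longleftrightarrow\quad \frac{b}{c} > \frac{1}{r}
```

For full sibs r = 1/2, so help spreads only if the benefit is at least twice the cost; the paper's diminishing-returns analysis makes b itself depend on how many individuals contribute.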
Abstract:
Literacy as a social practice is integrally linked with social, economic and political institutions and processes. As such, it has a material base which is fundamentally constituted in power relations. Literacy is therefore interwoven with the text and context of everyday living, in which multi-levelled meanings are organically produced at both the individual and societal levels. This paper argues that if language thus mediates social reality, then it follows that literacy defined as a social practice cannot really be addressed as a reified, neutral activity, but should take account of the social, cultural and political processes in which literacy practices are embedded. Drawing on the work of key writers within the field, the paper foregrounds the primary role of the state in defining the forms and levels of literacy required and made available at particular moments within society. In a case study of the social construction of literacy meanings in pre-revolutionary Iran, it explores the view that the discourse about societal literacy levels has historically constituted a key terrain in which the struggle for control over meaning has taken place. This struggle, it is argued, sets the interests of the state in maintaining ideological and political control over the production of knowledge within the culture and society against the needs identified by the individual for personal development, empowerment and liberation. Overall, the paper examines existing theoretical perspectives on societal literacy programmes in terms of the scope that they provide for analyses that encompass the multi-levelled power relations that shape and influence dominant discourses on the relative value of literacy for both the individual and society.
Abstract:
This paper discusses concepts of value from the point of view of the user of a space and the counter view of the provider of the same. Land and property are factors of production. The value of the land flows from the use to which it is put, which in turn depends on the demand (and supply) for the product or service produced or provided from that space. If there is high demand for the product (at a fixed level of supply), the price will increase, and the economic rent for the land/property will increase accordingly. This is the underlying paradigm of Ricardian rent theory, in which the supply of land is fixed and a single good is produced; in such a case the rent of land is wholly an economic rent. Economic theory generally distinguishes between two kinds of price: the price of production or "value in use" (as determined by the labour theory of value), and the market price or "value in exchange" (as determined by supply and demand). It is based on a coherent and consistent theory of value and price. Effectively, the distinction is between what a space is 'worth' to an individual and that space's price of exchange in the marketplace. In a perfect market, where every individual has access to the same information as all others in the market, price and worth should coincide. However, in a market where access to information is not uniform, and where different uses compete for the same space, it is more likely that the two figures will diverge. This paper argues that valuers' traditional reliance on methods of comparison to determine "price" has led to an artificial divergence of "value in use" and "value in exchange"; now that such comparisons are becoming more difficult, owing to the diversity of lettings in the marketplace, there will be a requirement to return to fundamentals and pay heed to the thought process of the user in assessing the worth of the space to be let.
Abstract:
We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.
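As one concrete illustration of the partitioning question, the sketch below simulates a Gompertz hazard with gamma-distributed frailty; this is a standard construction chosen here for illustration, not a model singled out by the review. The frailty Z captures the fixed, individual-specific component of lifespan variation, while the draw of the failure time given Z is pure chance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gompertz baseline hazard h(t) = a * exp(b*t); each individual's
# hazard is multiplied by a fixed frailty Z ~ Gamma (mean 1).
a, b = 1e-4, 0.09
frailty = rng.gamma(shape=4.0, scale=0.25, size=100_000)

# Invert the survival function S(t) = exp(-Z*(a/b)*(exp(b*t) - 1))
# to draw a death time for each individual given its frailty.
u = rng.uniform(size=frailty.size)
death = np.log(1.0 - b * np.log(u) / (a * frailty)) / b

# Mean lifespans of low- vs. high-frailty individuals separate the
# fixed component from the residual, purely random, variation.
print(death[frailty < 0.5].mean(), death[frailty > 2.0].mean())
```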
Abstract:
This collection of original research and review articles has been designed with the joint aims of inspiring future work and of reminding environmental economists and researchers from other disciplines that looking for similarities and common features in their studies is more important than magnifying their differences. It is also suitable for use as a postgraduate text. The volume reflects the endeavour of mainstream economic thought to include, amongst its chief concerns, the study of all complex interactions between economies and natural space. It also documents efforts made by economists and other scientists to study the complex phenomenon of individual and collective decision making when faced with problems linking economic activity with the environment. Presenting a pluralistic view of approaches and methodologies, rather than an exhaustive list of topics of interest to environmental scientists, the editors have brought together innovative contributions that can be read as self-contained pieces of work.
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models 'prosumer' agents (i.e., producers and/or consumers of energy) and 'aggregator' agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile. Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
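The demand-flattening finding can be illustrated with a toy agent-based model. The sketch below is a stand-in written for this summary, not the CASCADE Framework itself; the Prosumer class and aggregator step are hypothetical constructs in which flexible household load is moved from the group's peak hour to its trough hour, capped so the trough is not overshot into a new peak.

```python
import random

class Prosumer:
    """Toy prosumer agent: a 24-hour baseline demand profile plus a
    quantity of flexible load that a smart device can reschedule."""
    def __init__(self):
        self.baseline = [1.0 + 0.5 * random.random() for _ in range(24)]
        self.shiftable = 0.3  # flexible load per day (arbitrary units)

def aggregator_step(prosumers):
    """Toy aggregator: move each household's flexible load from the
    group's peak hour to its trough hour, capped so the shift cannot
    overshoot and create a new peak at the trough hour."""
    profiles = [p.baseline[:] for p in prosumers]
    total = [sum(hour) for hour in zip(*profiles)]
    peak = max(range(24), key=lambda h: total[h])
    trough = min(range(24), key=lambda h: total[h])
    move = min(min(p.shiftable for p in prosumers),
               (total[peak] - total[trough]) / (2 * len(prosumers)))
    for prof in profiles:
        prof[peak] -= move
        prof[trough] += move
    return total, [sum(hour) for hour in zip(*profiles)]

random.seed(1)
before, after = aggregator_step([Prosumer() for _ in range(100)])
print(max(before) - min(before))  # peak-to-trough spread before
print(max(after) - min(after))    # ...and after: smaller, i.e. flatter
```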