35 results for birth choices
Abstract:
Epidemiological studies suggest that low-birth-weight infants show poor neonatal growth and increased susceptibility to metabolic syndrome, in particular obesity and diabetes. Adipose tissue development is regulated by many genes, including members of the peroxisome proliferator-activated receptor (PPAR) and the fatty acid-binding protein (FABP) families. The aim of this study was to determine the influence of birth weight on key adipose and skeletal muscle tissue regulating genes. Piglets from 11 litters were ranked according to birth weight, and 3 from each litter were assigned to small-, normal-, or large-birth-weight groups. Tissue samples were collected on day 7 or 14. Plasma metabolite concentrations and the expression of PPARG2, PPARA, FABP3, and FABP4 genes were determined in subcutaneous adipose tissue and skeletal muscle. Adipocyte number and area were determined histologically. Expression of FABP3 and FABP4 was significantly reduced in small and large piglets, compared with normal piglets, in adipose tissue on day 7 and in skeletal muscle on day 14. On day 7, PPARA and PPARG2 expression was significantly reduced in adipose tissue from small and large piglets. Adipose tissue from small piglets contained more adipocytes than that from normal or large piglets. Birth weight had no effect on adipose tissue or skeletal muscle lipid content. Low birth weight is associated with tissue-specific and time-dependent effects on lipid-regulating genes, as well as morphological changes in adipose tissue. It remains to be seen whether these developmental changes alter an individual's susceptibility to metabolic syndrome.
Abstract:
The article confronts some key issues raised in the literature on public participation via a series of interrogatory questions drawn from rational choice theory. These are considered in relation to the design and process of public participation opportunities in planning and in wider processes of local governance at the neighbourhood scale. In doing this, the article draws on recent research that has looked in some depth at a form of community-led planning (CLP) in England. The motives and expectations of participants, the abilities of participants, and the conditions in which participation takes place are seen as important factors. It is contended that the issues raised by rational choice theory are pertinent to emerging efforts to engage communities. As such, the article concludes that advocates of public participation or community engagement should not be afraid of responding to the challenges posed by questions of participants' motives and rewards if lasting and worthwhile participation is to be established. Indeed, questions such as 'what's in it for me?' should be regarded as legitimate, necessary and standard in order to co-devise meaningful and durable participation opportunities and appropriate institutional environments. However, it is also maintained that wider considerations and capacity questions will need to be confronted if participation is to become embedded as part of participatory neighbourhood-scale planning.
Abstract:
The family of theories dubbed ‘luck egalitarianism’ represents an attempt to infuse egalitarian thinking with a concern for personal responsibility, arguing that inequalities are just when they result from, or to the extent to which they result from, choice, but are unjust when they result from, or to the extent to which they result from, luck. In this essay I argue that luck egalitarians should sometimes seek to limit inequalities, even when they have a fully choice-based pedigree (i.e., result only from the choices of agents). I grant that the broad approach is correct but argue that the temporal standpoint from which we judge whether the person can be held responsible, or the extent to which they can be held responsible, should be radically altered. Instead of asking, as Standard (or Static) Luck Egalitarianism seems to, whether or not, or to what extent, a person was responsible for the choice at the time of choosing, and asking the question of responsibility only once, we should ask whether, or to what extent, they are responsible for the choice at the point at which we are seeking to discover whether, or to what extent, the inequality is just, and so the question of responsibility is not settled but constantly under review. Such an approach will differ from Standard Luck Egalitarianism only if responsibility for a choice is not set in stone – if responsibility can weaken then we should not see the boundary between luck and responsibility within a particular action as static. Drawing on Derek Parfit’s illuminating discussions of personal identity, and contemporary literature on moral responsibility, I suggest there are good reasons to think that responsibility can weaken – that we are not necessarily fully responsible for a choice for ever, even if we were fully responsible at the time of choosing. I call the variant of luck egalitarianism that recognises this shift in temporal standpoint, and that responsibility can weaken, Dynamic Luck Egalitarianism (DLE). In conclusion I offer a preliminary discussion of what kind of policies DLE would support.
Abstract:
Recent UK changes in the number of students entering higher education, and in the nature of financial support, highlight the complexity of students’ choices about human capital investments. Today’s students have to focus not on the relatively narrow issue of how much academic effort to invest, but instead on the more complicated issue of how to invest effort in pursuit of ‘employability skills’, and how to signal such acquisitions in the context of a highly competitive graduate jobs market. We propose a framework aimed specifically at students’ investment decisions, which encompasses corner solutions for both borrowing and employment while studying.
Abstract:
The complexity of current and emerging high performance architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven performance modelling approach is outlined that is appropriate for modern multicore architectures. The approach is demonstrated by constructing a model of a simple shallow water code on a Cray XE6 system, from application-specific benchmarks that illustrate precisely how architectural characteristics impact performance. The model is found to recreate observed scaling behaviour up to 16K cores, and is used to predict optimal rank-core affinity strategies, exemplifying the type of problem such a model can be used for.
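To make the flavour of such a model concrete, the sketch below (in Python, with entirely hypothetical constants; the authors' actual code and measured benchmark figures are not reproduced here) combines a per-cell compute rate from a single-core benchmark with a latency/bandwidth estimate of halo-exchange cost to predict the time per timestep at a given core count.

```python
# Minimal sketch (not the authors' model) of a benchmark-driven performance
# prediction for a 2-D grid-based kernel: time per timestep is estimated as
# measured compute time on the local subdomain plus the cost of exchanging
# halo rows with neighbours. All constants are illustrative placeholders.

def compute_time(local_cells, secs_per_cell=2.0e-8):
    """Time to update one subdomain, from a per-cell rate that would come
    from a single-core benchmark (secs_per_cell is a made-up figure)."""
    return local_cells * secs_per_cell

def halo_exchange_time(halo_bytes, latency=1.5e-6, bandwidth=5.0e9):
    """Simple latency/bandwidth model for one nearest-neighbour exchange;
    both parameters stand in for message-passing benchmark results."""
    return latency + halo_bytes / bandwidth

def predict_timestep(nx, ny, cores, bytes_per_cell=8):
    """Predict one timestep on a cores-way 1-D decomposition of an
    nx-by-ny grid, where each rank holds a strip of rows."""
    local_cells = nx * (ny // cores)
    halo_bytes = nx * bytes_per_cell          # one halo row per neighbour
    return compute_time(local_cells) + 2 * halo_exchange_time(halo_bytes)

if __name__ == "__main__":
    for cores in (256, 1024, 4096, 16384):
        t = predict_timestep(nx=4096, ny=4096, cores=cores)
        print(f"{cores:6d} cores: predicted {t * 1e3:.3f} ms per timestep")
```

Evaluating such a model across core counts gives the predicted scaling curve that can then be compared against observed behaviour.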
Abstract:
There is strong evidence that neonates imitate previously unseen behaviors. These behaviors are predominantly used in social interactions, demonstrating neonates’ ability and motivation to engage with others. Research on neonatal imitation can provide a wealth of information about the early mirror neuron system (MNS): namely, its functional characteristics, its plasticity from birth, and its relation to skills later in development. Though numerous studies document the existence of neonatal imitation in the laboratory, little is known about its natural occurrence during parent-infant interactions and its plasticity as a consequence of experience. We review these critical aspects of imitation, which we argue are necessary for understanding the early action-perception system. We address common criticisms and misunderstandings about neonatal imitation and discuss methodological differences among studies. Recent work reveals that individual differences in neonatal imitation positively correlate with later social, cognitive, and motor development. We propose that such variation in neonatal imitation could reflect important individual differences of the MNS. Although postnatal experience is not necessary for imitation, we present evidence that neonatal imitation is influenced by experience in the first week of life.
Abstract:
This article reflects on a decade of British counterinsurgency operations. Questioning the idea that lessons have been learnt, the paper challenges the assumptions that are being used to frame future strategic choice. Suggesting that defence engagement is primarily focused on optimising overseas interventions while avoiding a deeper strategic reassessment about whether the UK should be undertaking these sorts of activities, the article calls for a proper debate on Britain's national security interests.
Abstract:
Background: Polygalacturonase-inhibiting proteins (PGIPs) are leucine-rich repeat (LRR) plant cell wall glycoproteins involved in plant immunity. They are typically encoded by gene families with a small number of gene copies whose evolutionary origin has been poorly investigated. Here we report the complete characterization of the full complement of the pgip family in soybean (Glycine max [L.] Merr.) and the characterization of the genomic region surrounding the pgip family in four legume species.
Results: BAC clone and genome sequence analyses showed that the soybean genome contains two pgip loci. Each locus is composed of three clustered genes that are induced following infection with the fungal pathogen Sclerotinia sclerotiorum (Lib.) de Bary, together with remnant sequences of pgip genes. The analyzed homeologous soybean genomic regions (about 126 kb) that include the pgip loci are strongly conserved, and this conservation extends to the genomes of the legume species Phaseolus vulgaris L., Medicago truncatula Gaertn. and Cicer arietinum L., each containing a single pgip locus. Maximum likelihood-based gene trees suggest that the genes within the pgip clusters have independently undergone tandem duplication in each species.
Conclusions: The paleopolyploid soybean genome contains two pgip loci contained within large and highly conserved duplicated regions, which are also conserved in bean, M. truncatula and C. arietinum. The genomic features of these legume pgip families suggest that the forces driving the evolution of pgip genes follow the birth-and-death model, similar to that proposed for the evolution of resistance (R) genes of the NBS-LRR type.
Abstract:
From the early Roman period, there is archaeological evidence for the exploitation of the Flemish coastal plain (Belgium) for a range of activities, such as sheep herding on the then developing salt-marshes and salt-meadows for the production of wool. During the early Middle Ages, this culminated in the establishment of dedicated ‘sheep estates’. This phase of exploitation was followed by extensive drainage and land reclamation measures in the high Medieval period, transforming areas into grassland suited for cattle breeding. As part of a larger project investigating the onset, intensification and final decline of sheep management in coastal Flanders in the historical period, this pilot study presents the results of sequential sampling and oxygen isotope analysis of a number of sheep teeth (M2, n = 8) from four late Roman and Medieval sites (dating from the 4th to the 15th century AD), in order to assess potential variations in season of birth between the different sites and through time. In comparison with published data from herds of known birth season, incremental enamel data from the Flemish sites are consistent with late winter/spring births, with the possibility of some instances of slightly earlier parturition. These findings suggest that manipulation of season of birth was not a feature of the sheep husbandry-based economies of early historic Flanders, further evidencing that wool production was the main purpose of contemporary sheep rearing in the region. Manipulation of season of birth is not likely to have afforded economic advantage in wool-centred economies, unlike in some milk- or meat-based regimes.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include caches, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on each core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between measured results as necessary.
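The interpolation step described above might be sketched as follows; the benchmark tables, scenario names and halo-exchange constant are invented placeholders rather than measurements from the study, and the real model is considerably more detailed.

```python
# Illustrative sketch only: interpolate benchmark timings measured for a few
# local problem sizes to estimate compute cost for an arbitrary decomposition,
# then compare two hypothetical deployment scenarios.
import numpy as np

# Hypothetical benchmark data: local cells per rank -> seconds per timestep,
# measured separately for fully populated and half-populated nodes
# (under-population reduces contention for shared cache and memory bandwidth).
cells_measured = np.array([1e4, 1e5, 1e6, 1e7])
full_node_secs = np.array([2.1e-4, 2.3e-3, 2.6e-2, 2.9e-1])
half_node_secs = np.array([1.8e-4, 1.9e-3, 2.1e-2, 2.4e-1])

def compute_cost(local_cells, scenario="full"):
    """Interpolate the measured benchmark times to the requested local size."""
    table = full_node_secs if scenario == "full" else half_node_secs
    return np.interp(local_cells, cells_measured, table)

def predict(nx, ny, ranks, scenario, halo_secs=3.0e-5):
    """Compute cost plus a fixed halo-exchange estimate for one timestep;
    halo_secs stands in for a separately benchmarked communication model."""
    local_cells = (nx * ny) / ranks
    return compute_cost(local_cells, scenario) + halo_secs

# Compare the same rank count deployed on fully populated nodes versus
# half-populated nodes (twice as many nodes for the same rank count).
ranks = 2048
for scenario in ("full", "half"):
    print(scenario, predict(4096, 4096, ranks, scenario))
```

In this kind of model the cheaper of the two scenarios for a given problem size can be read straight off the predictions, which is the sort of deployment question the abstract describes answering without resorting to trial and error on the machine itself.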