910 results for Andersen and Newman model
Abstract:
Age is the highest risk factor for some of the most prevalent human diseases, including cancer. Telomere shortening is thought to play a central role in the aging process in humans. The link between telomeres and aging is highlighted by the fact that genetic diseases causing telomerase deficiency are associated with premature aging and increased risk of cancer. For the last two decades, this link has been mostly investigated using mice that have long telomeres. However, zebrafish has recently emerged as a powerful and complementary model system to study telomere biology. Zebrafish possess human-like short telomeres that progressively decline with age, reaching lengths in old age that are observed when telomerase is mutated. The extensive characterization of its well-conserved molecular and cellular physiology makes this vertebrate an excellent model to unravel the underlying relationship between telomere shortening, tissue regeneration, aging and disease. In this Review, we explore the advantages of using zebrafish in telomere research and discuss the primary discoveries made in this model that have contributed to expanding our knowledge of how telomere attrition contributes to cellular senescence, organ dysfunction and disease.
Abstract:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. It supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure.
That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that only know the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an 'aspect' in the aspect-oriented sense. Objects are not required to extend any superclass, implement an interface, or carry a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment which supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version for it at the database metadata layer. Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism can be extended, keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at both the application and database levels.
The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out to validate the robustness of the prototype and the meta-model. For these tests we used a small OO7 database, chosen for its data model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. To test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms; using our framework and minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience showed that applications using our framework remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
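The bidirectional class-versioning idea described above can be illustrated with a small sketch. This is a hypothetical Python illustration of the concept only, not the thesis's aspect-oriented Java implementation; every name below (VersionRegistry, the Person converters) is invented for the example.

```python
# Hypothetical sketch of class versioning with instance adaptation:
# stored objects record the schema version they were written under,
# and registered converters adapt an instance between versions in
# either direction (bidirectional application compatibility).

class VersionRegistry:
    def __init__(self):
        self.converters = {}   # (class name, from_v, to_v) -> function

    def register(self, cls, from_v, to_v, fn):
        self.converters[(cls, from_v, to_v)] = fn

    def adapt(self, cls, data, from_v, to_v):
        """Adapt a stored instance dict one version step at a time,
        upward or downward, until it matches the requested version."""
        step = 1 if to_v > from_v else -1
        v = from_v
        while v != to_v:
            data = self.converters[(cls, v, v + step)](data)
            v += step
        return data

registry = VersionRegistry()
# v1 stores {name}; v2 splits it into {first_name, last_name}.
registry.register("Person", 1, 2,
                  lambda d: {"first_name": d["name"].split()[0],
                             "last_name": d["name"].split()[-1]})
registry.register("Person", 2, 1,
                  lambda d: {"name": d["first_name"] + " " + d["last_name"]})

old = {"name": "Ada Lovelace"}              # written by a v1 application
new = registry.adapt("Person", old, 1, 2)   # read by a v2 application
```

An older application asking for version 1 of the same object would simply run the chain in the opposite direction, which is the sense in which both earlier and later applications keep working against one database.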
Abstract:
I investigate the effects of information frictions in price setting decisions. I show that firms' output prices and wages are less sensitive to aggregate economic conditions when firms and workers cannot perfectly understand (or know) the aggregate state of the economy. Prices and wages respond with a lag to aggregate innovations because agents learn slowly about those changes, and this delayed adjustment in prices makes output and unemployment more sensitive to aggregate shocks. In the first chapter of this dissertation, I show that workers' noisy information about the state of the economy helps explain why real wages are sluggish. In the context of a search and matching model, wages do not immediately respond to a positive aggregate shock because workers do not (yet) have enough information to demand higher wages. This increases firms' incentives to post more vacancies, and it makes unemployment volatile and sensitive to aggregate shocks. This mechanism is robust to two major criticisms of existing theories of sluggish wages and volatile unemployment: the flexibility of wages for new hires and the cyclicality of the opportunity cost of employment. Calibrated to U.S. data, the model explains 60% of the overall unemployment volatility. Consistent with empirical evidence, the response of unemployment to TFP shocks predicted by my model is large, hump-shaped, and peaks one year after the TFP shock, while the response of the aggregate wage is weak and delayed, peaking after two years. In the second chapter of this dissertation, I study the role of information frictions and inventories in firms' price setting decisions in the context of a monetary model. In this model, intermediate goods firms accumulate output inventories, observe aggregate variables with a one-period lag, and observe their nominal input prices and demand at all times. Firms face idiosyncratic shocks and cannot perfectly infer the state of nature.
After a contractionary nominal shock, nominal input prices go down, and firms accumulate inventories because they perceive some positive probability that the nominal price decline is due to a good productivity shock. This prevents firms' prices from decreasing and makes current profits, households' income, and aggregate demand go down. According to my model simulations, a 1% decrease in the money growth rate causes output to decline 0.17% in the first quarter and 0.38% in the second, followed by a slow recovery to the steady state. Contractionary nominal shocks also have significant effects on total investment, which remains 1% below the steady state for the first six quarters.
Abstract:
The use of the ‘commission-accession’ principle as a mechanism for sustainable collecting in public museums and galleries has been significantly under-researched, only recently soliciting attention from national funding bodies in the United Kingdom (UK). This research has assessed an unfolding situation and provided a body of current evaluative evidence for commission-based acquisitions, and a model for curators to use in future contemporary art purchases. ‘Commission-accession’ is a practice increasingly used by European and American museums, yet it has seen little uptake in the UK. Very recent examples demonstrate that new works produced via commissioning, which then enter permanent collections, have significant financial and audience benefits that UK museums could harness by drawing on the expertise of local and national commissioning organisations. Very little evaluative information is available on inter-institutional precedents in the United States (US) or on ‘achat par commande’ in France. Neither is there yet literature that investigates the ambition for and viability of such models in the UK. This thesis addresses both of these areas, and provides evaluative case studies that will be of particular value to curators who seek sustainable ways to build their contemporary art collections. It draws on a survey of 82 museums and galleries across the UK conducted for this research, which provides a picture of where and how ‘commission-accession’ has been applied and demonstrates its impacts as a strategy. In addition, interviews were undertaken with artists and curators in the UK, US and France on the social, economic and cultural implications of ‘commission-accession’ processes. These have shed new light on issues inherent to the commissioning of contemporary art, such as communication, trust and risk, as well as drawing attention to the benefits and challenges involved in commissioning as-yet-unmade works of art.
Abstract:
Bactrocera tryoni (Froggatt) is Australia's major horticultural insect pest, yet monitoring females remains logistically difficult. We trialled the ‘Ladd trap’ as a potential female surveillance or monitoring tool. This trap design is used to trap and monitor fruit flies in countries other than Australia (e.g. the USA). The Ladd trap consists of a flat yellow panel (a traditional ‘sticky trap’) with a three-dimensional red sphere (a fruit mimic) attached in the middle. We confirmed, in field-cage trials, that the combination of yellow panel and red sphere was more attractive to B. tryoni than either component in isolation. In a second set of field-cage trials, we showed that it was the red-yellow contrast, rather than the three-dimensional effect, that was responsible for the trap's effectiveness, with B. tryoni equally attracted to a Ladd trap as to a two-dimensional yellow panel with a circular red centre. The sex ratio of catches was approximately even in the field-cage trials. In field trials, we tested the traditional red-sphere Ladd trap against traps for which the sphere was painted blue, black or yellow. The colour of the sphere did not significantly influence trap efficiency in these trials, despite the fact that the yellow-panel/yellow-sphere combination presented no colour contrast to the flies. In 6 weeks of field trials, over 1500 flies were caught, almost exactly two-thirds of them females. Overall, flies were more likely to be caught on the yellow panel than on the sphere; but, for the commercial Ladd trap, proportionally more females were caught on the red sphere versus the yellow panel than would be predicted from the relative surface area of each component, a result also seen in the field-cage trials. We determined that no modification of the trap was more effective than the commercially available Ladd trap, and so consider that product suitable for more extensive field testing as a B. tryoni research and monitoring tool.
Abstract:
This study investigated the role of fatalism as a cultural value orientation and of causal attributions for past failure in the academic performance of high school students in the Araucania Region of Chile. Three thousand three hundred and forty-eight Mapuche and non-Mapuche students participated in the study. Consistent with the Culture and Behavior model that guided the research, tests of causal models based on structural equation analysis show that academic performance is in part a function of variations in the level of fatalism, directly as well as indirectly through its influence on attribution processes and failure-related emotions. In general, the model representing the proposed structure of relations among fatalism, attributions, and emotions as determinants of academic performance fit the data for both Mapuche and non-Mapuche students. However, results show that some of the relations in the model differ between students from these two ethnic groups. Finally, according to the results of the causal model analysis, family SES appears to be the most important determinant of fatalism.
Abstract:
A method for systematically tracking swells across oceanic basins is developed by taking advantage of high-quality data from space-borne altimeters and wave model output. The evolution of swells is observed over large distances based on 202 swell events with periods ranging from 12 to 18 s. An empirical attenuation rate of swell energy of about 4 × 10⁻⁷ m⁻¹ is estimated using these observations, and the nonbreaking energy dissipation rates of swells far away from their generating areas are also estimated using a point source model. The resulting acceptance range of nonbreaking dissipation rates is −2.5 to 5.0 × 10⁻⁷ m⁻¹, which corresponds to dissipation e-folding scales ranging from about 2000 km for steep swells to almost infinite for small-amplitude swells. These rates are consistent with previous studies using in situ and synthetic aperture radar (SAR) observations. The frequency dispersion and angular spreading effects during swell propagation are discussed by comparing the results with other studies, demonstrating that they are the two dominant processes for swell height attenuation, especially in the near field. The resulting dissipation rates from these observations can be used as a reference for ocean engineering and wave modeling, and for related studies such as air-sea and wind-wave-turbulence interactions.
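As a quick arithmetic check of the quoted figures, an e-folding scale is simply the reciprocal of a spatial attenuation rate. The sketch below (plain Python; the function name is illustrative) converts the upper bound of the quoted dissipation range into a distance:

```python
def e_folding_scale_km(attenuation_per_m: float) -> float:
    """Distance over which swell energy decays by a factor of e,
    given a spatial attenuation rate in 1/m (scale = 1 / rate)."""
    metres = 1.0 / attenuation_per_m
    return metres / 1000.0

# Upper bound of the quoted nonbreaking dissipation range,
# 5.0e-7 per metre, recovers the ~2000 km scale cited for steep swells.
steep_swell_scale = e_folding_scale_km(5.0e-7)  # 2000.0 km
```

The near-zero and negative rates at the other end of the quoted range are what make the corresponding scale "almost infinite": as the rate approaches zero, 1/rate diverges.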
Abstract:
Thirty-four microsatellite loci were isolated from three reef fish species: golden snapper Lutjanus johnii, blackspotted croaker Protonibea diacanthus and grass emperor Lethrinus laticaudis, using a next-generation sequencing approach. Both IonTorrent single reads and Illumina MiSeq paired-end reads were used, with the latter demonstrating a higher read quality than the IonTorrent. From the 1–1.5 million raw reads per species, we successfully obtained 10–13 polymorphic loci for each species which satisfied stringent design criteria. We developed multiplex panels for the amplification of the golden snapper and blackspotted croaker loci, as well as post-amplification pooling panels for the grass emperor loci. The microsatellites characterized in this work were tested across three locations in northern Australia. The microsatellites we developed can detect population differentiation across northern Australia and may be used for genetic structure studies and stock identification.
Abstract:
Liquid-solid interactions become important as dimensions approach the micro/nano-scale. This dissertation focuses on liquid-solid interactions in two distinct applications: capillary-driven self-assembly of thin foils into 3D structures, and droplet wetting of hydrophobic micropatterned surfaces. The phenomenon of self-assembly of complex structures is common in biological systems. Examples include the self-assembly of proteins into macromolecular structures and of lipid bilayer membranes. The principles governing this phenomenon have been applied to induce self-assembly of millimeter-scale Si thin films into spherical and other 3D structures, which are then integrated into light-trapping photovoltaic (PV) devices. Motivated by this application, we present a generalized analytical study of the self-folding of thin plates into deterministic 3D shapes, through fluid-solid interactions, to be used as PV devices. This study consists of developing a model using beam theory, which incorporates the two competing components: a capillary force that promotes folding, and the bending rigidity of the foil that resists folding into a 3D structure. Through an equivalence argument for thin foils of different geometry, an effective folding parameter, which uniquely characterizes the driving force for folding, has been identified. A criterion for spontaneous folding of an arbitrarily shaped 2D foil, based on the effective folding parameter, is thus established. Measurements from experiments using different materials match predictions from the model well, validating the assumptions used in the analysis. As an alternative to the mechanics model approach, minimization of the total free energy is employed to investigate the interactions between a fluid droplet and a flexible thin film. A 2D energy functional is proposed, comprising the surface energy of the fluid, the bending energy of the thin film and the gravitational energy of the fluid.
Through simulations with Surface Evolver, the equilibrium shapes of the droplet and the thin film are obtained. A critical thin film length necessary for complete enclosure of the fluid droplet, and hence successful self-assembly into a PV device, is determined and compared with the experimental results and mechanics model predictions. The results from the modeling and energy approaches and the experiments are all consistent. Superhydrophobic surfaces, which have unique properties including self-cleaning and water repellency, are desired in many applications; one excellent example in nature is the lotus leaf. To fabricate these surfaces, well-designed micro/nano-scale surface structures are often employed. In this research, we fabricate superhydrophobic micropatterned polydimethylsiloxane (PDMS) surfaces composed of micropillars of various sizes and arrangements by means of soft lithography. Both anisotropic surfaces, consisting of parallel grooves and cylindrical pillars in rectangular lattices, and isotropic surfaces, consisting of cylindrical pillars in square and hexagonal lattices, are considered. A novel technique is proposed to image the contact line (CL) of the droplet on the hydrophobic surface. This technique provides a new approach to distinguish between partial and complete wetting. The contact area between droplet and microtextured surface is then measured for a droplet in the Cassie state, which is a state of partial wetting. The results show that although the droplet is in the Cassie state, the contact area does not necessarily follow Cassie model predictions. Moreover, the CL is not circular, and is affected by the micropatterns in both the isotropic and anisotropic cases. Thus, it is suggested that along with the contact angle, the typical parameter reported in the literature to quantify wetting, the size and shape of the contact area should also be reported.
This technique is employed to investigate the evolution of the CL on a hydrophobic micropatterned surface in three cases: a single droplet impacting the micropatterned surface, two droplets coalescing on micropillars, and a receding droplet resting on the micropatterned surface. Another parameter which quantifies hydrophobicity is the contact angle hysteresis (CAH), which indicates the resistance of the surface to the sliding of a droplet with a given volume. The conventional methods of using advancing and receding angles, or a tilting stage, to measure the resistance of the micropatterned surface are indirect, not to mention the inaccuracy due to the discrete and stepwise motion of the CL on micropillars. A micronewton force sensor is utilized to directly measure the resisting force by dragging a droplet on a microtextured surface. Together with the proposed imaging technique, the evolution of the CL during sliding is also explored. It is found that, at the onset of sliding, the CL behaves as a linear elastic solid with a constant stiffness. Afterwards, the force first increases, then decreases and reaches a steady state, accompanied by periodic oscillations due to regular pinning and depinning of the CL. Both the maximum and steady-state forces are primarily dependent on the area fractions of the micropatterned surfaces in our experiment. The resisting force is found to be proportional to the number of pillars which pin the CL at the trailing edge, validating the assumption that the resistance mainly arises from CL pinning at the trailing edge. In each pinning-and-depinning cycle during the steady state, the CL also shows linear elastic behavior, but with a lower stiffness. The force variation and energy dissipation involved can also be determined. This novel method of measuring the resistance of the micropatterned surface elucidates the dependence on CL pinning and provides more insight into the mechanisms of CAH.
Abstract:
To what extent has citizenship been transformed under the New Labour government to include women as equal citizens? This chapter will examine New Labour’s record in terms of alternative conceptions of citizenship: a model based on equal obligations to paid work, a model based on recognising care and gender difference, and a model of universal citizenship, underpinning equal expectations of care work and paid work with rights to the resources needed for individuals to combine both. It will argue that, while New Labour has signed up to the EU resolution on work-life balance, which includes a commitment to a ‘new social contract on gender’, and has significantly increased resources for care, obligations to work are at the heart of New Labour ideas of citizenship, with work conceived as paid employment: policies in practice have done more to bring women into employment than men into care. Women’s citizenship is still undermined, though less than under earlier governments, by these unequal obligations and their consequences in social rights.
Abstract:
Background The relationship between exposure to indoor aeroallergens in early life and subsequent eczema is unclear. We have previously failed to show any significant associations between early life exposure to house dust mite and cat fur allergens and either sensitization to these allergens or wheeze. We have also previously reported a lower prevalence of parent-reported, doctor-diagnosed eczema by age 2 years for children exposed to higher concentrations of house dust mite, but no other associations with other definitions of eczema or for exposure to cat allergen. Objectives To extend the exposure–response analysis of allergen exposure and eczema outcomes measured up to age 8 years, and to investigate the role of other genetic and environmental determinants. Methods A total of 593 children (92.4% of those eligible) born to all newly pregnant women attending one of three general practitioner surgeries in Ashford, Kent, were followed from birth to age 8 years. Concentrations of house dust mite and cat allergen were measured in dust samples collected from the home at 8 weeks after birth. The risk of subsequent eczema as defined by the U.K. diagnostic criteria was determined according to different levels (quintiles) of allergen exposure at birth. Results By age 8 years, 150 (25.3%) children had met the diagnostic criteria for eczema at least once. Visible flexural dermatitis was recorded at least once for 129 (28.0%). As in other studies, parental allergic history was positively associated with most eczema outcomes, as were higher maternal education and less crowded homes. No clear linear associations between early exposure to house dust mite or cat allergen and subsequent eczema were found, regardless of the definition of eczema used.
The risk of eczema appeared to increase for the three lowest quintiles of house dust mite allergen exposure (odds ratio, OR 1.37 for the third quintile compared with the first), and then to fall for the two highest quintiles (OR 0.66 and 0.71), even after controlling for confounding factors. Conclusions The lack of any clear exposure–disease relationship between allergens in early life and subsequent eczema argues against allergen exposure being a major factor causing eczema. If the lower levels of eczema at higher levels of house dust mite are confirmed, then interventions aimed at reducing house dust mite in early infancy could paradoxically increase the risk of subsequent eczema.
Abstract:
Gap junction coupling is ubiquitous in the brain, particularly between the dendritic trees of inhibitory interneurons. Such direct non-synaptic interaction allows for direct electrical communication between cells. Unlike spike-time driven synaptic neural network models, which are event based, any model with gap junctions must necessarily involve a single neuron model that can represent the shape of an action potential. Indeed, not only do neurons communicating via gap junctions feel super-threshold spikes, but they also experience, and respond to, sub-threshold voltage signals. In this chapter we show that the so-called absolute integrate-and-fire model is ideally suited to such studies. At the single neuron level, voltage traces for the model may be obtained in closed form, and are shown to mimic those of fast-spiking inhibitory neurons. Interestingly, in the presence of a slow spike adaptation current, the model is shown to support periodic bursting oscillations. For both tonic and bursting modes the phase response curve can be calculated in closed form. At the network level we focus on global gap junction coupling and show how to analyze the asynchronous firing state in large networks. Importantly, we are able to determine the emergence of non-trivial network rhythms due to strong coupling instabilities. To illustrate the use of our theoretical techniques (particularly the phase-density formalism used to determine stability), we focus on a spike adaptation induced transition from asynchronous tonic activity to synchronous bursting in a gap-junction coupled network.
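The single-neuron dynamics can be caricatured in a few lines of code. The sketch below assumes the "absolute" model takes the form dv/dt = I + |v| with a hard threshold and reset; that form, and all parameter values, are assumptions for illustration, not taken from the chapter. The |v| term is what gives the upstroke a spike-like shape that a plain leaky integrate-and-fire model lacks:

```python
import numpy as np

def simulate_aif(I=0.1, v_th=1.0, v_reset=-1.0, dt=1e-3, T=50.0):
    """Euler simulation of an 'absolute' integrate-and-fire neuron,
    dv/dt = I + |v| (assumed form), with threshold v_th and reset.
    Returns the voltage trace and the list of spike times."""
    n = int(T / dt)
    v = np.empty(n)
    v[0] = v_reset
    spikes = []
    for k in range(1, n):
        v[k] = v[k-1] + dt * (I + abs(v[k-1]))
        if v[k] >= v_th:            # threshold crossing: record and reset
            spikes.append(k * dt)
            v[k] = v_reset
    return v, spikes
```

With constant drive the model fires tonically at a fixed period, which is the regime whose voltage trace and phase response curve the chapter obtains in closed form; the Euler loop here is purely illustrative, since no time stepping is needed when closed-form solutions exist.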
Abstract:
Planar cell polarity (PCP) occurs in the epithelia of many animals and can lead to the alignment of hairs, bristles and feathers; physiologically, it can organise ciliary beating. Here we present two approaches to modelling this phenomenon. The aim is to discover the basic mechanisms that drive PCP, while keeping the models mathematically tractable. We present a feedback and diffusion model, in which adjacent cell sides of neighbouring cells are coupled by a negative feedback loop and diffusion acts within the cell. This approach can give rise to polarity, but also to period-two patterns. Polarisation arises via an instability, provided the feedback is sufficiently strong and the diffusion sufficiently weak. Moreover, we discuss a conservative model in which proteins within a cell are redistributed depending on the amount of proteins in the neighbouring cells, coupled with intracellular diffusion. In this case polarity can arise from weakly polarised initial conditions or via a wave, provided the diffusion is weak enough. Both models can overcome small anomalies in the initial conditions. Furthermore, the range of the effects of groups of cells with properties different from those of the surrounding cells depends on the strength of the initial global cue and the intracellular diffusion.
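To make the feedback-and-diffusion mechanism concrete, here is a deliberately simplified toy version; the functional form of the feedback and all parameters are my own assumptions for illustration, not the chapter's equations. Each cell carries a left and a right protein level, the right side of each cell represses the left side of its right-hand neighbour (and vice versa), and an intracellular diffusion term exchanges protein between the two sides of the same cell. Starting from a weakly polarised state, the chain settles into a uniformly polarised pattern:

```python
import numpy as np

def simulate_pcp(n_cells=20, D=0.05, dt=0.05, steps=2000, h=4):
    """Toy feedback-and-diffusion chain (assumed form): L[i] and R[i]
    are protein levels on the left and right sides of cell i, on a ring.
    Adjacent sides of neighbouring cells repress each other; D couples
    the two sides of each cell (intracellular diffusion)."""
    f = lambda x: 2.0 / (1.0 + x**h)   # repressive negative feedback
    L = np.full(n_cells, 0.9)          # weakly polarised initial state
    R = np.full(n_cells, 1.1)
    for _ in range(steps):
        dL = f(np.roll(R, 1)) - L + D * (R - L)   # repressed by left neighbour's R
        dR = f(np.roll(L, -1)) - R + D * (L - R)  # repressed by right neighbour's L
        L, R = L + dt * dL, R + dt * dR
    return L, R
```

In this toy version the homogeneous state (L = R) is unstable when the feedback slope is steep enough relative to D, echoing the instability-driven polarisation described above; stronger diffusion would pull the two sides of each cell back together and suppress the pattern.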
Abstract:
The fruit is one of the most complex and important structures produced by flowering plants, and understanding the development and maturation process of fruits in different angiosperm species with diverse fruit structures is of immense interest. In the work presented here, molecular genetics and genomic analysis are used to explore the processes that form the fruit in two species: the model organism Arabidopsis and the diploid strawberry Fragaria vesca. One important basic question concerns the molecular genetic basis of fruit patterning. A long-standing model of Arabidopsis fruit (the gynoecium) patterning holds that auxin produced at the apex diffuses downward, forming a gradient that provides apical-basal positional information to specify different tissue types along the gynoecium’s length. The proposed gradient, however, has never been observed, and the model appears inconsistent with a number of observations. I present a new, alternative model, wherein auxin acts to establish the adaxial-abaxial domains of the carpel primordia, which then ensures proper development of the final gynoecium. A second project utilizes genomics to identify genes that regulate fruit color by analyzing the genome sequences of Fragaria vesca, a species of wild strawberry. Shared and distinct SNPs among three F. vesca accessions were identified, providing a foundation for locating candidate mutations underlying phenotypic variations among different F. vesca accessions. Through systematic analysis of relevant SNP variants, a candidate SNP in FveMYB10 was identified that may underlie the fruit color in the yellow-fruited accessions, which was subsequently confirmed by functional assays. Our lab has previously generated extensive RNA-sequencing data that depict genome-scale gene expression profiles in F. vesca fruit and flower tissues at different developmental stages.
To enhance the accessibility of this dataset, the web-based eFP software was adapted for it, allowing visualization of gene expression in any tissue via user-initiated queries. Together, this thesis work proposes a well-supported new model of fruit patterning in Arabidopsis and provides further resources for F. vesca, including genome-wide variant lists and the ability to visualize gene expression. This work will facilitate future efforts to link traits of economic importance to specific genes and to gain novel insights into fruit patterning and development.
Abstract:
Business journalism comes under persistent criticism for serving its historic readership of brokers and business people while lacking sufficient autonomy and failing to sufficiently question or challenge powerful corporate and economic interests. This is a dominant theme in media criticism of the savings and loan crisis and 2008 financial crisis. Against this backdrop, this dissertation asks: Is this critique valid, and if so, how can business journalism improve? To engage these questions, this dissertation examines the question of autonomy in business journalism in an unlikely place: the trade press. The central case study is coverage of the savings and loan crisis by the National Thrift News, a small financial services newspaper that won a George Polk award for its reporting in 1988. How could a small trade newspaper succeed in some instances when larger news organizations failed to connect the dots? The National Thrift News created a newsroom environment that celebrated reporter autonomy and independence. In some cases, it used its insider knowledge and consistent beat reporting to serve both its core readers and the broader society by uncovering savings and loan corruption. This study will highlight a long-running debate among theorists of journalistic professionalism by arguing that the commercial and advertising model in journalism does not inevitably compromise journalistic independence but rather can help pave a way forward for a more independent press. It therefore challenges the political economy critique of journalism, which holds that external forces such as capitalism harm press independence. This case suggests journalistic independence and individual agency remain powerful forces in newsrooms. Lastly, the dissertation argues that in an era of media downsizing, the trade press can perform an even more useful watchdog role over industry if the mainstream news media acknowledges and pursues some of the innovative trade reporting.