833 results for level of detail (LOD)
Abstract:
The Phosphorus Indicators Tool provides a catchment-scale estimation of diffuse phosphorus (P) loss from agricultural land to surface waters using the most appropriate indicators of P loss. The Tool provides a framework that may be applied across the UK to estimate P loss, which is sensitive not only to land use and management but also to environmental factors such as climate, soil type and topography. The model complexity incorporated in the P Indicators Tool has been adapted to the level of detail in the available data and the need to reflect the impact of changes in agriculture. Currently, the Tool runs on an annual timestep and at a 1 km² grid scale. We demonstrate that the P Indicators Tool works in principle and that its modular structure provides a means of accounting for P loss from one layer to the next, and ultimately to receiving waters. Trial runs of the Tool suggest that modelled P delivery to water approximates measured water quality records. The transparency of the structure of the P Indicators Tool means that identification of poorly performing coefficients is possible, and further refinements of the Tool can be made to ensure it is better calibrated and subsequently validated against empirical data, as it becomes available.
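To make the layer-by-layer accounting idea concrete, here is a minimal sketch of a coefficient-based transfer chain for a single 1 km² grid cell on an annual timestep. The layer names, coefficient values and source load are invented for illustration; they are not taken from the P Indicators Tool.

```python
# Minimal sketch of layered, coefficient-based P accounting, loosely in the
# spirit of the modular structure described above. All names and values here
# are hypothetical, not the Tool's actual layers or coefficients.

def p_delivered(source_kg_per_km2: float, coefficients: list[float]) -> float:
    """Pass an annual P load through successive layers; each coefficient is
    the fraction of P transferred from one layer to the next."""
    load = source_kg_per_km2
    for c in coefficients:
        load *= c
    return load

# Example: soil-surface source -> mobilization -> field edge -> stream,
# for one 1 km^2 grid cell over one year.
annual_source = 25.0          # kg P per km^2 per year (illustrative)
layers = [0.40, 0.60, 0.30]   # hypothetical transfer fractions per layer
print(p_delivered(annual_source, layers))  # 1.8 kg delivered to water
```

A structure like this is what makes poorly performing coefficients easy to isolate: each layer's output can be inspected and recalibrated independently.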
Abstract:
This paper deconstructs the relationship between the Environmental Sustainability Index (ESI) and national income. The ESI attempts to provide a single figure which encapsulates 'environmental sustainability' for each country included in the analysis, and this, allied with a 'league table' format so as to name and shame bad performers, has resulted in widespread reporting within the popular presses of a number of countries. In essence, the higher the value of the ESI, the more 'environmentally sustainable' a country is deemed to be. A logical progression beyond the use of the ESI to publicise environmental sustainability is its use within a more analytical context. Thus an index designed to simplify in order to have an impact on policy is used to try to understand causes of good and bad performance in environmental sustainability. For example, the creators of the ESI claim that the ESI is related to GDP/capita (adjusted for Purchasing Power Parity) such that the ESI increases linearly with wealth. While this may in a sense be a comforting picture, do the variables within the ESI allow for alternatives to this story, and if they do, what are the repercussions for those producing such indices for broad consumption amongst policy makers, managers, the press, etc.? The latter point is especially important given the appetite for such indices amongst non-specialists; for all their weaknesses, the ESI and other such aggregated indices will not go away.
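As a toy illustration of why a single aggregated figure can be analytically treacherous (this is not the actual ESI methodology, and all numbers are invented): an index averaged from components can rise with income even while one component deteriorates.

```python
# Invented data: two index components across five income levels. The
# aggregate creeps upward even though the emissions component worsens,
# so the "index rises with wealth" story hides a divergent sub-story.

incomes = [1, 2, 4, 8, 16]              # GDP/capita, arbitrary units
institutions = [20, 35, 50, 65, 80]     # component improving with wealth
emissions_score = [80, 70, 55, 45, 30]  # component worsening with wealth

index = [(a + b) / 2 for a, b in zip(institutions, emissions_score)]
print(index)  # [50.0, 52.5, 52.5, 55.0, 55.0] -- gently rising overall
```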
Abstract:
Nucleolin is a multi-functional protein that is localized to the nucleolus. In tissue culture cells, the stability of nucleolin is related to the proliferation status of the cell. During development, rat cardiomyocytes proliferate actively, with increases in the mass of the heart being due to both hyperplasia and hypertrophy. The shift in the phenotype of the myocyte, from one capable of undergoing hyperplasia to one that can grow only by hypertrophy, occurs within 4 days of post-natal development. Thus, cardiomyocytes are an ideal model system in which to study the regulation of nucleolin during growth in vivo. Using Western blotting and quantitative RT-PCR (TaqMan), we found that the amount of nucleolin is regulated at the levels of both transcription and translation during the development of the cardiomyocyte. However, in cells which had exited the cell cycle and were subsequently given a hypertrophic stimulus, nucleolin was regulated post-transcriptionally.
Abstract:
Potato varieties with contrasting levels of resistance were planted in pure or mixed stands in four experiments over 3 years. Three experiments compared late blight severity and progress in mixtures with those in pure stands. Disease on susceptible or moderately resistant varieties typical of those in commercial use was similar in mixtures and pure stands. In 2 of the 3 years, there were slight reductions on cv. Sante, which is moderately susceptible, in mixture with cv. Cara, which is moderately resistant; Cara was unaffected by this mixture. Mixtures of an immune or near-immune partner with Cara or Sante substantially reduced disease on the latter. The effect of the size of plots of individual varieties or mixtures on blight severity was compared in two experiments. Larger plots had a greater area under the disease progress curve, but the average rate of disease progress was greater in smaller plots; this may be because most disease progress in the smaller plots took place later, under more favourable conditions. In one experiment, two planting densities were used. Density had no effect on disease and did not interact with mixture effects. The overall conclusion is that, while mixtures of potato varieties may be desirable for other reasons, they do not offer any improvement on the average of the disease resistance of their components.
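The plot-size comparison above rests on the area under the disease progress curve (AUDPC), a standard epidemiological summary computed with the trapezoidal rule. A minimal sketch, with invented assessment dates and severities:

```python
# AUDPC: integrate disease severity over assessment dates (trapezoidal rule).
# The dates and severities below are illustrative, not the experiments' data.

def audpc(days: list[float], severity: list[float]) -> float:
    """Area under the disease progress curve via the trapezoidal rule."""
    return sum(
        (severity[i] + severity[i + 1]) / 2 * (days[i + 1] - days[i])
        for i in range(len(days) - 1)
    )

days = [0, 7, 14, 21, 28]       # days after first assessment
severity = [0, 5, 20, 55, 90]   # % leaf area affected (illustrative)
print(audpc(days, severity))    # 875.0 %-days
```

Two epidemics can share the same average rate yet differ in AUDPC if disease starts earlier in one, which is why the abstract reports both measures.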
Abstract:
The role of the anterior cingulate cortex (ACC) in attention is a matter of debate. One hypothesis suggests that its role is to monitor response-level conflict, but explicit evidence is somewhat lacking. In this study, activation of the ACC was compared in (a) color and number standard Stroop tasks, in which response preparation and interference shared a modality (response-level conflict), and (b) color and number matching Stroop tasks, in which response preparation and interference did not share a modality (non-response-level conflict). In the congruent conditions, there was no effect of task type. In the interference conditions, anterior cingulate activity in the matching tasks was less than that in the standard tasks. These results support the hypothesis that the ACC specifically mediates selection processes invoked by response-level competition, rather than generalized, modality-independent selection processes.
Abstract:
Investigations of memory deficits in older individuals have concentrated on their increased likelihood of forgetting events or details of events that were actually encountered (errors of omission). However, mounting evidence demonstrates that normal cognitive aging is also associated with an increased propensity for errors of commission, shown in false alarms or false recognition. The present study examined the origins of this age difference. Older and younger adults each performed three types of memory tasks in which details of encountered items might influence performance. Although older adults showed greater false recognition of related lures on a standard (identical) old/new episodic recognition task, older and younger adults showed parallel effects of detail on repetition priming and meaning-based episodic recognition (decreased priming and decreased meaning-based recognition for different relative to same exemplars). The results suggest that the older adults encoded details but used them less effectively than the younger adults in the recognition context requiring their deliberate, controlled use.
Abstract:
A speech message played several metres from the listener in a room is usually heard to have much the same phonetic content as it does when played nearby, even though the different amounts of reflected sound make the temporal envelopes of these signals very different. To study this ‘constancy’ effect, listeners heard speech messages and speech-like sounds comprising 8 auditory-filter shaped noise-bands that had temporal envelopes corresponding to those in these filters when the speech message is played. The ‘contexts’ were “next you’ll get _to click on”, into which a “sir” or “stir” test word was inserted. These test words were from an 11-step continuum, formed by amplitude modulation. Listeners identified the test words appropriately, even in the 8-band conditions where the speech had a ‘robotic’ quality. Constancy was assessed by comparing the influence of room reflections on the test word across conditions where the context had either the same level of room reflections (i.e. from the same, far distance), or where it had a much lower level (i.e. from nearby). Constancy effects were obtained with both the natural- and the 8-band speech. Results are considered in terms of the degree of ‘matching’ between the context’s and test-word’s bands.
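The 8-band stimuli described above are in the spirit of noise-vocoded speech: the temporal envelope in each auditory filter band is imposed on a matching band of noise. The sketch below shows that general technique only; the band edges, filter order and envelope smoothing are illustrative assumptions, not the parameters used in the study.

```python
# General noise-vocoding sketch: extract each band's temporal envelope and
# impose it on a matching noise band. Filter choices here are illustrative.

import numpy as np
from scipy.signal import butter, filtfilt

def vocode(speech: np.ndarray, fs: int, band_edges: list[float]) -> np.ndarray:
    out = np.zeros_like(speech)
    noise = np.random.randn(len(speech))
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        band = filtfilt(b, a, speech)                    # analysis band
        env_b, env_a = butter(2, 30 / (fs / 2))          # ~30 Hz envelope cutoff
        envelope = filtfilt(env_b, env_a, np.abs(band))  # rectify + smooth
        carrier = filtfilt(b, a, noise)                  # matching noise band
        out += np.clip(envelope, 0, None) * carrier
    return out
```

Because only the envelopes survive this processing, such stimuli isolate the temporal-envelope cues that room reflections distort, which is what makes them useful for probing the constancy effect.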
Abstract:
The present paper summarizes the consensus views of a group of 9 European clinicians and scientists on the current state of scientific knowledge on probiotics, covering those areas where there is substantial evidence for beneficial effects and those where the evidence base is poor or inconsistent. There was general agreement that probiotic effects were species and often strain specific. The experts agreed that some probiotics were effective in reducing the incidence and duration of rotavirus diarrhoea in infants, antibiotic-associated diarrhoea in adults and, for certain probiotics, Clostridium difficile infections. Some probiotics are associated with symptomatic improvements in irritable bowel syndrome and alleviation of digestive discomfort. Probiotics can reduce the frequency and severity of necrotizing enterocolitis in premature infants and have been shown to regulate intestinal immunity. Several other clinical effects of probiotics, including their role in inflammatory bowel disease, atopic dermatitis, respiratory or genito-urinary infections, or H. pylori adjuvant treatment, were thought promising but the evidence was inconsistent.
Abstract:
Classical measures of network connectivity are the number of disjoint paths between a pair of nodes and the size of a minimum cut. For standard graphs, these measures can be computed efficiently using network flow techniques. However, in the Internet at the level of autonomous systems (ASs), referred to as the AS-level Internet, routing policies impose restrictions on the paths that traffic can take in the network. These restrictions can be captured by the valley-free path model, which assumes a special directed graph model in which edge types represent relationships between ASs. We consider the adaptation of the classical connectivity measures to the valley-free path model, where they are NP-hard to compute. Our first main contribution consists of presenting algorithms for the computation of disjoint paths, and of minimum cuts, in the valley-free path model. These algorithms are useful for ASs that want to evaluate different options for selecting upstream providers to improve the robustness of their connection to the Internet. Our second main contribution is an experimental evaluation of our algorithms on four types of directed graph models of the AS-level Internet produced by different inference algorithms. Most importantly, the evaluation shows that our algorithms are able to compute optimal solutions to realistically sized instances of the connectivity problems in the valley-free path model in reasonable time. Furthermore, our experimental results provide information about the characteristics of the directed graph models of the AS-level Internet produced by different inference algorithms. It turns out that (i) we can quantify the difference between the undirected AS-level topology and the directed graph models with respect to fundamental connectivity measures, and (ii) the different inference algorithms yield topologies that are similar with respect to connectivity but different with respect to the types of paths that exist between pairs of ASs.
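For standard graphs, as the abstract notes, the number of edge-disjoint paths between two nodes equals the maximum flow under unit edge capacities (Menger's theorem), which network flow techniques compute efficiently. Below is a minimal Edmonds-Karp sketch of that classical case; it does not implement the valley-free restrictions, which are the paper's contribution.

```python
# Edge-disjoint s-t paths via max flow with unit capacities (Edmonds-Karp).
# Classical, unrestricted case only -- no valley-free routing constraints.

from collections import deque

def edge_disjoint_paths(n: int, edges: list[tuple[int, int]], s: int, t: int) -> int:
    cap = [[0] * n for _ in range(n)]
    for u, v in edges:
        cap[u][v] = 1                    # unit capacity per directed edge
    flow = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:     # BFS for a shortest augmenting path
            u = q.popleft()
            for v in range(n):
                if cap[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow                  # no augmenting path remains
        v = t
        while v != s:                    # push one unit along the path
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1

# Two edge-disjoint paths from 0 to 3 (0-1-3 and 0-2-3):
print(edge_disjoint_paths(4, [(0, 1), (1, 3), (0, 2), (2, 3), (1, 2)], 0, 3))  # 2
```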
Abstract:
The transition to a low-carbon economy urgently demands better information on the drivers of energy consumption. UK government policy has prioritized energy efficiency in the built stock as a means of carbon reduction, but the sector is historically information poor, particularly the non-domestic building stock. This paper presents the results of a pilot study that investigated whether and how property and energy consumption data might be combined for non-domestic energy analysis. These data were combined in a ‘Non-Domestic Energy Efficiency Database’ to describe the location and physical attributes of each property and its energy consumption. The aim was to support the generation of a range of energy-efficiency statistics for the industrial, commercial and institutional sectors of the non-domestic building stock, and to provide robust evidence for national energy-efficiency and carbon-reduction policy development and monitoring. The work has brought together non-domestic energy data, property data and mapping in a ‘data framework’ for the first time. The results show what is possible when these data are integrated, as well as the associated difficulties. A data framework offers the potential to inform energy-efficiency policy formation and to support its monitoring at a level of detail not previously possible.
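A minimal sketch of the kind of linkage such a data framework performs: joining property records to metered consumption and deriving an energy-intensity statistic per sector. All column names and figures below are hypothetical, not drawn from the Non-Domestic Energy Efficiency Database.

```python
# Hypothetical linkage of property attributes to metered energy use, then
# a per-sector energy-intensity statistic (kWh per m^2 of floor area).

import pandas as pd

properties = pd.DataFrame({
    "property_id": [101, 102, 103],
    "sector": ["industrial", "commercial", "institutional"],
    "floor_area_m2": [5000, 1200, 3000],
})
meters = pd.DataFrame({
    "property_id": [101, 102, 103],
    "annual_kwh": [750000, 96000, 330000],
})

linked = properties.merge(meters, on="property_id")   # the join is the framework's core
linked["kwh_per_m2"] = linked["annual_kwh"] / linked["floor_area_m2"]
print(linked.groupby("sector")["kwh_per_m2"].mean())  # 150, 80, 110 kWh/m^2
```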
Abstract:
Nematic monodomain liquid crystalline elastomers have been prepared through in situ cross-linking of an acrylate-based side-chain liquid crystalline polymer in a magnetic field. At the nematic–isotropic transition, the sample is found to undergo an anisotropic shape change: there is an increase in dimensions perpendicular, and a decrease parallel, to the director, which is consistent with alignment of the polymer backbone parallel to the direction of mesogen alignment in the nematic state. From a quantitative investigation of this behaviour, we estimate the level of backbone anisotropy for the elastomer. As a second measure of the backbone anisotropy, the monodomain sample was physically extended. We have investigated, in particular, the situation where a monodomain sample is deformed with the angle between the director and the extension direction approaching 90°. The behaviour on extension of these acrylate samples is related to alternative theoretical interpretations, and the backbone anisotropy is determined. Comparison of the chain anisotropy derived from these two approaches and the value obtained from previous small-angle neutron scattering measurements on deuterium-labelled mixtures of the same polymer shows that some level of chain anisotropy is retained in the isotropic, or more strictly weakly paranematic, state of the elastomer. The origin and implications of this behaviour are discussed.
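One common route from the measured shape change to a chain-anisotropy estimate, not spelled out in the abstract, is the neo-classical relation of Warner and Terentjev for an ideal monodomain nematic elastomer; whether the authors used this exact relation is an assumption here.

```latex
% Neo-classical (Warner--Terentjev) relation, quoted as a hedged sketch: the
% spontaneous elongation \lambda_m along the director on entering the nematic
% phase fixes the step-length anisotropy r of the backbone.
\[
  \lambda_m \;=\; \left( \frac{\ell_{\parallel}}{\ell_{\perp}} \right)^{1/3} = r^{1/3},
  \qquad\text{so}\qquad
  r = \lambda_m^{3}.
\]
```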
Abstract:
Individual-level constructs are seldom taken into consideration in construction management research relating to project performance. This is antithetical to the objectives of properly conceptualizing and contextualizing the research we do, because many project performance outcomes, such as the extent of cooperation and the level of communication or teamwork, are influenced and moderated by individuals’ perceptions, values and behaviour. A brief review of the literature in organizational studies centred on culture, identity, empowerment and trust is offered. These constructs are then explored in relation to project performance issues and outcomes, and it is noted that they are predominantly studied at the project and industry levels. We argue that focusing these constructs at the individual unit of analysis has significant implications for project performance, and that their effects therefore need to be systematically accounted for in explanations of the success and failure of projects. Far from being prescriptive, the aim is to generate interest in and awareness of more focused research at the individual level of analysis, in order to add new insights and perspectives to critical performance questions in construction management. To this end, a research agenda is outlined, arguing that construction management research integrating individual-level constructs with broader, macro-contextual issues will help define and enhance the legitimacy of the field.
Abstract:
Duchenne muscular dystrophy is a severe X-linked inherited muscle wasting disorder caused by mutations in the dystrophin gene. Adeno-associated virus (AAV) vectors have been extensively used to deliver genes efficiently for dystrophin expression in skeletal muscles. To overcome the limited packaging capacity of AAV vectors (<5 kb), truncated recombinant microdystrophin genes with deletions of most of the rod and carboxyl-terminal (CT) domains of dystrophin have been developed. We have previously shown the efficiency of an mRNA sequence–optimized microdystrophin (ΔR4-23/ΔCT, called MD1), with deletion of spectrin-like repeats 4 to 23 and the CT domain, in ameliorating the pathology of dystrophic mdx mice. However, the CT domain of dystrophin is thought to recruit part of the dystrophin-associated protein complex, which acts as a mediator of signalling between the extracellular matrix and the cytoskeleton in muscle fibers. In this study, we extended the ΔR4-23/ΔCT microdystrophin by incorporating helix 1 of the coiled-coil motif in the CT domain of dystrophin (MD2), which contains the α1-syntrophin and α-dystrobrevin binding sites. Intramuscular injection of AAV2/9 expressing the CT domain–extended microdystrophin showed efficient dystrophin expression in tibialis anterior muscles of mdx mice. The presence of the CT domain in MD2 increased the recruitment of α1-syntrophin and α-dystrobrevin at the sarcolemma and significantly improved muscle resistance to lengthening contraction–induced damage in the mdx mice compared with MD1. These results suggest that incorporating helix 1 of the coiled-coil motif of the CT domain of dystrophin into microdystrophins will substantially improve their efficiency in restoring muscle function in patients with Duchenne muscular dystrophy.
Abstract:
Introduction. Feature usage is a pre-requisite to realising the benefits of investments in feature rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Core features identified from the literature are then further refined with the help of experts in a consensus seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context sensitive literature review followed by refinement through a consensus seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development prior to statistical confirmation with a larger sample.
Abstract:
Forgetting immediate physical reality and having awareness of one's location in the simulated world is critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. Answers to the question "where am I?" at two levels, namely whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial co-ordinates of that locus, hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most studies treat the technology as a monolith (box-centered). Using as its theoretical basis the variable-centered approach put forth by Nass and Mason (1990), which breaks down the technology into its component variables and their corresponding values, this paper looks at the contributions of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism and level of detail) to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment, as sketched below. This study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to be complete by February 2007. Theoretical and practical implications of the study are discussed.
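A minimal sketch of how a two-level fractional factorial design collapses the condition count for five variables. The generators chosen here (D = AB, E = AC, giving a 2^(5-2) design of 8 runs instead of 32) are illustrative assumptions, not the study's actual design.

```python
# 2^(5-2) fractional factorial: three base factors vary freely; the other
# two are aliased with interactions via hypothetical generators D=AB, E=AC.

from itertools import product

runs = []
for a, b, c in product([-1, 1], repeat=3):  # stereoscopy, screen size, field of view
    runs.append({
        "stereoscopy": a,
        "screen_size": b,
        "field_of_view": c,
        "level_of_realism": a * b,   # generator D = AB (hypothetical choice)
        "level_of_detail": a * c,    # generator E = AC (hypothetical choice)
    })

for run in runs:
    print(run)   # 8 conditions instead of the full 2^5 = 32
```

The cost of the reduction is aliasing: main effects are confounded with the interactions used as generators, a trade-off accepted when higher-order interactions are assumed negligible.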