882 results for Relative complexity
Abstract:
The relative abundances of DNA of Mycosphaerella graminicola and Phaeosphaeria nodorum in archived wheat samples are closely correlated with UK anthropogenic emissions of oxidized sulphur over the last 160 years. To test whether this could be a causal relationship, possible modes of action of sulphur on the two fungi were examined. Mycelial growth of the two fungi in solutions of sulphurous acid was similar. Sulphurous acid at pH 4 reduced the percentage germination of P. nodorum conidia more strongly than that of M. graminicola conidia. In spray inoculations of wheat cvs Squarehead's Master, Cappelle Desprez and Riband with water or sulphurous acid (pH 4), the ratio of leaves infected by P. nodorum to leaves infected by M. graminicola was increased by factors of 2.5, 2.1 and 0.6, respectively. The same three cultivars were grown in sand and vermiculite and fertilized with nutrient solution containing 2.5 or 0.5 mM sulphate. Both pathogens infected less frequently at 2.5 mM sulphate, by a factor of about 2. The severity of infection by M. graminicola was reduced on all three cultivars by a factor of about 4–5 at 2.5 mM sulphate, but the severity of P. nodorum was reduced only by a factor of about 2. Both elevated free sulphate concentrations in soil and sulphite in rainwater could therefore increase the prevalence of P. nodorum relative to M. graminicola, which is consistent with the historical changes in abundance.
Abstract:
Healthcare information systems have the potential to enhance productivity, lower costs, and reduce medication errors by automating business processes. However, issues such as system complexity, system capabilities in relation to user requirements, and rapid changes in business needs all affect the use of these systems. In many cases, failure of a system to meet business process needs has pushed users to develop alternative work processes (workarounds) to fill this gap. Some research has been undertaken on why users are motivated to create and perform workarounds, but very little research has assessed the consequences for patient safety. Moreover, the impact of performing these workarounds on the organisation, and how to quantify their risks and benefits, is not well analysed. Generally, there is a lack of rigorous understanding, and of qualitative and quantitative studies, of healthcare IS workarounds and their outcomes. This project applies a Normative Approach for Modelling Workarounds to develop a Model of Motivation, Constraints, and Consequences. It aims to understand the phenomenon in depth and to provide guidelines to organisations on how to deal with workarounds. Finally, the method is demonstrated on a case study example and its relative merits are discussed.
Abstract:
A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn, with the use of inhibitory and excitatory neuronal populations (Liley et al. 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007) and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and character of this complex behaviour, and its relevance for EEG activity, will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.
Abstract:
This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.
Abstract:
One central question in the formal linguistic study of adult multilingual morphosyntax (i.e., L3/Ln acquisition) involves determining the role(s) the L1 and/or the L2 play(s) at the L3 initial state (e.g., Bardel & Falk, Second Language Research 23: 459–484, 2007; Falk & Bardel, Second Language Research: forthcoming; Flynn et al., The International Journal of Multilingualism 8: 3–16, 2004; Rothman, Second Language Research: forthcoming; Rothman & Cabrelli, On the initial state of L3 (Ln) acquisition: Selective or absolute transfer?: 2007; Rothman & Cabrelli Amaro, Second Language Research 26: 219–289, 2010). The present article adds to this general program, testing Rothman's (Second Language Research: forthcoming) model of L3 initial state transfer, which, when relevant in light of specific language pairings, maintains that typological proximity between the languages is the decisive variable determining the selection of syntactic transfer. Herein, I present empirical evidence from the later part of the beginning stages of L3 Brazilian Portuguese (BP) by native speakers of English and Spanish, who have attained an advanced level of proficiency in either English or Spanish as an L2. Examining the related domains of syntactic word order and relative clause attachment preference in L3 BP, the data clearly indicate that Spanish is transferred for both experimental groups irrespective of whether it was the L1 or L2. These results are expected by Rothman's (Second Language Research: forthcoming) model, but not necessarily predicted by other current hypotheses of multilingual syntactic transfer; the implications of this are discussed.
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated resulting in too much land carbon loss or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one thousand year long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks. 
Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
Abstract:
The relative contributions of five variables of virtual reality systems (stereoscopy, screen size, field of view, level of realism and level of detail) to spatial comprehension and presence are evaluated here. Using a variable-centered approach rather than an object-centric view as its theoretical basis, the contributions of these five variables and their two-way interactions are estimated through a 2^(5-1) fractional factorial experiment (screening design) of resolution V with 84 subjects. The experiment design, procedure, measures used, creation of scales and indices, results of the statistical analysis, their meaning and an agenda for future research are elaborated.
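The abstract does not reproduce the design matrix, but the construction of a 2^(5-1) resolution V screening design is standard: four factors form a full 2^4 factorial and the fifth is aliased with their four-way interaction (defining relation I = ABCDE, whose shortest word has length 5). A minimal sketch, with the function name invented for illustration:

```python
from itertools import product

def fractional_factorial_2_5_1():
    """Generate a 2^(5-1) resolution V screening design.

    Factors A-D run over a full 2^4 factorial in coded units (-1, +1);
    the fifth factor E is set to the four-way interaction, E = ABCD.
    The defining relation is therefore I = ABCDE, so main effects are
    aliased only with four-way interactions and two-way interactions
    only with three-way ones (resolution V).
    """
    runs = []
    for a, b, c, d in product([-1, 1], repeat=4):
        e = a * b * c * d
        runs.append((a, b, c, d, e))
    return runs

design = fractional_factorial_2_5_1()
print(len(design))  # 16 runs, half the 32 of a full 2^5 factorial
```

The half fraction is what makes a five-variable screening study with 84 subjects tractable: each subject sees one of 16 conditions rather than one of 32, while all main effects and two-way interactions remain estimable.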
Abstract:
Prism is a modular classification rule generation method based on the 'separate and conquer' approach, an alternative to rule induction using decision trees, also known as 'divide and conquer'. Prism often achieves a level of classification accuracy similar to decision trees, but tends to produce a more compact, noise-tolerant set of classification rules. As with other classification rule generation methods, a principal problem arising with Prism is overfitting due to over-specialised rules. In addition, over-specialised rules increase the associated computational complexity. These problems can be addressed by pruning methods. For the Prism method, two pruning algorithms have recently been introduced for reducing overfitting of classification rules: J-pruning and Jmax-pruning. Both algorithms are based on the J-measure, an information-theoretic means of quantifying the theoretical information content of a rule. Jmax-pruning attempts to exploit the J-measure to its full potential, which J-pruning does not actually achieve and which may even lead to underfitting. A series of experiments has shown that Jmax-pruning may outperform J-pruning in reducing overfitting. However, Jmax-pruning is computationally relatively expensive and may also lead to underfitting. This paper reviews the Prism method and the two existing pruning algorithms, and proposes a novel pruning algorithm called Jmid-pruning. The latter is based on the J-measure and reduces overfitting to a similar level as the other two algorithms, but is better at avoiding underfitting and unnecessary computational effort. The authors conduct an experimental study of the performance of the Jmid-pruning algorithm in terms of classification accuracy and computational efficiency, and evaluate it comparatively against the J-pruning and Jmax-pruning algorithms.
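The J-measure underlying all three pruning variants is the standard rule-information measure of Smyth and Goodman; the pruning algorithms themselves are not reproduced in the abstract, but the measure can be sketched as follows (the function name and the worked numbers below are illustrative, not taken from the paper):

```python
import math

def j_measure(p_y, p_x, p_x_given_y):
    """J-measure of a rule IF y THEN x (Smyth & Goodman).

    p_y         : probability that the rule's antecedent fires
    p_x         : prior probability of the consequent class
    p_x_given_y : probability of the class given the antecedent

    J = p(y) * [ p(x|y) log2(p(x|y)/p(x))
                 + (1 - p(x|y)) log2((1 - p(x|y))/(1 - p(x))) ]

    i.e. rule coverage times the cross-entropy between the class
    distribution given the antecedent and the prior distribution.
    """
    def term(p, q):
        # Contribution p * log2(p/q), with the 0*log(0) = 0 convention
        return 0.0 if p == 0.0 else p * math.log2(p / q)
    j_content = term(p_x_given_y, p_x) + term(1.0 - p_x_given_y, 1.0 - p_x)
    return p_y * j_content
```

For example, a rule firing on 20% of instances with 90% accuracy against a 50% prior scores about 0.106 bits, while a rule whose consequent is no more likely given the antecedent than a priori scores exactly 0; pruning trades this information content against rule complexity.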
Abstract:
An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) that the parameter values identified should be adopted as default values in WRF.
Abstract:
This research report was commissioned by the DETR and examines valuation issues relating to leasehold enfranchisement and lease extension: the right for flat owners to collectively purchase the freehold or buy a longer lease. The two factors examined in detail are the yield to be applied when capitalising the ground rent, and the value of leases with a relatively short period left to run as against the value of the freehold or a new long lease, which determines the level of 'marriage value'. The research report will be of interest to all those involved in the valuation of residential leasehold property and those with an interest in legislative proposals for leasehold reform.
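As an illustration of the first factor, the yield applied when capitalising a ground rent, the standard years' purchase calculation can be sketched as follows; the rent, yield and term used here are invented for illustration and are not figures from the report:

```python
def capitalised_ground_rent(rent, yield_rate, years):
    """Capital value of a fixed ground rent receivable for a given
    number of years: the rent multiplied by the years' purchase
    (present value of an annuity of 1 per year) at the chosen yield."""
    years_purchase = (1 - (1 + yield_rate) ** -years) / yield_rate
    return rent * years_purchase

# An assumed example: a GBP 100 p.a. ground rent with 60 years to run,
# capitalised at a 7% yield
value = capitalised_ground_rent(100, 0.07, 60)
print(round(value, 2))
```

The choice of yield dominates the result: the same rent capitalised at a lower yield produces a materially higher capital value, which is why the yield evidence is one of the two contested inputs the report investigates.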
Abstract:
Dairy intake, despite its high saturated fatty acid (SFA) content, is associated with a lower risk of cardiovascular disease and diabetes. This in vitro study determined the effect of individual fatty acids (FA) found in dairy, and FA mixtures representative of a high SFA and a low SFA dairy lipid on markers of endothelial function in healthy and type II diabetic aortic endothelial cells.
Abstract:
Snow provides large seasonal storage of freshwater, and information about the distribution of snow mass as Snow Water Equivalent (SWE) is important for hydrological planning and detecting climate change impacts. Large regional disagreements remain between estimates from reanalyses, remote sensing and modelling. Assimilating passive microwave information improves SWE estimates in many regions, but the assimilation must account for how microwave scattering depends on snow stratigraphy. Physical snow models can estimate snow stratigraphy, but users must consider the computational expense of model complexity versus acceptable errors. Using data from the National Aeronautics and Space Administration Cold Land Processes Experiment (NASA CLPX) and the Helsinki University of Technology (HUT) microwave emission model of layered snowpacks, it is shown that simulations of the brightness temperature difference between 19 GHz and 37 GHz vertically polarised microwaves are consistent with Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) and Special Sensor Microwave Imager (SSM/I) retrievals once known stratigraphic information is used. Simulated brightness temperature differences for an individual snow profile depend on the provided stratigraphic detail. Relative to a profile defined at the 10 cm resolution of density and temperature measurements, the error introduced by simplification to a single layer of average properties increases approximately linearly with snow mass. If this brightness temperature error is converted into SWE using a traditional retrieval method then it is equivalent to ±13 mm SWE (7% of total) at a depth of 100 cm. This error is reduced to ±5.6 mm SWE (3% of total) for a two-layer model.
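The "traditional retrieval method" is not specified in the abstract beyond being a conversion of the 19-37 GHz channel difference into SWE; a Chang-style linear retrieval, with an assumed textbook coefficient of 4.8 mm per K rather than a value taken from this study, can be sketched as:

```python
def swe_retrieval(tb19v, tb37v, coeff_mm_per_k=4.8):
    """Linear Chang-style SWE retrieval (illustrative only).

    SWE [mm] = coeff * (TB19V - TB37V). Deeper snow scatters more
    37 GHz radiation than 19 GHz, widening the channel difference.
    The 4.8 mm/K coefficient is an assumed textbook value, not one
    reported in the abstract.
    """
    return coeff_mm_per_k * (tb19v - tb37v)

# A 5 K channel difference maps to 24 mm SWE under this coefficient
print(swe_retrieval(245.0, 240.0))
```

Under such a linear retrieval, the reported ±13 mm SWE error at 100 cm depth would correspond to a brightness-temperature error of roughly ±2.7 K (13 / 4.8), which conveys the scale of the single-layer simplification penalty.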
Abstract:
The central sector of the last British–Irish Ice Sheet (BIIS) was characterised by considerable complexity, both in terms of its glacial stratigraphy and geomorphological signature. This complexity is reflected by the large number and long history of papers that have attempted to decipher the glaciodynamic history of the region. Despite significant advances in our understanding, reconstructions remain hotly debated and relatively local, thereby hindering attempts to piece together BIIS dynamics. This paper seeks to address these issues by reviewing geomorphological mapping evidence of palimpsest flow signatures and providing an up-to-date stratigraphy of the region. Reconciling geomorphological and sedimentological evidence with relative and absolute dating constraints has allowed us to develop a new six-stage glacial model of ice-flow history and behaviour in the central sector of the last BIIS, with three major phases of glacial advance. This includes: I. Eastwards ice flow through prominent topographic corridors of the north Pennines; II. Cessation of the Stainmore ice flow pathway and northwards migration of the North Irish Sea Basin ice divide; III. Stagnation and retreat of the Tyne Gap Ice Stream; IV. Blackhall Wood–Gosforth Oscillation; V. Deglaciation of the Solway Lowlands; and VI. Scottish Re-advance and subsequent final retreat of ice out of the central sector of the last BIIS. The ice sheet was characterised by considerable dynamism, with flow switches, initiation (and termination) of ice streams, draw-down of ice into marine ice streams, repeated ice-marginal fluctuations and the production of large volumes of meltwater, locally impounded to form ice-dammed glacial lakes. Significantly, we tie this reconstruction to work carried out and models developed for the entire ice sheet. This therefore situates research in the central sector within contemporary understanding of how the last BIIS evolved over time.