40 results for simplicity
in CentAUR: Central Archive University of Reading - UK
Abstract:
What is the relationship between magnitude judgments relying on directly available characteristics versus probabilistic cues? Question frame was manipulated in a comparative judgment task previously assumed to involve inference across a probabilistic mental model (e.g., “which city is largest” – the “larger” question – versus “which city is smallest” – the “smaller” question). Participants identified either the largest or smallest city (Experiments 1a, 2) or the richest or poorest person (Experiment 1b) in a three-alternative forced choice (3-AFC) task (Experiment 1) or 2-AFC task (Experiment 2). Response times revealed an interaction between question frame and the number of options recognized. When asked the smaller question, response times were shorter when none of the options were recognized. The opposite pattern was found when asked the larger question: response time was shorter when all options were recognized. These task-stimuli congruity results in judgment under uncertainty are consistent with, and predicted by, theories of magnitude comparison which make use of deductive inferences from declarative knowledge.
Abstract:
Grassroots innovations (GI) are promising examples of deliberate transformation of socio-technical systems towards resilience and sustainability. However, evidence is needed on the factors that limit or enable their success. This paper sets out to study how GI use narratives to empower innovation in the face of incumbent socio-technical regimes. Institutional documents were comparatively analyzed to assess how narratives influence the structure, forms of action and external interactions of two Italian grassroots networks, Bilanci di Giustizia and Transition Network Italy. The paper finds an internal consistency between narrative and strategy for each of the two networks. It also highlights core similarities, alongside significant differences, in the ethical basis of the two narratives and in their organizations and strategies. These differences produce distinct forms of innovation empowerment and expose each niche to a different potential to transform incumbent regimes, or to the risk of being co-opted by them.
Abstract:
The complexity inherent in climate data makes it necessary to introduce more than one statistical tool to the researcher to gain insight into the climate system. Empirical orthogonal function (EOF) analysis is one of the most widely used methods to analyze weather/climate modes of variability and to reduce the dimensionality of the system. Simple structure rotation of EOFs can enhance the interpretability of the obtained patterns but cannot provide anything more than temporal uncorrelatedness. In this paper, an alternative rotation method based on independent component analysis (ICA) is considered. The ICA is viewed here as a method of EOF rotation. Starting from an initial EOF solution, ICA seeks a rotation matrix that maximizes the independence between the components in the time domain, rather than rotating the loadings toward simplicity. If the underlying climate signals have independent forcing, one can expect to find loadings with interpretable patterns whose time coefficients have properties that go beyond the simple noncorrelation observed in EOFs. The methodology is presented and an application to the monthly mean sea level pressure (SLP) field is discussed. Among the rotated (to independence) EOFs, the North Atlantic Oscillation (NAO) pattern, an Arctic Oscillation-like pattern, and a Scandinavian-like pattern have been identified. There is the suggestion that the NAO is an intrinsic mode of variability independent of the Pacific.
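A minimal sketch of the idea, using scikit-learn and a random stand-in for the (time x grid) SLP anomaly matrix; the component count and all names here are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
X = rng.standard_normal((600, 500))      # stand-in for (time x grid) SLP anomalies

# Step 1: conventional EOF analysis (PCA) for dimensionality reduction.
pca = PCA(n_components=10)
pcs = pca.fit_transform(X)               # principal-component time series
eofs = pca.components_                   # EOF spatial loadings

# Step 2: rotate the retained components so that their time series are
# maximally independent, not merely uncorrelated as with raw EOFs.
ica = FastICA(n_components=10, whiten="unit-variance", random_state=0)
ics = ica.fit_transform(pcs)             # independent time coefficients

# The rotated spatial patterns follow from the ICA mixing matrix, since
# X ~ pcs @ eofs and pcs ~ ics @ ica.mixing_.T.
rotated_patterns = ica.mixing_.T @ eofs  # shape (10, n_gridpoints)
```

Because ICA only re-expresses the retained EOF subspace, the rotated patterns span the same subspace as the leading EOFs; what changes is how variance and structure are attributed among them.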
Abstract:
The Human Development Index (HDI) introduced by the United Nations Development Programme (UNDP) in 1990 has helped facilitate widespread debate amongst development researchers, practitioners and policy makers. The HDI is an aggregate index, calculated on an annual basis by the UNDP and published in its Human Development Reports, comprising measures of three components deemed by them to be central to development: (i) income (gross domestic product per capita), (ii) education (adult literacy rate) and (iii) health (life expectancy at birth). The results of calculating the HDI are typically presented as country/regional league tables, and provide a quick means for policy makers and others to judge performance. Perhaps partly because of the relative simplicity of the index, the HDI has managed to achieve a level of acceptance and use amongst politicians and policy makers that has yet to emerge with any indicator of sustainability. Indeed, despite its existence for 11 years, including nine years after the Rio Earth Summit, the HDI has not even been modified to take on board wider issues of sustainability. This paper will critically examine the potential for 'greening' the HDI so as to include environmental and resource-consumption dimensions. Copyright (C) 2003 John Wiley & Sons, Ltd and ERP Environment.
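For concreteness, a toy calculation in the style of the original three-component HDI: each indicator is rescaled to [0, 1] between fixed goalposts and the three indices are averaged. The goalposts below follow the UNDP ranges of that period as best recalled; treat all values as illustrative.

```python
import math

def dimension_index(value, lo, hi):
    """Rescale a raw indicator onto [0, 1] between fixed goalposts."""
    return (value - lo) / (hi - lo)

def hdi(gdp_pc, literacy_pct, life_exp):
    # Income uses log GDP per capita; goalposts assumed $100 to $40,000 (PPP).
    income = dimension_index(math.log(gdp_pc), math.log(100), math.log(40000))
    education = dimension_index(literacy_pct, 0.0, 100.0)   # adult literacy %
    health = dimension_index(life_exp, 25.0, 85.0)          # life expectancy
    return (income + education + health) / 3.0

# A hypothetical country: ~0.77, mid-table in the league rankings.
print(round(hdi(gdp_pc=5000, literacy_pct=90.0, life_exp=70.0), 3))
```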
Abstract:
Empirical orthogonal functions (EOFs) are widely used in climate research to identify dominant patterns of variability and to reduce the dimensionality of climate data. EOFs, however, can be difficult to interpret. Rotated empirical orthogonal functions (REOFs) have been proposed as more physical entities with simpler patterns than EOFs. This study presents a new approach for finding climate patterns with simple structures that overcomes the problems encountered with rotation. The method achieves simplicity of the patterns by using the main properties of EOFs and REOFs simultaneously. Orthogonal patterns that maximise variance subject to a constraint that induces a form of simplicity are found. The simplified empirical orthogonal function (SEOF) patterns, being more 'local', are constrained to have zero loadings outside the main centre of action. The method is applied to winter Northern Hemisphere (NH) monthly mean sea level pressure (SLP) reanalyses over the period 1948-2000. The 'simplified' leading patterns of variability are identified and compared to the leading patterns obtained from EOFs and REOFs. Copyright (C) 2005 Royal Meteorological Society.
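The paper's SEOF constraint itself is not reproduced here, but the flavour of 'simplicity via zero loadings' can be sketched with an off-the-shelf sparse PCA, which trades a little explained variance for exactly-zero loadings away from the centres of action. Shapes and the penalty value are made up:

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(1)
slp = rng.standard_normal((636, 400))   # stand-in for monthly NH SLP anomalies

# L1-penalized components: larger alpha drives more loadings exactly to zero,
# yielding 'local' patterns in the spirit of SEOFs (not the paper's algorithm).
spca = SparsePCA(n_components=4, alpha=2.0, random_state=0)
scores = spca.fit_transform(slp)        # time coefficients
patterns = spca.components_             # sparse spatial loadings

# Fraction of exactly-zero loadings per pattern: a crude simplicity measure.
print((patterns == 0).mean(axis=1))
```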
Abstract:
A simple physical model of the atmospheric effects of large explosive volcanic eruptions is developed. Using only one input parameter - the initial amount of sulphur dioxide injected into the stratosphere - the global-average stratospheric optical-depth perturbation and surface temperature response are modelled. The simplicity of this model avoids the issues of incomplete data that affect more comprehensive models, making it a powerful and useful tool for atmospheric diagnostics of this climate forcing mechanism. It may also provide a computationally inexpensive and accurate way of introducing volcanic activity into larger climate models. The modelled surface temperature response to an initial sulphur-dioxide injection, coupled with emission-history statistics, is used to demonstrate that the most climatically significant volcanic eruptions are those of sufficient explosivity to just reach into the stratosphere (and achieve longevity). This study also highlights the fact that this measure of significance is highly sensitive to the representation of the climatic response and to the frequency data used, and that we are far from producing a definitive history of explosive volcanism for at least the past 1000 years. Given this high degree of uncertainty, these results suggest that eruptions that release around 0.1 Mt of SO2 or more into the stratosphere have the maximum climatic impact.
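As a purely illustrative sketch of such a single-input model (all coefficients are order-of-magnitude guesses, not the paper's calibrated values), one can chain an exponentially decaying optical-depth perturbation to a relaxing surface temperature response:

```python
import numpy as np

def volcanic_response(so2_mt, years=10.0, dt=0.01):
    """Toy single-input model: SO2 injection -> optical depth -> cooling."""
    t = np.arange(0.0, years, dt)
    tau_decay = 1.0                      # aerosol e-folding time (yr), assumed
    k_tau = 0.01                         # optical depth per Mt SO2, assumed
    lam = 0.8                            # temperature relaxation rate (1/yr)
    sens = -4.0                          # K per unit optical depth, assumed
    optical_depth = k_tau * so2_mt * np.exp(-t / tau_decay)
    temp = np.zeros_like(t)
    for i in range(1, t.size):           # forward-Euler relaxation to forcing
        forcing = sens * optical_depth[i - 1]
        temp[i] = temp[i - 1] + dt * lam * (forcing - temp[i - 1])
    return t, optical_depth, temp

t, tau, dT = volcanic_response(so2_mt=20.0)   # a Pinatubo-scale injection
print(f"peak cooling ~ {dT.min():.2f} K")
```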
Abstract:
Intercontinental Transport of Ozone and Precursors (ITOP) (part of the International Consortium for Atmospheric Research on Transport and Transformation (ICARTT)) was an intense research effort to measure the long-range transport of pollution across the North Atlantic and its impact on O3 production. During the aircraft campaign, plumes were encountered containing large concentrations of CO, plus other tracers and aerosols, from forest fires in Alaska and Canada. A chemical transport model, p-TOMCAT, and new biomass burning emissions inventories are used to study the long-range transport of the emissions and their impact on the tropospheric O3 budget. The fire plume structure is modeled well over long distances until it encounters convection over Europe. The CO values within the simulated plumes closely match aircraft measurements near North America and over the Atlantic, and agree well with MOPITT CO data. O3 and NOx values were initially too high in the model plumes. However, by including additional vertical mixing of O3 above the fires, and by using a lower NO2/CO emission ratio (0.008) for boreal fires, O3 concentrations are brought closer to the aircraft measurements, with NO2 closer to SCIAMACHY data. Too little PAN is produced within the simulated plumes, and the simplicity of our VOC scheme may be a further reason for the O3 and NOx model-data discrepancies. In the p-TOMCAT simulations the fire emissions lead to increased tropospheric O3 over North America, the North Atlantic and western Europe through photochemical production and transport. The increased O3 over the Northern Hemisphere in the simulations peaks in July 2004, in the range 2.0 to 6.2 Tg over a baseline of about 150 Tg.
Abstract:
The Self-Organizing Map (SOM) algorithm has been used extensively for analysis and classification problems. For such problems, datasets are becoming increasingly large, and it is necessary to speed up SOM learning. In this paper we present an application of the Simulated Annealing (SA) procedure to the SOM learning algorithm. The goal is to obtain fast learning and better performance in terms of matching of input data and regularity of the obtained map. An advantage of the proposed technique is that it preserves the simplicity of the basic algorithm. Several tests, carried out on different large datasets, demonstrate the effectiveness of the proposed algorithm in comparison with the original SOM and with some of its modifications introduced to speed up learning.
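A hedged sketch of the general idea of coupling an annealing-style temperature schedule to SOM training follows; the paper's exact acceptance rule and schedule are not reproduced, and all parameters are illustrative:

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, t0=1.0, cooling=0.9):
    """SOM training where a cooling temperature drives rate and radius."""
    rng = np.random.default_rng(0)
    h, w = grid
    weights = rng.standard_normal((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), -1)
    temp = t0
    for _ in range(epochs):
        lr = 0.5 * temp                          # temperature-scaled rate
        sigma = max(1.0, (h / 2) * temp)         # shrinking neighbourhood
        for x in rng.permutation(data):
            # Best-matching unit: grid cell whose weight is nearest to x.
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(d.argmin(), d.shape)
            # Gaussian neighbourhood update centred on the BMU.
            g = np.exp(-((coords - bmu) ** 2).sum(-1) / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
        temp *= cooling                          # anneal the schedule
    return weights

som = train_som(np.random.default_rng(1).standard_normal((200, 3)))
```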
Abstract:
This review article addresses recent advances in the analysis of foods and food components by capillary electrophoresis (CE). CE has found application to a number of important areas of food analysis, including quantitative chemical analysis of food additives, biochemical analysis of protein composition, and others. The speed, resolution and simplicity of CE, combined with low operating costs, make the technique an attractive option for the development of improved methods of food analysis for the new millennium.
Abstract:
The shamba system involves farmers tending tree saplings on state-owned forest land in return for being permitted to intercrop perennial food crops until canopy closure. At one time the system was used throughout all state-owned forest lands in Kenya, accounting for a large proportion of some 160,000 ha. The system should theoretically be mutually beneficial to both local people and the government. However, the system has had a chequered past in Kenya due to widespread malpractice and associated environmental degradation. It was last banned in 2003, but in early 2008 field trials were initiated for its reintroduction. This study aimed to: assess the benefits and limitations of the shamba system in Kenya; assess the main influences on the extent to which the limitations and benefits are realised; and consider the management and policy requirements for the system's successful and sustainable operation. Information was obtained from 133 questionnaires, using mainly open-ended questions, and six participatory workshops carried out in forest-adjacent communities on the western slopes of Mount Kenya in Nyeri district. In addition, interviews were conducted with key informants from communities and organisations. There was a strong desire amongst local people for the system's reintroduction, given that it had provided significant food, income and employment. Local perceptions of the failings of the system included, firstly, mismanagement by government or forest authorities and, secondly, abuse of the system by shamba farmers and outsiders. Improvements local people considered necessary for the shamba system to work included more accountability and transparency in administration and better rules with respect to plot allocation and stewardship. Ninety-seven percent of respondents said they would like to be more involved in management of the forest, and 80% that they were willing to pay for the use of a plot. The study concludes that the structural framework laid down by the 2005 Forests Act, which includes provision for the reimplementation of the shamba system under the new plantation establishment and livelihood improvement scheme (PELIS) [it should be noted that whilst the shamba system was re-branded in 2008 under the acronym PELIS, for the sake of simplicity the authors continue to refer to the 'shamba system' and 'shamba farmers' throughout this paper], is weakened because insufficient power is likely to be devolved to local people, casting them merely as 'forest users' and the shamba system as a 'forest user right'. In so doing, the system's potential to both facilitate and embody the participation of local people in forest management is limited, and the long-term sustainability of the new system is questionable. Suggested instruments to address this include some degree of sharing of profits from forest timber, performance-related guarantees for farmers to gain a new plot, and the use of joint committees consisting of local people and the forest authorities for the long-term management of forests.
Abstract:
The systematic sampling (SYS) design (Madow and Madow, 1944) is widely used by statistical offices due to its simplicity and efficiency (e.g., Iachan, 1982). But it suffers from a serious defect: it is impossible to estimate the sampling variance unbiasedly (Iachan, 1982), and the usual variance estimators (Yates and Grundy, 1953) are inadequate and can overestimate the variance significantly (Särndal et al., 1992). We propose a novel variance estimator that is less biased and can be implemented with any given population order. We justify this estimator theoretically and with a Monte Carlo simulation study.
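The authors' estimator is not given in the abstract; the sketch below instead shows the setting with a standard successive-difference variance estimator, one well-known way to be less biased than the naive SRS formula when the population is ordered:

```python
import numpy as np

def sys_sample(population, n, rng):
    """Draw a 1-in-k systematic sample with a random start."""
    k = len(population) // n                     # sampling interval
    start = rng.integers(k)
    return population[start::k][:n]

def mean_and_sd_variance(sample, N):
    """Sample mean plus the successive-difference variance estimator."""
    n = len(sample)
    f = n / N                                    # sampling fraction
    sd2 = (np.diff(sample) ** 2).sum() / (2 * (n - 1))
    return sample.mean(), (1 - f) * sd2 / n

rng = np.random.default_rng(0)
pop = np.sort(rng.normal(50.0, 10.0, size=1000))  # population ordered by size:
s = sys_sample(pop, n=50, rng=rng)                # here SYS is very efficient,
print(mean_and_sd_variance(s, N=pop.size))        # and the SRS formula would
                                                  # grossly overestimate variance
```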
Abstract:
Graphical tracking is a technique for crop scheduling where the actual plant state is plotted against an ideal target curve which encapsulates all crop and environmental characteristics. Management decisions are made on the basis of the position of the actual crop against the ideal position. Due to the simplicity of the approach, it is possible for graphical tracks to be developed on site without the requirement for controlled experimentation. Growth models and graphical tracks are discussed, and an implementation of the Richards curve for graphical tracking is described. In many cases, the more intuitively desirable growth models perform sub-optimally due to problems with the specification of starting conditions, environmental factors outside the scope of the original model, and the introduction of new cultivars. Accurate specification of a biological model requires detailed and usually costly study, and as such is not adaptable to a changing cultivar range and changing cultivation techniques. Fitting of a new graphical track for a new cultivar can be conducted on site and improved over subsequent seasons. Graphical tracking emphasises the current position relative to the objective and, as such, does not require the time-consuming or system-specific input of an environmental history, although it does require detailed crop measurement. The approach is flexible and could be applied to a variety of specification metrics, with digital imaging providing a route for added value. For decision making regarding crop manipulation from the observed current state, there is a role for simple predictive modelling to indicate the short-term consequences of crop manipulation.
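A minimal sketch of graphical tracking against a Richards-curve target, with all parameter values illustrative: measure the crop, evaluate the target curve for the same day, and act on the deviation.

```python
import numpy as np

def richards(t, upper, growth_rate, t_infl, shape):
    """Richards growth curve: a generalized logistic in its common form."""
    return upper / (1.0 + shape * np.exp(-growth_rate * (t - t_infl))) ** (1.0 / shape)

def target(day):
    # Illustrative target track, e.g. plant height in cm against days.
    return richards(day, upper=30.0, growth_rate=0.12, t_infl=40.0, shape=0.8)

day, measured_height = 35, 11.2                         # today's measurement
deviation = measured_height - target(day)
print(f"day {day}: {deviation:+.1f} cm versus target")  # behind (-) / ahead (+)
```

Refitting the four curve parameters to each season's measurements is what lets the track improve on site without controlled experimentation.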
Abstract:
MS is an important analytical tool in clinical proteomics, primarily in the disease-specific discovery, identification and characterisation of proteomic biomarkers and patterns. MS-based proteomics is increasingly used in clinical validation and diagnostic method development. The latter departs from the typical application of MS-based proteomics by exchanging some of the analytical performance for the throughput, robustness and simplicity required for clinical diagnostics. Although conventional MS-based proteomics has become an important field in clinical applications, some of the most recent MS technologies have not yet been extensively applied in clinical proteomics. In this review, we describe the current state of MS in clinical proteomics and look to the future of this field.
Abstract:
This paper addresses the impact of imperfect synchronisation on D-STBC when combined with incremental relaying. To suppress such an impact, a novel detection scheme is proposed which retains the two key features of the STBC principle: simplicity (i.e. linear computational complexity) and optimality (i.e. maximum-likelihood detection). These two features make the new detector well suited to low-power wireless networks (e.g. sensor networks).
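For context, classical Alamouti STBC under perfect synchronisation already admits maximum-likelihood detection with linear complexity, via simple combining followed by a per-symbol nearest-neighbour search. The sketch below shows that baseline only; the paper's detector for imperfect relay timing is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
qpsk = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)

s1, s2 = rng.choice(qpsk, 2)                     # transmitted symbol pair
h1, h2 = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) / np.sqrt(2)
n = (rng.standard_normal(2) + 1j * rng.standard_normal(2)) * 0.05

# Two received slots: [s1, s2] sent first, then [-s2*, s1*] (Alamouti code).
r1 = h1 * s1 + h2 * s2 + n[0]
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1) + n[1]

# Linear combining decouples the two symbols; per-symbol ML detection is
# then a nearest-neighbour search over the constellation.
y1 = np.conj(h1) * r1 + h2 * np.conj(r2)
y2 = np.conj(h2) * r1 - h1 * np.conj(r2)
gain = abs(h1) ** 2 + abs(h2) ** 2
est1 = qpsk[np.argmin(abs(y1 / gain - qpsk))]
est2 = qpsk[np.argmin(abs(y2 / gain - qpsk))]
print(est1 == s1, est2 == s2)
```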
Abstract:
Objectives. Theoretical modeling and experimental studies suggest that functional electrical stimulation (FES) can improve trunk balance in spinal cord injured subjects. This can have a positive impact on daily life, increasing the volume of bimanual workspace, and improving sitting posture and wheelchair propulsion. A closed-loop controller for the stimulation is desirable, as it can potentially decrease muscle fatigue and offer better rejection of disturbances. This paper proposes a biomechanical model of the human trunk, and a procedure for its identification, to be used for the future development of FES controllers. The advantage over previous models resides in the simplicity of the proposed solution, which makes it possible to identify the model just before a stimulation session (taking into account the variability of the muscle response to FES). Materials and Methods. The structure of the model is based on previous research on FES and muscle physiology. Some details could not be inferred from previous studies and were determined from experimental data. Experiments with a paraplegic volunteer were conducted in order to measure the moments exerted by the trunk's passive tissues and artificially stimulated muscles. Data for model identification and validation were also collected. Results. Using the proposed structure and identification procedure, the model could adequately reproduce the moments exerted during the experiments. The study reveals that the stimulated trunk extensors can exert maximal moment when the trunk is in the upright position. In contrast, previous studies show that able-bodied subjects can exert maximal trunk extension when flexed forward. Conclusions. The proposed model and identification procedure are a successful first step toward the development of a model-based controller for trunk FES. The model also gives information on the trunk in unique conditions, normally not observable in able-bodied subjects (i.e., subject only to extensor muscle contraction).
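As a loose illustration of the kind of identification step described (the paper's actual model structure is richer; the stiffness-plus-damping form and all values below are assumptions), the passive trunk moment can be regressed on angle and angular velocity by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 0.01)                 # 10 s recording at 100 Hz
theta = 0.3 * np.sin(2 * np.pi * 0.2 * t)      # slow flexion-extension cycle (rad)
theta_dot = np.gradient(theta, t)              # angular velocity (rad/s)

# Synthetic 'measured' passive moment: stiffness 40, damping 2, plus noise.
measured = 40 * theta + 2 * theta_dot + rng.normal(0, 0.5, t.size)

# Identify M = k*theta + b*theta_dot by linear least squares.
A = np.column_stack([theta, theta_dot])
(k, b), *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"stiffness k ~ {k:.1f} N.m/rad, damping b ~ {b:.1f} N.m.s/rad")
```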