918 results for "automated model-based feedback"
Abstract:
The MATLAB model is contained within the compressed folders (available as .zip and .tgz). The model uses MERRA reanalysis data (more than 34 years available) to estimate the hourly aggregated wind power generation for a predefined (fixed) distribution of wind farms. A ready-made example is included for the wind farm distribution of Great Britain as of April 2014 ("CF.dat"), consisting of an hourly time series of GB-total capacity factor spanning the period 1980-2013 inclusive. Given the global nature of reanalysis data, the model can be applied to any specified distribution of wind farms in any region of the world. Users are, however, strongly advised to bear in mind the limitations of reanalysis data when using this model/data; these are discussed in our paper: Cannon, Brayshaw, Methven, Coker, Lenaghan, "Using reanalysis data to quantify extreme wind power generation statistics: a 33 year case study in Great Britain", submitted to Renewable Energy in March 2014. Additional information about the model is contained in the model code itself, in the accompanying ReadMe file, and on our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/
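Since the key deliverable is the "CF.dat" time series itself, a few lines of code suffice to start working with it. Below is a minimal sketch in Python (the model itself is MATLAB) that assumes CF.dat is a plain-text file with one hourly capacity-factor value per line; check the ReadMe for the actual file layout.

```python
import numpy as np

# Load the example GB capacity-factor series (assumed layout: one
# hourly value per line, spanning 1980-2013 inclusive).
cf = np.loadtxt("CF.dat")

print(f"{cf.size} hourly values, mean CF = {cf.mean():.3f}")

# Example diagnostic in the spirit of the accompanying paper:
# how often does GB-aggregate generation fall below 5% of capacity?
print(f"Fraction of hours below 5% CF: {(cf < 0.05).mean():.4f}")
```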
Abstract:
In many lower-income countries, the establishment of marine protected areas (MPAs) involves significant opportunity costs for artisanal fishers, reflected in changes in how they allocate their labor in response to the MPA. The resource economics literature rarely addresses such labor allocation decisions of artisanal fishers and how, in turn, these contribute to the impact of MPAs on fish stocks, yield, and income. This paper develops a spatial bio-economic model of a fishery adjacent to a village of people who allocate their labor between fishing and on-shore wage opportunities, establishing a spatial Nash equilibrium at a steady-state fish stock in response to various locations for no-take-zone MPAs and managed-access MPAs. Villagers' fishing location decisions are based on distance costs, fishing returns, and wages. The MPA's location thus determines its impact on fish stocks, fish yield, and villager income through distance costs, congestion, and fish dispersal. Incorporating wage labor opportunities into the framework allows examination of the MPA's impact on rural incomes; the results show that win-wins between yield and stocks occur in very different MPA locations than do win-wins between income and stocks. In addition, villagers in a high-wage setting bear a lower burden from MPAs than do those in low-wage settings. Motivated by issues of central importance in Tanzania and Costa Rica, we impose various policies on this fishery – location-specific no-take zones, increasing on-shore wages, and restricting MPA access to a subset of villagers – to analyze the impact of an MPA on fish stocks and rural incomes in such settings.
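To make the equilibrium logic concrete, here is a toy open-access sketch in Python under illustrative assumptions (the parameter values, patch structure, and the omission of fish dispersal are ours, not the paper's): entry and exit equalize the net return per unit of fishing effort with the on-shore wage in every fished patch.

```python
import numpy as np

# Toy spatial open-access equilibrium (illustrative parameters).
# In each fished patch i, entry/exit drives the net return per unit
# effort to the wage:  q*s_i - c_i = w  =>  s_i* = (w + c_i)/q.
# The steady-state stock condition r*s*(1 - s/K) = q*s*E then gives
# the equilibrium effort E_i = (r/q)*(1 - s_i*/K).
r, K, q, w = 0.4, 1.0, 0.5, 0.10      # growth, capacity, catchability, wage
c = 0.05 * np.arange(5)               # distance cost rises with patch index

s_star = (w + c) / q                  # equilibrium stock where fished
fished = s_star < K                   # patches worth fishing at all
s = np.where(fished, s_star, K)       # unfished patches sit at capacity
E = np.where(fished, (r / q) * (1 - s / K), 0.0)

# A no-take MPA in patch i simply forces E[i] = 0 (here without the
# dispersal and congestion channels the paper's model includes).
for i in range(5):
    print(f"patch {i}: stock = {s[i]:.2f}, effort = {E[i]:.2f}")
```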
Abstract:
We study the exact solution of an N-state vertex model based on the representation of the U_q[SU(2)] algebra at roots of unity with diagonal open boundaries. We find that the corresponding reflection equation provides one general class of diagonal K-matrices with one free parameter. We determine the eigenvalues of the double-row transfer matrix and the corresponding Bethe ansatz equation within the algebraic Bethe ansatz framework. The structure of the Bethe ansatz equation combines a pseudomomentum function depending on a free parameter with scattering phase shifts that are fixed by the roots of unity and the boundary variables.
Abstract:
This paper describes the development of a new approach to the use of ICT for teaching courses in the interpretation and evaluation of evidence. It is based on ideas developed for teaching science to school children, in particular the importance of models and qualitative reasoning skills. In the first part, we analyze the basis of current research into “evidence scholarship” and the demands such a system would have to meet. In the second part, we present the details of a system that we initially developed to assist police in the interpretation of evidence.
Abstract:
Quark-model descriptions of the nucleon-nucleon interaction contain two main ingredients: a quark-exchange mechanism for the short-range repulsion and meson exchanges for the medium- and long-range parts of the interaction. We point out the special role played by higher partial waves, and in particular the ¹F₃, as a very sensitive probe of the meson-exchange part employed in these interaction models. In particular, we show that the presently available models fail to provide a reasonable description of the higher partial waves, and we indicate the reasons for this shortcoming.
Fluorescent lamp model based on equivalent resistances, considering the effects of dimming operation
Abstract:
This paper presents a new methodology for determining fluorescent lamp models based on equivalent resistances. One important feature of the proposed methodology is the inclusion of the filaments in the model, taking into account the effects of dimming operation on the equivalent resistances. The classical series-resonant parallel-loaded half-bridge inverter is used as the power stage of the ballast, and variation of the inverter's switching frequency is the dimming technique assumed in the analyses. Results obtained with a F32T8 lamp indicate that the model is highly accurate. The lamp models obtained with the proposed methodology therefore have the potential to serve as an important tool for ballast designers, given the need to evaluate lamp/ballast compatibility with respect to the operating conditions of the electrode filaments.
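As a rough illustration of the modeling idea (not the paper's fitted model), each element can be represented by an equivalent resistance estimated from RMS measurements at a given dimming level; the operating points below are hypothetical, not measured data.

```python
# Hedged sketch: equivalent resistance of a lamp element from RMS
# voltage and power, R_eq = V_rms^2 / P. The (V, P) pairs below are
# hypothetical F32T8 operating points: lamp voltage tends to rise as
# the arc power is dimmed, so the equivalent resistance grows sharply.
def equivalent_resistance(v_rms: float, p: float) -> float:
    return v_rms ** 2 / p

for v, p in [(137.0, 32.0), (190.0, 10.0)]:   # near full power vs. dimmed
    print(f"V = {v:5.1f} V, P = {p:4.1f} W -> "
          f"R_eq = {equivalent_resistance(v, p):6.0f} ohm")
```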
Abstract:
We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate the abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method for making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to estimate detection and intensity parameters simultaneously by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework for assessing the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set. In addition, a simple ad hoc method for handling overdispersion is proposed. The simulation study showed that the model-based approach compared favorably with conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike's information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
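The thinning construction is straightforward to simulate. The sketch below (Python, with assumed parameter values) draws animals from a homogeneous Poisson process and thins them with a half-normal detection function, which is the generative model a thinned-point-process likelihood would be fitted against.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed parameters: intensity (animals per unit area) and the
# half-normal detection scale sigma for a transect along x = 0.
intensity, sigma = 50.0, 0.1
width, height = 1.0, 1.0              # unit-square study region

n = rng.poisson(intensity * width * height)
x = rng.uniform(0, width, n)          # perpendicular distance to transect
y = rng.uniform(0, height, n)

g = np.exp(-x**2 / (2 * sigma**2))    # detection probability g(x)
detected = rng.uniform(size=n) < g    # Bernoulli thinning

print(f"{n} animals present, {detected.sum()} detected")
# Model-based inference maximizes the thinned-process likelihood
# jointly over (intensity, sigma), with covariates on the intensity.
```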
Abstract:
In this paper, a modeling technique for small-signal stability assessment of unbalanced power systems is presented. Since power distribution systems are inherently unbalanced, owing to the characteristics of their lines and loads, and the penetration of distributed generation into these systems is steadily increasing, such a tool is needed to ensure their secure and reliable operation. The main contribution of this paper is the development of a phasor-based model for the study of dynamic phenomena in unbalanced power systems. Using an assumption on the net torque of the generator, it is possible to precisely define an equilibrium point for the phasor model of the system, enabling its linearization around this point and, consequently, eigenvalue/eigenvector analysis for small-signal stability assessment. The proposed technique was compared with the dynamic behavior observed in ATP simulations; the results show that, for the generator and controller models used, the proposed modeling approach is adequate and yields reliable and precise results.
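The core recipe (find an equilibrium, linearize, inspect eigenvalues) can be sketched on a toy system; the dynamics below are a generic swing-equation-like example of ours, not the paper's unbalanced phasor model.

```python
import numpy as np

# Toy small-signal workflow: linearize dx/dt = f(x) around an
# equilibrium x0 via central finite differences, then check the
# eigenvalues of the Jacobian.
def f(x):
    delta, omega = x                  # rotor angle, speed deviation
    return np.array([omega,
                     (0.8 - 1.0 * np.sin(delta) - 0.1 * omega) / 0.2])

x0 = np.array([np.arcsin(0.8), 0.0])  # equilibrium: f(x0) = 0

eps = 1e-6
J = np.column_stack([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
                     for e in np.eye(2)])

eigvals = np.linalg.eigvals(J)
print("eigenvalues:", np.round(eigvals, 3))
print("small-signal stable:", bool(np.all(eigvals.real < 0)))
```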
Abstract:
A neural network model to predict ozone concentration in the Sao Paulo Metropolitan Area was developed, based on average values of meteorological variables in the morning (8:00-12:00 hr) and afternoon (13:00-17:00 hr) periods. The outputs are the maximum and average ozone concentrations in the afternoon (12:00-17:00 hr). The correlation coefficients between computed and measured values were 0.82 and 0.88 for the maximum and average ozone concentrations, respectively. The model performed well as a prediction tool for the maximum ozone concentration: for prediction horizons of 1 to 5 days, failure rates of 0 to 23% (at 95% confidence) were obtained.
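A minimal sketch of this kind of model, on synthetic data with an assumed feature set (the paper's actual inputs are the meteorological averages it describes):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for morning/afternoon meteorological averages
# (e.g., temperature, humidity, wind speed, radiation per period).
X = rng.normal(size=(500, 8))
# Synthetic "peak ozone" target with noise; purely illustrative.
y = 40 + 15 * X[:, 0] - 8 * X[:, 2] + rng.normal(scale=5, size=500)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])

r = np.corrcoef(model.predict(X[400:]), y[400:])[0, 1]
print(f"correlation on held-out data: {r:.2f}")
```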
Abstract:
Background: In recent years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks have been designed around the abstractions provided by this paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. Most of the proposed CFs, however, employ white-box strategies in their reuse process, requiring two main technical skills: (i) knowing the syntax details of the programming language used to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. A further problem is that the reuse process can only begin once development reaches the implementation phase, preventing it from starting earlier.
Method: To address these problems, we present a model-based approach for reusing CFs that shields application engineers from technical details, letting them concentrate on what the framework actually needs from the application under development. To support the approach, two models are proposed: the Reuse Requirements Model (RRM), which describes the framework structure, and the Reuse Model (RM), which supports the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically.
Results: We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and a new, model-based one. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with each CF version. The results show a 97% improvement in productivity; however, little difference was observed in the effort required to maintain the applications.
Conclusion: Using the approach presented here, we conclude that (i) the instantiation of CFs can be automated, and (ii) developer productivity improves when a model-based instantiation approach is used.
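The instantiation step can be pictured as template-based code generation driven by the filled-in model; the sketch below uses hypothetical names (a dict standing in for the RM, AspectJ-like output), not the authors' actual metamodels.

```python
from string import Template

# Hypothetical filled-in reuse model: what the application needs from
# a Persistence CF, expressed without the framework's internal names.
reuse_model = {
    "concern": "Persistence",
    "target_class": "Customer",
    "join_point": "save",
}

# Glue-code template; the engineer never edits the generated output.
template = Template(
    "public aspect ${concern}${target_class}Binding {\n"
    "    // Generated reuse code binding the ${concern} CF to the app.\n"
    "    pointcut reuse(): execution(* ${target_class}.${join_point}(..));\n"
    "}\n"
)

print(template.substitute(reuse_model))
```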
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one invoked service can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended with facilities to handle not only service behavior under regular circumstances (i.e., where cooperating services work as expected) but also behavior in undesirable situations (i.e., where cooperating services do not work as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools; in particular, the use of an enterprise service bus for test execution is noteworthy and distinguishes this work from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions.
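To illustrate the idea (with a toy graph and made-up event names, not the paper's case study), an event sequence graph can be held as an adjacency map, with an extra edge modeling an undesirable situation; start-to-end walks then become executable test sequences.

```python
from collections import deque

# Toy acyclic ESG: regular flow plus one undesirable-event branch.
esg = {
    "start": ["login"],
    "login": ["search", "fault:timeout"],   # faulty edge: service misbehaves
    "search": ["book"],
    "book": ["end"],
    "fault:timeout": ["end"],
    "end": [],
}

def event_sequences(graph, start="start", end="end"):
    """Enumerate start-to-end walks; for this acyclic toy graph the
    resulting sequences jointly cover every edge."""
    seqs, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        for nxt in graph[path[-1]]:
            if nxt == end:
                seqs.append(path + [nxt])
            else:
                queue.append(path + [nxt])
    return seqs

for seq in event_sequences(esg):
    print(" -> ".join(seq))
```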