843 results for Conceptual graphs
Abstract:
The open provenance architecture (OPA) approach to the challenge was distinct in several regards. In particular, it is based on an open, well-defined data model and architecture, allowing different components of the challenge workflow to record documentation independently, and allowing the workflow to be executed in any environment. Another notable feature is that we distinguish between the data recorded about what has occurred, the process documentation, and the provenance of a data item, which is everything that caused the data item to be as it is and is obtained as the result of a query over the process documentation. This distinction allows us to tailor the system to separately best address the requirements of recording and of querying documentation. Other notable features include the explicit recording of causal relationships between both events and data items, an interaction-based world model, the intensional definition of data items in queries rather than reliance on explicit naming mechanisms, and the styling of documentation to support non-functional application requirements such as reducing storage costs or ensuring privacy of data. In this paper we describe how each of these features aids us in answering the challenge provenance queries.
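The idea of provenance as the result of a query over process documentation can be illustrated with a toy sketch. The graph below is an invented example, not the OPA data model: causal edges link each item to what caused it, and a transitive query recovers everything that caused a data item to be as it is.

```python
# Illustrative sketch (not the OPA data model): process documentation
# as causal edges "item -> its causes"; the provenance of an item is
# obtained by querying everything that transitively caused it.
causes = {
    "report": ["analysis", "template"],
    "analysis": ["raw_data"],
}

def provenance(item):
    # Depth-first traversal over the recorded causal relationships.
    seen = set()
    stack = [item]
    while stack:
        node = stack.pop()
        for parent in causes.get(node, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen
```

Here `provenance("report")` returns every recorded cause of the report, direct or indirect.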
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open space is limited. Another solution is the more efficient use of existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested on the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by adjustable gated weirs driven by fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin, and the GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation in terms of flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
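The GA step described above can be sketched roughly as follows. This is a minimal illustration under invented assumptions, not the authors' implementation: `predict_levels` stands in for the conceptual river model used by the MPC prediction, and the flood threshold, gains, and GA settings are all hypothetical.

```python
import random

N_WEIRS = 3         # hypothetical number of adjustable gated weirs
FLOOD_LEVEL = 2.0   # assumed flood threshold [m]

def predict_levels(gate_positions):
    # Stand-in for the conceptual river model: more open gates
    # (closer to 1.0) lower the predicted water level.
    return [FLOOD_LEVEL + 0.5 - 0.6 * g for g in gate_positions]

def cost(gate_positions):
    # Water-level cost function: penalize levels above the flood
    # threshold, as a proxy for flood damage minimization.
    return sum(max(0.0, h - FLOOD_LEVEL) for h in predict_levels(gate_positions))

def evolve(pop_size=20, generations=30, seed=0):
    # Semi-random generation of gate positions, scored per generation.
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(N_WEIRS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]          # keep the best half
        children = [
            [min(1.0, max(0.0, g + rng.gauss(0, 0.1)))  # mutate a parent
             for g in rng.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
```

The real system would replace `predict_levels` with the combined hydrological-hydraulic prediction over the forecast horizon.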
Abstract:
A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model using grey numbers is presented. With the grey numbers technique, uncertainty is characterized by an interval: once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions yield simulated discharges in the form of grey numbers, whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated in such a way that the band obtained from the envelope of the simulated grey discharges includes an assigned percentage of the observed discharge values while being as narrow as possible. The approach is applied to a real case study, which highlights that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. These times can, however, be significantly reduced by a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual grey rainfall-runoff model is calibrated, and the uncertainty bands obtained from both the calibration process and the validation process are compared with those obtained by a well-established approach for characterizing uncertainty, namely the GLUE approach. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
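The core of the grey-number idea can be sketched with interval arithmetic. The class and the linear-reservoir relation Q = k·S below are an illustrative toy, with invented parameter intervals, not the authors' model or calibration procedure.

```python
# Toy illustration of grey (interval) numbers: parameters are intervals,
# and interval arithmetic propagates them to a discharge band whose
# envelope represents the uncertainty of the simulated variable.

class Grey:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Interval addition: endpoints add.
        return Grey(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Interval multiplication: envelope of all endpoint products.
        prods = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
        return Grey(min(prods), max(prods))

    def width(self):
        # Band width: the quantity calibration tries to keep small.
        return self.hi - self.lo

# Assumed grey recession parameter k [1/h] and storage S [mm]:
k = Grey(0.08, 0.12)
S = Grey(90.0, 110.0)
Q = k * S            # grey discharge: envelope of possible values
band = (Q.lo, Q.hi)
```

Calibration, in this picture, widens or narrows the parameter intervals until the discharge band covers the assigned percentage of observations while `Q.width()` stays as small as possible.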
Abstract:
This paper addresses topics related to measuring the trade and poverty nexus that are relevant, confusing, or in need of more attention. It sheds a critical light on the existing material and suggests needed lines of research. It starts with questions specific to Latin American and Caribbean (LAC) realities; then, keeping this view, general methodological issues are also examined. In a broader perspective, further ideas for the research agenda are formulated. The main conclusion is that relevant findings still demand considerable effort. Moreover, the information-measurement-model-evaluation paradigm is not enough, as policy guidelines are usually too general. In LAC it must be extended and deepened, accounting more for the heterogeneity of cases, including physical constraints whenever possible, and incorporating new ways of integrating the local and global perspectives. Other aspects, such as the role of specific juridical measures, should also play a part. How all this can be combined into more encompassing evaluations remains open.
Abstract:
In the current climate of crisis, a more productive paradigm is needed, one that analyzes phenomena in their entirety while maintaining the coherence of the whole. Using sport, and more specifically capoeira, the aim is to promote the desired transformation of Man intentionally and efficiently, by exploiting the modality in both its immediate and its longer-term objectives. Most knowledge about capoeira is empirical in both its basis and its transmission, which is reflected in how the modality is operationalized. We therefore seek to begin a break with empiricism by applying already tested tools, namely the Combat Sports Model of Almada's (1994) taxonomy, which makes it possible to unfold the basic movement variables and identify those that can be treated productively. Using a technical element of capoeira, the meia-lua de compasso, we sought to verify whether the analysis performed with the model could be useful. Two experimental situations based on video analysis were carried out to determine the temporal limits within which the action can take place, using for situation 1 a sample of n=13 capoeiristas, and for situation 2 two samples: n1=12 capoeiristas of beginner-to-intermediate level and n2=8 capoeiristas of advanced level. It was thus verified that, through the use of this model, it is possible to identify variables fundamental to understanding the phenomenon and limits to be respected.
Abstract:
Hebb proposed that synapses between neurons that fire synchronously are strengthened, forming cell assemblies and phase sequences. The former, on a shorter scale, are ensembles of synchronized cells that function transiently as a closed processing system; the latter, on a larger scale, correspond to the sequential activation of cell assemblies able to represent percepts and behaviors. Nowadays, the recording of large neuronal populations allows for the detection of multiple cell assemblies. Within Hebb's theory, the next logical step is the analysis of phase sequences. Here we detected phase sequences as consecutive assembly activation patterns, and then analyzed their graph attributes in relation to behavior. We investigated action potentials recorded from the adult rat hippocampus and neocortex before, during and after novel object exploration (experimental periods). Within assembly graphs, each assembly corresponded to a node, and each edge corresponded to the temporal sequence of consecutive node activations. The sum of all assembly activations was proportional to firing rates, but the activity of individual assemblies was not. Assembly repertoire was stable across experimental periods, suggesting that novel experience does not create new assemblies in the adult rat. Assembly graph attributes, on the other hand, varied significantly across behavioral states and experimental periods, and were separable enough to correctly classify experimental periods (Naïve Bayes classifier; maximum AUROCs ranging from 0.55 to 0.99) and behavioral states (waking, slow wave sleep, and rapid eye movement sleep; maximum AUROCs ranging from 0.64 to 0.98). Our findings agree with Hebb's view that assemblies correspond to primitive building blocks of representation, nearly unchanged in the adult, while phase sequences are labile across behavioral states and change after novel experience. The results are compatible with a role for phase sequences in behavior and cognition.
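The graph construction described here, assemblies as nodes and consecutive activations as directed edges, can be sketched as follows. The activation sequence and attribute choices are invented for illustration and are not the study's actual pipeline.

```python
from collections import defaultdict

# Sketch of an assembly graph: each detected assembly is a node, and a
# directed edge links consecutively activated assemblies, weighted by
# how often that transition occurred.
def assembly_graph(activations):
    edges = defaultdict(int)
    for a, b in zip(activations, activations[1:]):
        if a != b:
            edges[(a, b)] += 1  # a activated immediately before b
    return edges

# Invented activation sequence (a phase sequence over three assemblies):
seq = ["A1", "A2", "A1", "A3", "A2", "A1"]
g = assembly_graph(seq)

# Simple graph attributes of the kind compared across behavioral states:
n_edges = len(g)
out_degree = defaultdict(int)
for (a, _b), _w in g.items():
    out_degree[a] += 1
```

Attributes like these (edge counts, degrees, and richer graph measures) are what vary across states while the assembly repertoire itself stays stable.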
Abstract:
This study aimed to apprehend teachers' understanding of the concept of territory and to intervene with situations of critical reflection in order to follow and analyze the process of conceptual elaboration. It reflects on the re-signification of knowledge and the re-elaboration of the concept under study within a pedagogical intervention. The empirical field of the research comprised the Municipal School Dr. Julio Senna (Ceará-Mirim/RN) and six teacher-collaborators who taught in the 3rd and 4th grades of elementary school. Its theoretical-methodological contributions are grounded in the studies of Vygotsky (2000a, 2000b and 2001) on the formation and development of concepts; in collaborative methodology (Ibiapina (2004), Bartomé (1986), Kemmis and McTaggart (1988), Arnal, Del Ricon and Latorre (1992), Pimenta and Ghedin (2002), among others); and in the critical-reflexive conception of Geography (Soares Júnior (2000 and 1994), Silva (1998), Raffestin (1993), Santos (1994), Felipe (1998), among others). The work started from reflections on the following questions: what is the teachers' understanding, within the school space, of the concept of territory? How does the conceptual construction of territory occur for these teachers? The analysis of the teachers' prior knowledge of the concept showed that their apprehension of its attributes remained at the perceptible dimension of the concrete relationships of reality, at the level of spontaneous concepts, and followed the ideas of the traditional, humanistic and cultural geographical conceptions (positivism and phenomenology), restricting the meaning of territory to the notion of Nation-State and to the place where people dwell.
In the intervention process, the real possibility of acquiring the scientific concepts indispensable to the re-signification of geographical knowledge through the continuous practice of teacher education was verified: the teacher-collaborators attained high degrees of attribution of significance to the concept of territory as they elaborated generalizations through analyses and syntheses of its essential and multiple attributes.
Abstract:
Block diagrams and signal-flow graphs are used to represent interconnected systems and to obtain their transfer functions. For systems with complex interrelationships, the reduction of signal-flow graphs is considered simpler than the reduction of block diagrams. Signal-flow graph reduction can be performed without graphic manipulation of diagrams, which makes it attractive for computational implementation. In this paper the authors propose a computational method for the direct reduction of signal-flow graphs. The method uses results, presented in this paper, on the calculation of literal determinants without symbolic mathematics tools. Cramer's rule is applied for the solution of a set of linear equations. A program in the MATLAB language for the reduction of signal-flow graphs with the proposed method is presented.
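The Cramer's-rule step can be illustrated numerically on a tiny invented example (this sketch is not the paper's MATLAB program): the node equations of a signal-flow graph, x = Mx + bu, are rearranged to (I − M)x = bu, and each node signal is a ratio of determinants.

```python
# Two-node signal-flow graph: x1 = g1*u + h*x2, x2 = g2*x1.
# Rearranged as (I - M) x = b*u, then solved by Cramer's rule.

def det2(m):
    # Determinant of a 2x2 matrix.
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def cramer2(A, b):
    # Cramer's rule: each unknown is det(A with one column
    # replaced by b) divided by det(A).
    d = det2(A)
    x0 = det2([[b[0], A[0][1]], [b[1], A[1][1]]]) / d
    x1 = det2([[A[0][0], b[0]], [A[1][0], b[1]]]) / d
    return x0, x1

g1, g2, h = 2.0, 3.0, 0.1   # invented branch gains
A = [[1.0, -h],
     [-g2, 1.0]]             # I - M for this graph
b = [g1 * 1.0, 0.0]          # unit input u = 1
x1, x2 = cramer2(A, b)
# x2 equals the loop transfer function g1*g2 / (1 - g2*h).
```

For larger graphs the same pattern applies with n×n determinants, which is where the paper's literal-determinant results come in.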
Abstract:
Information has increasingly become a crucial resource for organizations that want to remain competitive in the market. For this reason, the analysis and correct understanding of the types of information present in these environments is relevant to achieving the highest levels of performance. The aim of this paper is to review the literature on the concepts of organic and archival information within the organizational/business context. This is still an emerging theoretical field and is therefore conducive to intense discussion. We point out elements that help to characterize and distinguish these two types of information.
Abstract:
In this paper, we extend the use of the variance dispersion graph (VDG) to experiments in which the response surface (RS) design must be blocked. Through several examples, we compare the prediction performance of RS designs in non-orthogonal block designs with that of the equivalent unblocked and orthogonally blocked designs. These examples illustrate that good prediction performance of designs in small blocks can be expected in practice. Most importantly, we show that the allocation of the treatment set to blocks can seriously affect the prediction properties of a design; much care is therefore needed in performing this allocation.
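What a VDG plots can be sketched on a tiny invented example: the scaled prediction variance N·x0ᵀ(XᵀX)⁻¹x0 of a design, traced as its minimum and maximum over spheres of radius r. The 2² factorial with a first-order model below is an assumed illustration, not one of the paper's blocked designs; for it the scaled prediction variance is exactly 1 + r².

```python
import math

# 2^2 factorial in coded units; model terms: intercept, x1, x2.
design = [(-1, -1), (1, -1), (-1, 1), (1, 1)]

def spv(x1, x2):
    # For this design X'X = 4*I, so x0'(X'X)^{-1}x0 reduces to
    # (1 + x1^2 + x2^2)/4; scaling by N = 4 gives 1 + r^2.
    N = len(design)
    return N * (1 + x1 ** 2 + x2 ** 2) / 4.0

def vdg_slice(r, n_angles=72):
    # One VDG slice: min and max scaled prediction variance over
    # points at distance r from the design center.
    vals = [spv(r * math.cos(t), r * math.sin(t))
            for t in (2 * math.pi * k / n_angles for k in range(n_angles))]
    return min(vals), max(vals)

lo, hi = vdg_slice(1.0)  # min and max coincide: the design is rotatable
```

A VDG is simply these min/max (and average) curves plotted against r; for blocked designs the (XᵀX)⁻¹ term changes with the block structure, which is what the paper's comparisons examine.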
Abstract:
Variance dispersion graphs have become a popular tool in aiding the choice of a response surface design. Often differences in response from some particular point, such as the expected position of the optimum or standard operating conditions, are more important than the response itself. We describe two examples from food technology. In the first, an experiment was conducted to find the levels of three factors which optimized the yield of valuable products enzymatically synthesized from sugars and to discover how the yield changed as the levels of the factors were changed from the optimum. In the second example, an experiment was conducted on a mixing process for pastry dough to discover how three factors affected a number of properties of the pastry, with a view to using these factors to control the process. We introduce the difference variance dispersion graph (DVDG) to help in the choice of a design in these circumstances. The DVDG for blocked designs is developed and the examples are used to show how the DVDG can be used in practice. In both examples a design was chosen by using the DVDG, as well as other properties, and the experiments were conducted and produced results that were useful to the experimenters. In both cases the conclusions were drawn partly by comparing responses at different points on the response surface.
Abstract:
This paper explains the conceptual design of instrumentation that measures the electric quantities defined in Std. 1459-2000. It is shown how the instantaneous-space-phasor (ISP) approach, based on α, β, 0 components, can be used to monitor electric energy flow, evaluate the utilization of transmission lines, and quantify the level of harmonic pollution injected by nonlinear loads.
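The α, β, 0 decomposition underlying the ISP approach is the Clarke transformation of the three phase quantities. The sketch below uses the power-invariant scaling, which is one common convention and not necessarily the exact formulation in the standard or the paper's instrumentation.

```python
import math

# Power-invariant Clarke transformation of phase quantities (a, b, c)
# into alpha, beta, 0 components.
def clarke(a, b, c):
    k = math.sqrt(2.0 / 3.0)
    alpha = k * (a - 0.5 * b - 0.5 * c)
    beta = k * (math.sqrt(3.0) / 2.0) * (b - c)
    zero = k * (a + b + c) / math.sqrt(2.0)
    return alpha, beta, zero

# Balanced three-phase snapshot: the zero-sequence component vanishes,
# and the (alpha, beta) pair carries the space phasor.
a, b, c = 1.0, -0.5, -0.5
alpha, beta, zero = clarke(a, b, c)
```

Applied to sampled voltages and currents, the resulting (α, β) phasor trajectory is what the instrumentation monitors for energy flow, line utilization, and harmonic pollution.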