958 results for subset consistency


Abstract:

Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident and their use is increasing. However, consistency is a point of concern for these tools: the classification of the temporal organization of a data set might indicate that one series is relatively less ordered than another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might produce incorrect results because of this lack of consistency. In this study, we present a method that gains consistency by applying ApEn repeatedly over a wide range of combinations of window length and matching error tolerance. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noise). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. To validate the tool we performed shuffled- and surrogate-data analyses. Statistical analysis confirmed the consistency of the method.
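As a rough illustration of the idea, the sketch below computes plain ApEn and then collects it over a grid of window lengths and tolerances, which is the spirit of the volumetric construction. The function names (`apen`, `vapen`), the grid choice, and the scaling of the tolerance by the series' standard deviation are assumptions for the example, not the paper's exact specification.

```python
import numpy as np

def apen(u, m, r):
    """Approximate entropy of series u for window length m and tolerance r."""
    u = np.asarray(u, dtype=float)

    def phi(k):
        n = len(u) - k + 1
        # Embed the series into overlapping windows of length k.
        x = np.array([u[i:i + k] for i in range(n)])
        # Chebyshev distance between every pair of windows (self-matches kept,
        # as in the original ApEn definition).
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        c = np.mean(d <= r, axis=1)          # fraction of windows within tolerance
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def vapen(u, m_values=(1, 2, 3), r_factors=np.linspace(0.1, 0.5, 9)):
    """Collect ApEn over a grid of window lengths and tolerances
    (tolerances expressed as fractions of the series' standard deviation)."""
    sd = np.std(u)
    return np.array([[apen(u, m, f * sd) for f in r_factors] for m in m_values])

# A sine wave should come out as more ordered (lower entropy surface) than noise.
t = np.linspace(0, 8 * np.pi, 500)
rng = np.random.default_rng(0)
print(vapen(np.sin(t)).mean(), vapen(rng.normal(size=500)).mean())
```

Under this reading, two series are compared through the whole grid of ApEn values rather than through a single (m, r) choice.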

Abstract:

The growth of online, hybrid, and distance courses challenges institutions to maintain content consistency across multiple platforms. This report examines the policies, standards, and practices that guide course consistency initiatives.

Abstract:

This paper is about economies with a representative consumer. In general a representative consumer need not exist, although there are several well-known sets of sufficient conditions under which one will. It is common practice, however, to use the representative consumer hypothesis without specifically assuming any of these. We show, first, that it is possible for the utility of the representative consumer to increase when every actual consumer is made worse off. This shows a serious shortcoming of welfare judgements based on the representative consumer. Second, in economies where this does not occur, there exists a social welfare function, which we construct, that is consistent with welfare judgements based on the utility of the representative consumer. Finally, we provide a converse to Samuelson's (1956) representative consumer result, which relates it to Scitovsky's community indifference curves.
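For readers outside the field, the hypothesis in question can be stated formally as follows; this is the standard textbook formulation, with notation chosen for this note rather than taken from the paper.

```latex
% A (positive) representative consumer exists when aggregate demand depends on
% individual incomes only through their sum: for all prices p and incomes w_i,
\sum_{i=1}^{n} x_i(p, w_i) \;=\; x^{R}\!\Bigl(p, \sum_{i=1}^{n} w_i\Bigr),
% where x^R is the demand of a single fictitious agent maximizing a utility U^R.
% Welfare judgements based on U^R are what the paper shows can conflict with
% the welfare of every actual consumer.
```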

Abstract:

This paper investigates which properties money demand functions have to satisfy to be consistent with multidimensional extensions of Lucas's (2000) versions of the Sidrauski (1967) and shopping-time models. We also investigate how such classes of models relate to each other regarding the rationalization of money demands. We conclude that money demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.
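For reference, the two functional forms named as counterexamples are conventionally written as below, with m denoting real balances and i the nominal interest rate; the notation and constants are assumed here, not taken from the paper.

```latex
% Log-log money demand: constant interest-rate elasticity \eta (here \eta \ge 1)
\ln m = \ln A - \eta \ln i, \qquad \text{equivalently } m = A\, i^{-\eta};
% Semi-log money demand: constant interest-rate semi-elasticity \xi
\ln m = \ln B - \xi\, i.
```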

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprehends a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. Such an object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:

- Object-oriented frameworks are extensible by design, so this should also be true of the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of the consistency between semantics and visualization - a particularly important issue in a design environment with multiple views of a single design - is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates it as an event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views (see the sketch after this abstract). The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics - and thus to other possible views - according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracted the network location of such resources, allowing for resource addition and removal at runtime.
- The implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which grants independence between the CAD Framework and the operating system.

All such improvements contributed to a higher level of abstraction in the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experience among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; such extensions - design representation primitives and tool blocks - are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the integration of multimedia metadata into the design data model, a possibility explored in the context of an online educational and training platform.
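The inversion of control and the event-based propagation described above can be pictured with a minimal sketch. This is a generic illustration in Python rather than the Cave2 implementation (which is Java-based); the class and method names (`Event`, `SemanticModel.propose`, the event pool in `View`) are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    """Encapsulates one discrete designer interaction (names are illustrative)."""
    attribute: str
    new_value: object

class SemanticModel:
    """Holds the design semantics and decides whether a state change is allowed."""
    def __init__(self):
        self.state = {}
        self.views: List["View"] = []

    def attach(self, view: "View") -> None:
        self.views.append(view)

    def propose(self, event: Event) -> bool:
        # Inversion of control: the semantic model, not the view, decides
        # whether the proposed change is legal.
        if self._is_valid(event):
            self.state[event.attribute] = event.new_value
            for view in self.views:           # keep every registered view consistent
                view.refresh(event)
            return True
        return False

    def _is_valid(self, event: Event) -> bool:
        return event.new_value is not None    # placeholder consistency rule

class View:
    """A design view: forwards user input as events and redraws when notified."""
    def __init__(self, model: SemanticModel):
        self.model = model
        self.pending: List[Event] = []        # event pool for late synchronization
        model.attach(self)

    def on_user_input(self, attribute: str, value: object, connected: bool = True):
        event = Event(attribute, value)
        if connected:
            self.model.propose(event)
        else:
            self.pending.append(event)        # queue while the connection is down

    def synchronize(self) -> None:
        # Flush the event pool once the network connection is available again.
        while self.pending:
            self.model.propose(self.pending.pop(0))

    def refresh(self, event: Event) -> None:
        print(f"view update: {event.attribute} -> {event.new_value}")

# Usage: two views over the same semantics stay consistent with each other.
model = SemanticModel()
v1, v2 = View(model), View(model)
v1.on_user_input("width", 42)                 # both v1 and v2 are refreshed
```

Replacing the direct call to `propose` with a call through a proxy object is, in spirit, how remote views would participate; the actual framework's API may differ.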

Abstract:

We consider the problem of time consistency of the Ramsey monetary and fiscal policies in an economy without capital. Following Lucas and Stokey (1983), we allow the government at date t to leave its successor at date t + 1 a profile of real and nominal debt of all maturities as a way to influence its decisions. We show that the Ramsey policies are time consistent if and only if the Friedman rule is the optimal Ramsey policy.
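For context, the Friedman rule referred to here is the prescription of a zero nominal interest rate in every period; the notation below is assumed for this note, not taken from the paper.

```latex
% Friedman rule: zero nominal interest rate in every period,
i_t = 0 \quad (\text{equivalently, gross nominal rate } R_t = 1) \qquad \text{for all } t,
% so money and one-period nominal bonds pay the same return and the
% opportunity cost of holding real balances is driven to zero.
```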

Abstract:

This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant studies which conclude that macroeconomic models can explain the short-run exchange rate, as well as studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to this debate, this work presents its own evidence by implementing the model with the best reported predictive performance in Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To this end, we use a sample of 14 currencies against the US dollar, which allows the generation of monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted floating exchange rates and inflation targeting, but we select currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. After finding that we cannot claim that the implemented model yields more accurate forecasts than a random walk, we assess whether the model is at least able to generate "rational", or "consistent", forecasts. To do so, we use the theoretical and empirical framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts produced by the Taylor rule model are "inconsistent". Finally, we run Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
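A bare-bones version of the forecasting exercise, i.e., rolling one-step-ahead regressions on Taylor-rule fundamentals compared against a driftless random walk, might look like the sketch below. The regressors, window length, and function names are illustrative assumptions; the actual Molodtsova-Papell specification (heterogeneous coefficients, interest-rate smoothing) and the robust tests used in the study are not reproduced here.

```python
import numpy as np

def rolling_taylor_rule_forecasts(ds, X, window=120):
    """One-step-ahead rolling-OLS forecasts of log exchange-rate changes.

    ds : realized log exchange-rate changes, shape (T,)
    X  : Taylor-rule fundamentals (e.g. inflation and output-gap differentials),
         shape (T, k), already aligned so row t is known before ds[t] is realized.
    """
    T = len(ds)
    forecasts = np.full(T, np.nan)
    for t in range(window, T):
        # Estimate on the most recent `window` observations only.
        Z = np.column_stack([np.ones(window), X[t - window:t]])
        beta, *_ = np.linalg.lstsq(Z, ds[t - window:t], rcond=None)
        forecasts[t] = np.concatenate(([1.0], X[t])) @ beta
    return forecasts

def mspe_ratio(ds, forecasts):
    """MSPE of the model relative to a driftless random walk (forecast = 0)."""
    mask = ~np.isnan(forecasts)
    model_mspe = np.mean((ds[mask] - forecasts[mask]) ** 2)
    rw_mspe = np.mean(ds[mask] ** 2)
    return model_mspe / rw_mspe   # values below 1 favor the fundamentals model
```

Even when this ratio falls below one, whether the improvement is statistically meaningful depends on the test applied, which is the point the abstract makes via Rogoff and Stavrakeva (2008).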

Abstract:

This paper is a study of the population dynamics of blowflies employing a density-dependent, non-linear mathematical model and a coupled population formalism. In this study, we investigated the coupled population dynamics applying fuzzy subsets to model the population trajectory, analyzing demographic parameters such as fecundity, survival, and migration. The main results suggest different possibilities in terms of dynamic behavior produced by migration in coupled populations between distinct environments and the rescue effect generated by the connection between populations. It was possible to conclude that environmental heterogeneity can play an important role in blowfly metapopulation systems. The implications of these results for the population dynamics of blowflies are discussed.
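The coupled-population part of such a formalism can be sketched with two local populations whose fecundity and survival decay with density and which exchange a fixed fraction of individuals each generation. The exponential density dependence, the parameter values, and the symmetric migration fraction are assumptions for the illustration (the fuzzy-subset modelling of the demographic parameters is not reproduced); giving the two patches different parameter sets would represent the environmental heterogeneity discussed in the abstract.

```python
import numpy as np

def recruit(n, F_max, S_max, f, s):
    """Density-dependent recruitment: fecundity and survival decay with density
    (exponential forms and parameter values are illustrative assumptions)."""
    fecundity = F_max * np.exp(-f * n)
    survival = S_max * np.exp(-s * n)
    return 0.5 * fecundity * survival * n      # 0.5: only females reproduce

def coupled_dynamics(n1, n2, m, generations=100, **params):
    """Two local populations coupled by a per-generation migration fraction m."""
    trajectory = []
    for _ in range(generations):
        r1, r2 = recruit(n1, **params), recruit(n2, **params)
        # A fraction m of each patch's recruits moves to the other patch.
        n1 = (1 - m) * r1 + m * r2
        n2 = (1 - m) * r2 + m * r1
        trajectory.append((n1, n2))
    return np.array(trajectory)

# Example: weak coupling between two patches with very different initial densities.
out = coupled_dynamics(n1=100.0, n2=5.0, m=0.05,
                       F_max=30.0, S_max=0.6, f=0.01, s=0.005)
print(out[-5:])
```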

Abstract:

We derive the torsion constraints and show the consistency of the equations of motion of four-dimensional Type II supergravity in superspace with the Type II sigma model. This is achieved by coupling the four-dimensional compactified Type II Berkovits' superstring to an N = 2 curved background and requiring that the sigma model have superconformal invariance at tree level. We compute this in a manifestly 4D N = 2 supersymmetric way. The constraints break the target conformal and SU(2) invariances, and the dilaton will be a conformal, SU(2) x U(1) compensator. For the Type II superstring in four dimensions, worldsheet supersymmetry requires two different compensators. One type is described by chiral and anti-chiral superfields; this compensator can be identified with a vector multiplet. The other Type II compensator is described by twist-chiral and twist-anti-chiral superfields and can be identified with a tensor hypermultiplet. Also, the superconformal invariance at tree level selects a particular gauge, where the matter is fixed but not the compensators. After imposing the reality conditions, we show that the Type II sigma model at tree level is consistent with the equations of motion of Type II supergravity in the string gauge.

Abstract:

The 1/N_c expansion in QCD (with N_c the number of colors) suggests using a potential from the meson sector (e.g., Richardson) for baryons. For light quarks a sigma field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modeling. All masses, M_N, m_σ, and m_ω, are found to scale with density. The equations are solved self-consistently.