986 results for idea-cache model


Relevance:

30.00%

Publisher:

Abstract:

The magnetization properties of aggregated ferrofluids are calculated by combining the chain formation model developed by Zubarev with the modified mean-field theory. Using moderate assumptions for the inter- and intrachain interactions we obtain expressions for the magnetization and initial susceptibility. When comparing the results of our theory to molecular dynamics simulations of the same model we find that at large dipolar couplings (lambda>3) the chain formation model appears to give better predictions than other analytical approaches. This supports the idea that chain formation is an important structural ingredient of strongly interacting dipolar particles.
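The abstract does not give the closed-form expressions; as an illustrative sketch of the modified mean-field ingredient only (the Zubarev chain-formation corrections are omitted), the first-order modified mean-field magnetization law can be written in dimensionless form as below. The function names and the reduced variables are assumptions for illustration, not the paper's notation.

```python
import numpy as np

def langevin(x):
    """Langevin function L(x) = coth(x) - 1/x (series x/3 for small x)."""
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-4
    safe = np.where(small, 1.0, x)          # dummy value where the series branch is used
    return np.where(small, x / 3.0, 1.0 / np.tanh(safe) - 1.0 / safe)

def mmf1_magnetization(alpha, chi_L):
    """First-order modified mean-field law: M/M_sat = L(alpha + chi_L * L(alpha)).

    alpha: Langevin parameter (dipole-field energy over kT)
    chi_L: Langevin susceptibility of the non-interacting dipolar gas
    """
    return langevin(alpha + chi_L * langevin(alpha))
```

Expanding around alpha = 0 reproduces the well-known modified mean-field initial susceptibility chi = chi_L * (1 + chi_L / 3); the chain-formation theory of the paper adds corrections on top of this baseline.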

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to present the model of the translation of particularly important ideas for the organization and its context, called mythical ideas.

Design/methodology/approach – The study is based on ethnographic research.

Findings – It is found that change processes based on mythical ideas are especially dynamic but also very vulnerable. The consequences of failure can be vital for the organization and its environment.

Originality/value – The paper explores the outcomes to which the translation of a mythical idea can lead. The findings are of value for people involved in organizational change processes.

Relevance:

30.00%

Publisher:

Abstract:

Solar plus heat pump systems are often very complex in design, sometimes with special heat pump arrangements and control. Detailed heat pump models can therefore make system simulations very slow, while still giving results that are not very accurate compared to real heat pump performance in a system. The idea here is to start from a standard measured performance map of test points for a heat pump according to EN 14825 and then determine characteristic parameters for a simplified correlation-based model of the heat pump. By plotting the heat pump test data in different ways, including as power input and output and not only as COP, a simplified relation could be seen. Using the same methodology as the QDT part of the EN 12975 collector test standard, it could be shown that a very simple model describes the heat pump test data very accurately, by identifying four parameters in the correlation equation found. © 2012 The Authors.
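The abstract does not state the correlation equation itself; as a hypothetical sketch of the identification step, one can assume a correlation that is linear in four coefficients and fit it to performance-map test points by least squares. All numbers below are invented for illustration, not EN 14825 data.

```python
import numpy as np

# Hypothetical EN 14825-style test points: (source temp, sink temp) -> heating power [kW]
T_src  = np.array([-7.0, 2.0, 7.0, 12.0, 7.0, 2.0])
T_sink = np.array([35.0, 35.0, 35.0, 35.0, 45.0, 45.0])
P_heat = np.array([4.1, 5.0, 5.6, 6.2, 5.2, 4.6])   # made-up measurements

# Assumed four-parameter correlation, linear in the coefficients:
#   P = c0 + c1*T_src + c2*T_sink + c3*T_src*T_sink
X = np.column_stack([np.ones_like(T_src), T_src, T_sink, T_src * T_sink])
coef, *_ = np.linalg.lstsq(X, P_heat, rcond=None)

def predict(t_src, t_sink):
    """Evaluate the fitted correlation at an operating point."""
    return coef[0] + coef[1] * t_src + coef[2] * t_sink + coef[3] * t_src * t_sink
```

Because the model is linear in its parameters, the identification reduces to one least-squares solve over the test points, which is what makes this kind of correlation model so much faster in system simulations than a detailed physical heat pump model.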

Relevance:

30.00%

Publisher:

Abstract:

A simple model incorporating rent-seeking into the standard neoclassical model of capital accumulation is presented. It embodies the idea that the performance of an economy depends on the efficiency of its institutions. It is shown that welfare is positively affected by the institutional efficiency, although output is not necessarily so. It is also shown that an economy with a monopolistic rent-seeker performs better than one with a competitive rent-seeking industry.

Relevance:

30.00%

Publisher:

Abstract:

The following paper was conducted with the support of several entrepreneurs and startups from Brazil. The aim of the research was to find out what impact the Business Model Canvas, abbreviated below as BMC, has on technology-oriented startups in Brazil. The first step of the study was to identify some general concepts of entrepreneurship, as well as the conditions and environment of the country. Afterwards, the focus was on defining different business model tools and concepts and comparing them to the BMC. After the literature review and meetings with several professionals in the area of entrepreneurship and startups, a questionnaire was formulated in order to conduct the qualitative study and identify the main impact of the tool. The questionnaire was answered by ten startups. In order to check the validity and credibility of the research outcomes, theory and investigator triangulation was used. As a result, the usage of the BMC could be evaluated against the outcomes and the theory, which showed that Brazilian tech startups are using Osterwalder's model for idea creation and for testing, validating and pivoting their business model. Interestingly, the research revealed that the entrepreneurs often use the tool not in the traditional way of printing it, but rather apply it as a thinking approach. Besides, the entrepreneurs focus mostly on developing a strong Value Proposition, Customer Segment and sustainable Revenue Streams, after which the remaining building blocks are built. Moreover, the research showed that the startups also use other concepts, such as the Customer Development Process or the Build-Measure-Learn Feedback Loop. These methodologies are often applied together with the BMC and help to identify the most sustainable components of the business idea. Keywords: Business

Relevance:

30.00%

Publisher:

Abstract:

Starting from the idea that economic systems belong to complexity theory, in which many agents interact with each other without a central control and these interactions are able to change the future behaviour of the agents and of the entire system, similar to a chaotic system, we extend the model of Russo et al. (2014) to carry out three experiments focusing on the interaction between Banks and Firms in an artificial economy. The first experiment concerns relationship banking where, according to the literature, the interaction over time between Banks and Firms is able to produce mutual benefits, mainly due to the reduction of the information asymmetry between them. The second experiment is related to information heterogeneity in the credit market, where the larger the bank, the higher its visibility in the credit market, increasing the number of consultations for new loans. Finally, the third experiment is about the effects on the credit market of the heterogeneity of the prices that Firms face in the goods market.
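As a heavily simplified illustration (this is not the Russo et al. model; every rule and number below is invented), the mechanisms of the first two experiments, relationship lending shrinking the information-asymmetry premium and size-proportional bank visibility, can be sketched as:

```python
import random

random.seed(42)

# Toy agents: banks differ in size (visibility in the credit market)
banks = [{"id": b, "size": s, "base_rate": 0.05} for b, s in enumerate([1.0, 2.0, 4.0])]
relationship = {}   # (firm, bank id) -> number of past loans

def choose_bank(firm):
    """Larger banks are consulted more often: weight proportional to size."""
    weights = [b["size"] for b in banks]
    return random.choices(banks, weights=weights)[0]

def loan_rate(firm, bank):
    """Relationship banking: each past interaction lowers the
    information-asymmetry premium added to the base rate."""
    history = relationship.get((firm, bank["id"]), 0)
    premium = 0.02 / (1 + history)
    return bank["base_rate"] + premium

# One simulation: ten firms seek a loan each period for 100 periods
for step in range(100):
    for firm in range(10):
        bank = choose_bank(firm)
        rate = loan_rate(firm, bank)
        relationship[(firm, bank["id"])] = relationship.get((firm, bank["id"]), 0) + 1
```

Even in this toy loop the two qualitative effects appear: repeated firm-bank pairs converge toward the base rate, and the largest bank accumulates the most consultations.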

Relevance:

30.00%

Publisher:

Abstract:

A novel strategy to handle the divergences typical of perturbative calculations is implemented for the Nambu-Jona-Lasinio model and its phenomenological consequences are investigated. The central idea of the method is to avoid the critical step involved in the regularization process, namely, the explicit evaluation of divergent integrals. This goal is achieved by assuming a regularization distribution in an implicit way and making use, in intermediary steps, only of very general properties of such a regularization. The finite parts are separated from the divergent ones and integrated free from effects of the regularization. The divergent parts are organized in terms of standard objects, which are independent of the (arbitrary) momenta running in internal lines of loop graphs. Through the analysis of symmetry relations, a set of properties of the divergent objects is identified, which we denominate consistency relations, reducing the number of divergent objects to only a few. The calculational strategy eliminates unphysical dependencies on the arbitrary choices for the routing of internal momenta, leading to ambiguity-free and symmetry-preserving physical amplitudes. We show that the imposition of scale properties on the basic divergent objects leads to a critical condition for the constituent quark mass such that the remaining arbitrariness is removed. The model becomes predictive in the sense that its phenomenological consequences do not depend on possible choices made in intermediary steps. Numerical results are obtained for physical quantities at the one-loop level for the pion and sigma masses and the pion-quark and sigma-quark coupling constants.

Relevance:

30.00%

Publisher:

Abstract:

A novel constructive heuristic algorithm for the network expansion planning problem is presented. The basic idea comes from Garver's work applied to the transportation model; the proposed algorithm, however, is for the DC model. Test results with the best-known systems in the literature are carried out to show the efficiency of the method.

Relevance:

30.00%

Publisher:

Abstract:

The frequency spectrum is inefficiently utilized, and cognitive radio has been proposed to achieve its full utilization. The central idea of cognitive radio is to allow the secondary user to use the spectrum concurrently with the primary user under the constraint of minimum interference. However, designing a model with minimum interference is a challenging task. In this paper, a transmission model based on cyclic generalized polynomial codes, discussed in [2] and [15], is proposed for improving spectrum utilization. The proposed model assures interference-free data transmission for the primary and secondary users. Furthermore, analytical results are presented to show that the proposed model utilizes the spectrum more efficiently than traditional models.

Relevance:

30.00%

Publisher:

Abstract:

The enzymatically catalyzed template-directed extension of a ssDNA/primer complex is an important reaction of extraordinary complexity. The DNA polymerase does not merely facilitate the insertion of dNMP; it also performs rapid screening of substrates to ensure a high degree of fidelity. Several kinetic studies have determined rate constants and equilibrium constants for the elementary steps that make up the overall pathway. This information is used to develop a macroscopic kinetic model, using an approach described by Ninio [Ninio J., 1987. Alternative to the steady-state method: derivation of reaction rates from first-passage times and pathway probabilities. Proc. Natl. Acad. Sci. U.S.A. 84, 663–667]. The principal idea of the Ninio approach is to track a single template/primer complex over time and to identify the expected behavior. The average time to insert a single nucleotide is a weighted sum of several terms, including the actual time to insert a nucleotide plus delays due to polymerase detachment from either the ternary (template-primer-polymerase) or quaternary (+nucleotide) complexes, and time delays associated with the identification and ultimate rejection of an incorrect nucleotide from the binding site. The passage times of all events and their probabilities of occurrence are expressed in terms of the rate constants of the elementary steps of the reaction pathway. The model accounts for variations in the average insertion time with different nucleotides as well as the influence of the G+C content of the sequence in the vicinity of the insertion site. Furthermore, the model provides estimates of error frequencies. If nucleotide extension is recognized as a competition between successful insertions and time-delaying events, it can be described as a binomial process with a probability distribution.
The distribution gives the probability of extending a primer/template complex by a certain number of base pairs and, in general, it maps annealed complexes into extension products.
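A minimal numerical sketch of the two ideas above, using toy numbers rather than the paper's fitted rate constants: the expected single-insertion time as a weighted sum of passage times, and the binomial distribution over how many of the template positions get extended.

```python
from math import comb

def mean_insertion_time(t_insert, t_detach, p_detach, t_reject, p_wrong):
    """Illustrative weighted sum: actual insertion time plus expected delays
    from polymerase detachment and from rejecting an incorrect nucleotide.
    (The full first-passage bookkeeping in Ninio's approach is more detailed.)"""
    return t_insert + p_detach * t_detach + p_wrong * t_reject

def extension_distribution(n_sites, p):
    """Binomial model: probability of extending the primer by exactly k bases
    over n_sites template positions, if each insertion succeeds with probability p."""
    return [comb(n_sites, k) * p**k * (1 - p)**(n_sites - k) for k in range(n_sites + 1)]

# Toy example: 10 template positions, 90% chance of a successful insertion per site
probs = extension_distribution(10, 0.9)
```

With these toy values the distribution peaks at 9 extended bases and the mean extension length is n*p = 9, illustrating how the binomial picture maps annealed complexes into a spread of extension products.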

Relevance:

30.00%

Publisher:

Abstract:

In this work we present the idea of how generalized ensembles can be used to simplify the operational study of non-additive physical systems. As an alternative to the usual methods of direct integration or mean-field theory, we show how the solution of the Ising model with infinite-range interactions is obtained by using a generalized canonical ensemble. We describe how the thermodynamical properties of this model in the presence of an external magnetic field are found from simple parametric equations. Without impairing the usual interpretation, we obtain the same critical behaviour as observed in traditional approaches.
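For reference, the traditional route the abstract compares against is the self-consistency equation of the infinite-range (Curie-Weiss) Ising model, m = tanh(beta*(J*m + h)); the sketch below solves it by fixed-point iteration (this is the standard mean-field benchmark, not the generalized-ensemble construction itself).

```python
import math

def magnetization(beta, J=1.0, h=0.0, m0=0.5, tol=1e-12):
    """Solve m = tanh(beta*(J*m + h)) by fixed-point iteration.

    Criticality sits at beta*J = 1: below it the h=0 solution is m = 0,
    above it a spontaneous magnetization appears.
    """
    m = m0
    for _ in range(100000):
        m_new = math.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m
```

For example, at beta*J = 2 the spontaneous magnetization converges to about 0.9575, and any small external field h > 0 selects the positive branch.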

Relevance:

30.00%

Publisher:

Abstract:

In this work, we report some results on the stochastic quantization of the spherical model. We start by reviewing some basic aspects of this method, with emphasis on the connection between the Langevin equation and supersymmetric quantum mechanics, aiming at the application of this connection to the spherical model. An intuitive idea is that, when applied to the spherical model, this gives rise to a supersymmetric version that is identified with the one studied in Phys. Rev. E 85, 061109 (2012). Before investigating this aspect in detail, we study the stochastic quantization of the mean spherical model, which is simpler to implement than the one with the strict constraint. We also highlight some points concerning more traditional methods discussed in the literature, such as canonical and path-integral quantization. To produce a supersymmetric version grounded in the Nicolai map, we investigate the stochastic quantization of the strict spherical model. We show that the result of this process is in fact an off-shell supersymmetric extension of the quantum spherical model (with the precise supersymmetric constraint structure). This analysis establishes a connection between the classical model and its supersymmetric quantum counterpart. The supersymmetric version constructed in this way is a more natural one and gives further support and motivation to investigate similar connections in other models in the literature.

Relevance:

30.00%

Publisher:

Abstract:

Doctoral programme: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería, Instituto Universitario (SIANI)

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes modelling tools and methods suited for complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed-parameter systems into a model hierarchy, and shows two possible mechanisms for doing so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
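A minimal sketch of the port-Hamiltonian form such frameworks build on, using a mass-spring-damper as the toy system (all parameter values are arbitrary): the state equation dx/dt = (J - R) grad H(x), with skew-symmetric interconnection J and positive semi-definite dissipation R, guarantees the power balance dH/dt = -grad H(x)^T R grad H(x) <= 0, which is the structural property that model operations and reductions are expected to preserve.

```python
import numpy as np

# Mass-spring-damper in port-Hamiltonian form:
#   x = (q, p),  H(x) = p^2/(2m) + k*q^2/2,  dx/dt = (J - R) grad H(x)
m, k, d = 1.0, 4.0, 0.3
J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # skew-symmetric interconnection
R = np.array([[0.0, 0.0], [0.0, d]])      # positive semi-definite dissipation

def H(x):
    q, p = x
    return p**2 / (2 * m) + k * q**2 / 2

def grad_H(x):
    q, p = x
    return np.array([k * q, p / m])

def f(x):
    return (J - R) @ grad_H(x)

# Power balance at a sample state: the skew part of J contributes nothing,
# so dH/dt = -grad_H^T R grad_H = -d*(p/m)^2
x = np.array([1.0, 0.5])
dHdt = grad_H(x) @ f(x)

# Short explicit-Euler run: the damped trajectory dissipates energy
xs = x.copy()
for _ in range(5000):
    xs = xs + 0.001 * f(xs)
```

The same (J, R, H) decomposition carries over to the distributed-parameter case after discretization, which is why structure-preserving discretization and model order reduction are the two mechanisms highlighted in the abstract.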

Relevance:

30.00%

Publisher:

Abstract:

Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry.
The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards (as opposed to becoming so only when the system is final) and is more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.