961 results for idea-cache model


Relevance: 30.00%

Publisher:

Abstract:

A simple model incorporating rent-seeking into the standard neoclassical model of capital accumulation is presented. It embodies the idea that the performance of an economy depends on the efficiency of its institutions. It is shown that welfare is positively affected by institutional efficiency, although output is not necessarily so. It is also shown that an economy with a monopolistic rent-seeker performs better than one with a competitive rent-seeking industry.
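
A hedged sketch of how rent-seeking can enter the neoclassical accumulation framework (illustrative only, not the paper's exact specification): let a fraction ρ ∈ [0, 1] of output be dissipated by rent-seekers, with more efficient institutions corresponding to a lower ρ, so that

$$\dot{k} = s\,(1-\rho)\,f(k) - \delta k,$$

where k is capital per worker, s the saving rate, f a neoclassical production function and δ the depreciation rate. In such a sketch the steady-state capital stock falls as ρ rises, one way institutional inefficiency can depress long-run performance even before welfare is considered.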

Relevance: 30.00%

Publisher:

Abstract:

The following paper was conducted with the support of several entrepreneurs and startups from Brazil. The aim of the research was to find out what impact the Business Model Canvas, hereafter abbreviated as BMC, has on technology-oriented startups in Brazil. The first step of the study was to identify some general concepts of entrepreneurship, as well as the conditions and environment of the country. Afterwards, the focus was on defining different business model tools and concepts and comparing them to the BMC. After the literature review and meetings with several professionals in the area of entrepreneurship and startups, a questionnaire was formulated in order to conduct the qualitative study and identify the main impact of the tool. The questionnaire was answered by ten startups. In order to check the validity and credibility of the research outcomes, theory and investigator triangulation was used. As a result, the usage of the BMC could be evaluated against the outcomes and the theory, which showed that Brazilian tech startups are using Osterwalder's model for idea creation and for testing, validating and pivoting their business model. Interestingly, the research revealed that the entrepreneurs often use the tool not in the traditional way of printing it, but rather apply it as a thinking approach. Besides, the entrepreneurs focus mostly on developing a strong Value Proposition, Customer Segment and sustainable Revenue Streams, after which the remaining building blocks are built. Moreover, the research showed that the startups are also using other concepts, such as the Customer Development Process or the Build-Measure-Learn Feedback Loop. These methodologies are often applied together with the BMC and help to identify the most sustainable components of the business idea. Keywords: Business

Relevance: 30.00%

Publisher:

Abstract:

Starting from the idea that economic systems fall within complexity theory, where many agents interact with each other without central control and these interactions are able to change the future behavior of the agents and of the entire system, similarly to a chaotic system, we extend the model of Russo et al. (2014) to carry out three experiments focusing on the interaction between Banks and Firms in an artificial economy. The first experiment concerns Relationship Banking where, according to the literature, the interaction over time between Banks and Firms is able to produce mutual benefits, mainly due to the reduction of the information asymmetry between them. The following experiment is related to information heterogeneity in the credit market, where the larger the bank, the higher its visibility in the credit market, increasing the number of consultations for new loans. Finally, the third experiment concerns the effects on the credit market of the heterogeneity of the prices that Firms face in the goods market.
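
A minimal sketch, in Python, of the size-driven visibility mechanism of the second experiment; the parameters and the size-proportional consultation rule are illustrative assumptions, not the exact rule of the extended Russo et al. (2014) model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_banks, n_firms = 10, 100

# Heterogeneous bank sizes: a few large banks, many small ones.
bank_size = rng.lognormal(mean=0.0, sigma=1.0, size=n_banks)

# Assumed visibility rule: the probability that a firm consults a bank
# grows in proportion to the bank's size.
visibility = bank_size / bank_size.sum()

# Each firm consults three distinct banks, sampled by visibility.
consultations = np.zeros(n_banks, dtype=int)
for _ in range(n_firms):
    chosen = rng.choice(n_banks, size=3, replace=False, p=visibility)
    consultations[chosen] += 1

print("bank sizes   :", np.round(bank_size, 2))
print("consultations:", consultations)
```

Under such a rule the consultation counts concentrate on the largest banks, qualitatively reproducing the information heterogeneity the experiment studies.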

Relevance: 30.00%

Publisher:

Abstract:

A novel strategy to handle the divergences typical of perturbative calculations is implemented for the Nambu-Jona-Lasinio model and its phenomenological consequences are investigated. The central idea of the method is to avoid the critical step involved in the regularization process, namely, the explicit evaluation of divergent integrals. This goal is achieved by assuming a regularization distribution in an implicit way and making use, in intermediary steps, only of very general properties of such a regularization. The finite parts are separated from the divergent ones and integrated free from effects of the regularization. The divergent parts are organized in terms of standard objects, which are independent of the (arbitrary) momenta running in internal lines of loop graphs. Through the analysis of symmetry relations, a set of properties for the divergent objects is identified, which we denominate consistency relations, reducing the number of divergent objects to only a few. The calculational strategy eliminates unphysical dependencies on the arbitrary choices for the routing of internal momenta, leading to ambiguity-free and symmetry-preserving physical amplitudes. We show that the imposition of scale properties for the basic divergent objects leads to a critical condition for the constituent quark mass such that the remaining arbitrariness is removed. The model becomes predictive in the sense that its phenomenological consequences do not depend on possible choices made in intermediary steps. Numerical results are obtained for physical quantities at the one-loop level for the pion and sigma masses and the pion-quark and sigma-quark coupling constants.
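
A sketch of the kind of algebraic identity that implicit-regularization strategies iterate under the integration sign (the precise form used by the authors may differ):

$$\frac{1}{(k+p)^2 - m^2} \;=\; \frac{1}{k^2 - m^2} \;-\; \frac{p^2 + 2\,p\cdot k}{\left(k^2 - m^2\right)\left[(k+p)^2 - m^2\right]},$$

applied repeatedly until the external momentum p appears only in finite integrals, while the divergences are confined to integrals over k alone, the "standard objects" mentioned above.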

Relevance: 30.00%

Publisher:

Abstract:

A novel constructive heuristic algorithm for the network expansion planning problem is presented. The basic idea comes from Garver's work applied to the transportation model; the proposed algorithm, however, is for the DC model. Test results with the systems most commonly used in the literature are carried out to show the efficiency of the method.

Relevance: 30.00%

Publisher:

Abstract:

Frequency spectrum is utilized inefficiently, and cognitive radio has been proposed to achieve its full utilization. The central idea of cognitive radio is to allow the secondary user to use the spectrum concurrently with the primary user, under the constraint of minimum interference. However, designing a model with minimum interference is a challenging task. In this paper, a transmission model based on the cyclic generalized polynomial codes discussed in [2] and [15] is proposed for improving the utilization of spectrum. The proposed model assures interference-free data transmission for the primary and secondary users. Furthermore, analytical results are presented to show that the proposed model utilizes spectrum more efficiently than traditional models.
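
The paper's codes are the generalized polynomial codes of its references [2] and [15]; purely for orientation, here is a minimal classical cyclic encoder over GF(2), in which codewords are closed under cyclic shifts because encoding is polynomial multiplication modulo x^n − 1 (the (7,4) code and generator below are textbook examples, not the paper's construction):

```python
import numpy as np

def cyclic_encode(msg_bits, gen_bits, n):
    """Encode by computing m(x) * g(x) mod (x^n - 1) over GF(2).
    Coefficient lists are ordered from lowest to highest degree."""
    prod = np.convolve(msg_bits, gen_bits) % 2   # polynomial product over GF(2)
    code = np.zeros(n, dtype=int)
    for i, c in enumerate(prod):                 # reduce modulo x^n - 1
        code[i % n] ^= int(c)
    return code

g = [1, 1, 0, 1]    # g(x) = 1 + x + x^3, generator of a (7,4) cyclic code
m = [1, 0, 1, 1]    # message m(x) = 1 + x^2 + x^3
print(cyclic_encode(m, g, 7))
```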

Relevance: 30.00%

Publisher:

Abstract:

The enzymatically catalyzed template-directed extension of an ssDNA/primer complex is an important reaction of extraordinary complexity. The DNA polymerase does not merely facilitate the insertion of dNMP; it also performs rapid screening of substrates to ensure a high degree of fidelity. Several kinetic studies have determined rate constants and equilibrium constants for the elementary steps that make up the overall pathway. This information is used to develop a macroscopic kinetic model, using an approach described by Ninio [Ninio J., 1987. Alternative to the steady-state method: derivation of reaction rates from first-passage times and pathway probabilities. Proc. Natl. Acad. Sci. U.S.A. 84, 663–667]. The principal idea of the Ninio approach is to track a single template/primer complex over time and to identify the expected behavior. The average time to insert a single nucleotide is a weighted sum of several terms, including the actual time to insert a nucleotide plus delays due to polymerase detachment from either the ternary (template-primer-polymerase) or quaternary (+nucleotide) complexes, and time delays associated with the identification and ultimate rejection of an incorrect nucleotide from the binding site. The passage times of all events and their probabilities of occurrence are expressed in terms of the rate constants of the elementary steps of the reaction pathway. The model accounts for variations in the average insertion time with different nucleotides as well as the influence of the G+C content of the sequence in the vicinity of the insertion site. Furthermore, the model provides estimates of error frequencies. If nucleotide extension is recognized as a competition between successful insertions and time-delaying events, it can be described as a binomial process with a probability distribution. The distribution gives the probability of extending a primer/template complex by a certain number of base pairs and, in general, it maps annealed complexes into extension products.
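
A toy version of the weighted-sum logic with just two delay branches (detachment/rebinding and misincorporation rejection) and geometric retries; the probabilities and times below are placeholders, whereas the paper derives these quantities from the elementary rate constants:

```python
def mean_insertion_time(t_ins, p_det, t_det, p_rej, t_rej):
    """Expected time to insert one nucleotide when each attempt either
    succeeds (probability 1 - p_det - p_rej, taking t_ins), loses t_det
    to polymerase detachment and rebinding, or loses t_rej to sampling
    and rejecting a wrong nucleotide, after which the attempt restarts."""
    p_success = 1.0 - p_det - p_rej
    # Solves T = p_success*t_ins + p_det*(t_det + T) + p_rej*(t_rej + T).
    return (p_success * t_ins + p_det * t_det + p_rej * t_rej) / p_success

# Placeholder values in arbitrary time units, for illustration only.
print(mean_insertion_time(t_ins=1.0, p_det=0.05, t_det=20.0, p_rej=0.3, t_rej=0.5))
```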

Relevance: 30.00%

Publisher:

Abstract:

In this work we present the idea of how generalized ensembles can be used to simplify the operational study of non-additive physical systems. As an alternative to the usual methods of direct integration or mean-field theory, we show how the solution of the Ising model with infinite-range interactions is obtained by using a generalized canonical ensemble. We describe how the thermodynamical properties of this model in the presence of an external magnetic field are obtained from simple parametric equations. Without impairing the usual interpretation, we obtain a critical behaviour identical to that observed in traditional approaches.
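
For reference, the benchmark that the generalized-ensemble route must reproduce is the textbook mean-field solution of the infinite-range Ising model (quoted here for context; these are not the paper's parametric equations):

$$m = \tanh\!\big(\beta\,(J m + h)\big),$$

whose nonzero solutions at h = 0 appear below the critical point β_c J = 1, the same critical behaviour the paper recovers.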

Relevance: 30.00%

Publisher:

Abstract:

In this work, we report some results on the stochastic quantization of the spherical model. We start by reviewing some basic aspects of this method, with emphasis on the connection between the Langevin equation and supersymmetric quantum mechanics, aiming at the application of the corresponding connection to the spherical model. The intuitive idea is that, when applied to the spherical model, this gives rise to a supersymmetric version that is identified with the one studied in Phys. Rev. E 85, 061109 (2012). Before investigating this aspect in detail, we study the stochastic quantization of the mean spherical model, which is simpler to implement than the one with the strict constraint. We also highlight some points concerning more traditional methods discussed in the literature, such as canonical and path-integral quantization. To produce a supersymmetric version grounded in the Nicolai map, we investigate the stochastic quantization of the strict spherical model. We show in fact that the result of this process is an off-shell supersymmetric extension of the quantum spherical model (with the precise supersymmetric constraint structure). This analysis establishes a connection between the classical model and its supersymmetric quantum counterpart. The supersymmetric version constructed in this way is a more natural one and gives further support and motivation to investigate similar connections in other models in the literature.
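
For context, the starting point of stochastic quantization in the Parisi-Wu sense is the Langevin equation in a fictitious time τ (a standard formula, quoted here for orientation):

$$\frac{\partial \phi(x,\tau)}{\partial \tau} = -\frac{\delta S[\phi]}{\delta \phi(x,\tau)} + \eta(x,\tau), \qquad \langle \eta(x,\tau)\,\eta(x',\tau') \rangle = 2\,\delta(x-x')\,\delta(\tau-\tau'),$$

whose equilibrium distribution reproduces the Euclidean correlation functions; the associated Nicolai-map structure is what surfaces as the supersymmetry discussed above.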

Relevance: 30.00%

Publisher:

Abstract:

Doctoral program: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería, Instituto Universitario (SIANI)

Relevance: 30.00%

Publisher:

Abstract:

This thesis describes modelling tools and methods suited for complex systems (systems that are typically represented by a plurality of models). The basic idea is that all models representing the system should be linked by well-defined model operations in order to build a structured repository of information, a hierarchy of models. The port-Hamiltonian framework is a good candidate for solving this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed-parameter systems into a model hierarchy, and shows two possible mechanisms for doing so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
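
For orientation, the standard finite-dimensional input-state-output port-Hamiltonian form (a textbook formulation that the thesis's distributed-parameter and reduced models refine) is

$$\dot{x} = \big[J(x) - R(x)\big]\,\nabla H(x) + B(x)\,u, \qquad y = B(x)^{\top}\,\nabla H(x),$$

with J = −J^⊤ the interconnection structure, R = R^⊤ ⪰ 0 the dissipation and H the stored energy. Interconnecting two such systems through their ports yields another port-Hamiltonian system, which is why the framework supports hierarchy-building model operations natively.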

Relevance: 30.00%

Publisher:

Abstract:

Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards, as opposed to becoming so only when the system is final, and is more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
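
To make point (iii) concrete, here is a toy sketch of a conflict-avoiding code layout for a direct-mapped instruction cache; the cache geometry, function sizes and greedy placement rule are illustrative assumptions, not the thesis's actual optimisation method:

```python
CACHE_SIZE = 4096   # bytes (illustrative direct-mapped I-cache)
LINE_SIZE = 64      # bytes per cache line
N_SETS = CACHE_SIZE // LINE_SIZE

def sets_occupied(addr, size):
    """Cache sets touched by a code block at address `addr`, `size` bytes long."""
    first_line = addr // LINE_SIZE
    last_line = (addr + size - 1) // LINE_SIZE
    return {line % N_SETS for line in range(first_line, last_line + 1)}

def place(functions):
    """Greedily place hot functions at increasing addresses so that no two
    share a cache set (assumes their combined footprint fits the cache,
    so the search always terminates)."""
    layout, used_sets, addr = {}, set(), 0
    for name, size in functions:
        while sets_occupied(addr, size) & used_sets:
            addr += LINE_SIZE               # slide forward one line and retry
        layout[name] = addr
        used_sets |= sets_occupied(addr, size)
        addr = ((addr + size + LINE_SIZE - 1) // LINE_SIZE) * LINE_SIZE
    return layout

# Hypothetical hot functions of an embedded control application.
print(place([("isr", 256), ("control_loop", 512), ("sensor_filter", 384)]))
```

Because the placed functions never contend for the same cache sets, their hit/miss behaviour no longer depends on where the linker happens to put other code, which is the kind of layout-induced jitter the method removes.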

Relevance: 30.00%

Publisher:

Abstract:

The motivating problem concerns the estimation of the growth curve of solitary corals that follow the nonlinear von Bertalanffy Growth Function (VBGF). The most common parameterization of the VBGF for corals is based on two parameters: the ultimate length L∞ and the growth rate k. One aim was to find a more reliable method for estimating these parameters, one able to capture the influence of environmental covariates. The main issue with current methods is that they force the linearization of the VBGF and neglect intra-individual variability. The idea was to use a hierarchical nonlinear model, which has the appealing features of taking into account the influence of collection sites, possible intra-site measurement correlation and variance heterogeneity, and of handling the influence of environmental factors and all the reliable information that might influence coral growth. This method was applied to two databases of different solitary corals, Balanophyllia europaea and Leptopsammia pruvoti, collected at six different sites under different environmental conditions, and introduced a decisive improvement in the results. Nevertheless, the theory of the energy balance in growth ascertains the linear correlation of the two parameters and the independence of the ultimate length L∞ from the influence of environmental covariates, so a further aim of the thesis was to propose a new parameterization based on the ultimate length and a parameter c which explicitly describes the part of growth ascribable to site-specific conditions such as environmental factors. We explored the possibility of estimating the parameters of this new VBGF parameterization via the nonlinear hierarchical model. Again there was a general improvement with respect to traditional methods. The results of the two parameterizations were similar, although a very slight improvement was observed in the new one, which is nevertheless more suitable from a theoretical point of view when considering environmental covariates.
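
For reference, the classical two-parameter VBGF referred to above (the thesis's reparameterization in terms of L∞ and c is not reproduced here) is

$$L(t) = L_\infty \left(1 - e^{-k\,t}\right),$$

which is nonlinear in k; linearizing it, e.g. through a Ford-Walford plot, is precisely the step that the hierarchical nonlinear model avoids.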

Relevance: 30.00%

Publisher:

Abstract:

The beta decay of free neutrons is a strongly over-determined process in the Standard Model (SM) of particle physics and is described by a multitude of observables. Some of those observables are sensitive to physics beyond the SM; the correlation coefficients of the involved particles, for example, belong to them. The spectrometer aSPECT was designed to measure precisely the shape of the proton energy spectrum and to extract from it the electron antineutrino angular correlation coefficient "a". A first test period (2005/2006) provided the proof of principle. The limiting influence of uncontrollable background conditions in the spectrometer made it impossible to extract a reliable value for the coefficient "a" (publication: Baessler et al., 2008, Europhys. Journ. A, 38, p. 17–26). A second measurement cycle (2007/2008) aimed to improve on the relative accuracy da/a = 5% of previous experiments (Stratowa et al. (1978), Byrne et al. (2002)). I performed the analysis of the data taken there, which is the emphasis of this doctoral thesis. A central point is the background studies. The systematic impact of background on "a" was reduced to da/a (syst.) = 0.61%. The statistical accuracy of the analyzed measurements is da/a (stat.) = 1.4%. Besides, saturation effects of the detector electronics, which were observed at the beginning, were investigated. These turned out not to be correctable at a sufficient level. An applicable idea of how to avoid the saturation effects is discussed in the last chapter.
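
For context, "a" is defined through the standard angular-correlation form of the differential decay rate (Jackson, Treiman and Wyld):

$$\mathrm{d}\Gamma \;\propto\; 1 + a\,\frac{\vec{p}_e \cdot \vec{p}_{\bar\nu}}{E_e\,E_{\bar\nu}},$$

so "a" governs the relative emission directions of electron and antineutrino. Since the antineutrino escapes detection, its imprint is recovered from the recoil-proton energy spectrum, which is what aSPECT measures.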

Relevance: 30.00%

Publisher:

Abstract:

Until a few years ago, 3D modelling was a topic confined to a professional environment. Nowadays technological innovations, the 3D printer above all, have attracted novice users to this application field. This sudden breakthrough was not supported by adequate software solutions. The 3D editing tools currently available do not assist the non-expert user during the various stages of generation, interaction and manipulation of 3D virtual models. This is mainly due to the current paradigm, which is largely built on two-dimensional input/output devices and strongly affected by obvious geometrical constraints. We have identified three main phases that characterize the creation and management of 3D virtual models. We investigated these directions, evaluating and simplifying the classic editing techniques in order to propose more natural and intuitive tools in a pure 3D modelling environment. In particular, we focused on freehand sketch-based modelling to create 3D virtual models, on interaction and navigation in a 3D modelling environment, and on advanced editing tools for free-form deformation and object composition. In pursuing these goals we asked how new gesture-based interaction technologies can be successfully employed in 3D modelling environments, how we could improve depth perception and interaction in 3D environments, and which operations could be developed to simplify the classical virtual-model editing paradigm. Our main aim was to propose a set of solutions with which a common user can realize an idea in a 3D virtual model, drawing in the air just as they would on paper. Moreover, we tried to use gestures and mid-air movements to explore and interact in 3D virtual environments, and we studied simple and effective 3D form transformations. The work was carried out adopting the discrete representation of the models, thanks to its intuitiveness, but especially because it is full of open challenges.