133 results for Process Error
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple-access communication systems in which optimal multiuser detection is performed at the receiver, as the number and identities of active users are allowed to change at each transmission time. The system dynamics are governed by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed as the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying multiuser efficiency to the MMSE, we extend it to the case of a dynamic channel, and derive lower and upper bounds for the MMSE (and, thus, for η as well) that hold in the limit of large signal-to-noise ratios and increasingly large observation time T.
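For context, in the static large-system setting the fixed-point equation referred to above is commonly written in the form popularised by Guo and Verdú; the notation below is an assumption of this note, not necessarily the paper's:

```latex
% Fixed-point equation for the multiuser efficiency \eta in the static
% large-system limit (notation assumed, not taken from the abstract):
%   \beta             : system load (users per degree of freedom)
%   \gamma            : a user's signal-to-noise ratio
%   \mathrm{mmse}(s)  : MMSE of estimating X from \sqrt{s}\,X + N,
%                       with N \sim \mathcal{N}(0,1)
\begin{equation*}
  \frac{1}{\eta} \;=\; 1 \;+\; \beta\,
  \mathbb{E}\!\left[\gamma\,\mathrm{mmse}(\eta\,\gamma)\right]
\end{equation*}
```

The paper's contribution, per the abstract, is extending this relation to the dynamic (Markov-occupancy) channel and bounding the MMSE there.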
Abstract:
In this paper we propose a method for computing JPEG quantization matrices for a given mean square error (MSE) or PSNR. We then employ our method to compute JPEG standard progressive operation mode definition scripts using a quantization approach. It is therefore no longer necessary to use a trial-and-error procedure to obtain a desired PSNR and/or definition script, reducing cost. First, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Second, we study the JPEG standard progressive operation mode from a quantization-based approach. A relationship between the measured image quality at a given stage of the coding process and a quantization matrix is found. Thus, the definition-script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The estimated PSNR usually has an error smaller than 1 dB, and this error decreases for high PSNR values. Definition scripts may be generated while avoiding an excessive number of stages and removing small stages that do not contribute a noticeable image-quality improvement during decoding.
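The PSNR–MSE relationship underlying this kind of target-driven design is fixed by definition. A minimal sketch, with hypothetical helper names and the standard high-resolution approximation D ≈ Δ²/12 for a uniform quantizer (the paper instead uses an exact Laplacian model), shows how a target PSNR maps to a quantizer step size:

```python
import math

def mse_for_psnr(psnr_db, peak=255.0):
    """Invert PSNR = 10*log10(peak^2 / MSE) to get the target MSE."""
    return peak ** 2 / 10 ** (psnr_db / 10.0)

def step_for_mse(target_mse):
    """High-resolution approximation for a uniform quantizer: D ~ step^2 / 12.
    (The paper refines this with an exact Laplacian source model.)"""
    return math.sqrt(12.0 * target_mse)

target_mse = mse_for_psnr(40.0)   # target MSE for 40 dB on 8-bit images
step = step_for_mse(target_mse)   # step size achieving that MSE, approximately
```

In the paper's setting this scalar step would be distributed across the 64 DCT-coefficient quantizers subject to the local visual constraints.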
Abstract:
Rapid manufacturing is an advanced manufacturing technology based on layer-by-layer manufacturing to produce a part. This paper presents experimental work carried out to investigate the effects of scan speed, layer thickness, and build direction on the following part features: dimensional error, surface roughness, and mechanical properties for DMLS with DS H20 powder and SLM with CL 20 powder (1.4404/AISI 316L). Findings were evaluated using ANOVA. According to the experimental results, build direction has a significant effect on part quality in terms of dimensional error and surface roughness. For the SLM process, the build direction has no influence on mechanical properties. The results of this research help industry estimate part quality and mechanical properties before producing parts with additive manufacturing using iron-based powders.
Abstract:
This comment corrects errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation: the previously presented results do not maximize the log-likelihood function, and at the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation-probability estimates may fall outside the interval [0,1]. We solve this problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
Abstract:
In this paper we explore the effect of bounded rationality on the convergence of individual behavior toward equilibrium. In the context of a Cournot game with a unique and symmetric Nash equilibrium, firms are modeled as adaptive economic agents through a genetic algorithm. Computational experiments show that (1) there is remarkable heterogeneity across identical but boundedly rational agents; (2) such individual heterogeneity is not simply a consequence of the random elements of the genetic algorithm; (3) the more rational agents are, in terms of memory abilities and pre-play evaluation of strategies, the less heterogeneous they are in their actions. In the limit case of full rationality, the outcome converges to the standard result of uniform individual behavior.
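A minimal version of this adaptive-agent setup can be sketched as follows: a linear Cournot duopoly with inverse demand P = a − b·Q and unit cost c, so the symmetric Nash quantity is (a − c)/(3b). The parameter values and genetic-algorithm details below are illustrative assumptions, not the paper's design:

```python
import random

def profit(q, q_rival, a=10.0, b=1.0, c=1.0):
    """Linear Cournot profit: inverse demand P = a - b*(q + q_rival), unit cost c."""
    return (a - b * (q + q_rival) - c) * q

def evolve(generations=200, pop_size=20, seed=0):
    """Two firms, each evolving its own pool of candidate quantities
    (individual learning). Fitness is profit against the rival's current
    action; the top half survives and spawns mutated copies."""
    rng = random.Random(seed)
    pops = [[rng.uniform(0.0, 5.0) for _ in range(pop_size)] for _ in range(2)]
    actions = [p[0] for p in pops]
    for _ in range(generations):
        for i in (0, 1):
            rival = actions[1 - i]
            pops[i].sort(key=lambda q: profit(q, rival), reverse=True)
            elite = pops[i][: pop_size // 2]
            mutants = [min(5.0, max(0.0, q + rng.gauss(0.0, 0.1)))
                       for q in elite]
            pops[i] = elite + mutants
            actions[i] = pops[i][0]
    return actions

q1, q2 = evolve()  # both quantities should approach the Nash value 3.0
```

The residual mutation noise is what keeps the two ex-ante identical agents heterogeneous, mirroring the paper's finding (1); shrinking the mutation scale plays the role of increasing rationality.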
Abstract:
According to the account of European Union (EU) decision making proposed in this paper, decision making is a bargaining process during which actors shift their policy positions with a view to reaching agreements on controversial issues.
Abstract:
This study focuses on identification and exploitation processes among Finnish design entrepreneurs (i.e. self-employed industrial designers). More specifically, it strives to find out what design entrepreneurs do when they create new ventures, how venture ideas are identified, and how entrepreneurial processes are organized to identify and exploit such venture ideas in the given industrial context. Indeed, what do educated and creative individuals do when they decide to create new ventures, where do the venture ideas originally come from, and how are venture ideas identified and developed into viable business concepts that are introduced to the market? From an academic perspective, there is a need to increase our understanding of the interaction between the identification and exploitation of emerging ventures, in this and other empirical contexts. Rather than assuming that venture ideas are constant in time, this study examines how emerging ideas are adjusted to enable exploitation in dynamic market settings. It builds on insights from previous entrepreneurship process research. The interpretations from the theoretical discussion build on the assumption that the subprocesses of identification and exploitation interact and, moreover, are closely entwined with each other (e.g. McKelvie & Wiklund, 2004; Davidsson, 2005). This explanation challenges the common assumption that entrepreneurs first identify venture ideas and then exploit them (e.g. Shane, 2003). The assumption is that exploitation influences identification, just as identification influences exploitation. Based on interviews with design entrepreneurs and external actors (e.g. potential customers, suppliers and collaborators), it appears that the identification and exploitation of venture ideas are carried out in close interaction among a number of actors, rather than by entrepreneurs alone.
Due to their available resources, design entrepreneurs have a desire to focus on identification-related activities and to find external actors that take care of exploitation-related activities. The involvement of external actors may have a direct impact on decision-making and on various activities along the processes of identification and exploitation, which is something that previous research does not particularly emphasize. For instance, Bhave (1994) acknowledges both operative and strategic feedback from the market, but does not explain how external parties are actually involved in decision-making and in carrying out various activities along the entrepreneurial process.
Abstract:
This paper has three objectives. First, it aims to reveal the logic of interest-rate setting pursued by the monetary authorities of 12 new EU members. Estimating an augmented Taylor rule, we find that this setting was not always consistent with official monetary policy. Second, we seek to shed light on the inflation process of these countries. To this end, we estimate an open-economy Phillips curve (PC). Our main finding is that inflation rates were driven not only by backward persistence but also by a forward-looking component. Finally, we assess the viability of existing monetary arrangements for price stability. The conditional inflation variance obtained from GARCH estimation of the PC is used for this purpose. We conclude that inflation targeting is preferable to an exchange-rate peg because it allowed the inflation rate to decrease and anchored its volatility.
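A basic Taylor rule of the form i_t = α + β·π_t + γ·gap_t can be estimated by ordinary least squares; the sketch below, with synthetic data and variable names that are assumptions of this illustration rather than the paper's specification, recovers known coefficients from noise-free data:

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved with Gaussian elimination and partial pivoting.
    X: list of regressor rows, y: list of observations."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for c in range(k):                       # forward elimination
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in reversed(range(k)):             # back substitution
        beta[r] = (b[r] - sum(A[r][j] * beta[j]
                              for j in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic rule: i_t = 1.0 + 1.5*inflation + 0.5*output_gap (no noise)
random.seed(1)
rows, target = [], []
for _ in range(200):
    pi, gap = random.uniform(0.0, 10.0), random.uniform(-3.0, 3.0)
    rows.append([1.0, pi, gap])
    target.append(1.0 + 1.5 * pi + 0.5 * gap)
alpha, beta, gamma = ols(rows, target)
```

The paper's augmented rule would add further regressors (e.g. an interest-rate lag or exchange-rate term) as extra columns of X.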
Abstract:
The development of advanced driver assistance systems (ADAS) is currently one of the research areas of greatest interest at the Centre de Visió per Computador. Using information acquired by sensors installed in a vehicle, ADAS assist the driver in order to avoid dangerous situations. Validating these systems, however, requires "manually" obtaining precise data describing the driving environment: a costly task subject to human error. To solve this problem, this project implements IOCS, a driving simulator built on top of a robotics simulator, capable of creating realistic driving environments while simultaneously obtaining both the environment data inferred by an ADAS and the data that describe the environment objectively. This functionality greatly simplifies the current validation process for driver assistance systems.
Abstract:
ISAFRUIT is an integrated European Union Project focussed on increasing fruit consumption as a means to improve human health, through evaluating the fruit chain and addressing bottlenecks therein. The innovations which are being developed throughout the ISAFRUIT Project have been analysed to determine both the success factors and the obstacles in reaching the commercialisation stage. Only 9.58% of the deliverables planned within the Project were focussed on developing technological innovations. There is evidence, however, of successes in the development of new innovations arising from the ISAFRUIT Project, with several other potential innovations in the pipeline. Of the technologies identified, 67% are still at the "invention stage"; that is, the stage prior to bridging the so-called "valley of death", the stage between an invention and an innovation. Those which are considered to have moved over the "valley of death" either had industry partners included in the Project, or had consulted with industry to ensure that the technology was relevant, or met a recognised industry need. Many of the technologies which made less progress did not have the same interactions with industry. A number of other issues were identified which prevented further progress towards innovation. The need for scientists to publish scientific papers, both for their career pathways and to increase their chances of future funding, was identified as one issue, although the filing of patents is now becoming more accepted and recognised. The patenting system is considered complex by many scientists and is not well understood. Finally, agreements between partners on the sharing of intellectual property rights can cause a delay in the innovation process.
Abstract:
When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method (HBIM) is the choice of the power of the highest-order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBIM models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and the results compared to the approximate solution. An ablation problem is also analysed and the results compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem in which the boundary temperature increases exponentially is then analysed; this highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed, without analytical or numerical results to assess the accuracy.
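For the classical thermal problem of a semi-infinite solid with a fixed surface temperature, choosing the exponent by minimising an error function can be sketched as follows. The sketch assumes the profile u = (1 − x/δ)^n with δ(t) = √(2n(n+1)t) from the heat-balance condition, and minimises the squared residual of the heat equation, E(n) = ∫₀^δ (u_t − u_xx)² dx, over a grid of exponents; the restriction n ≥ 2 (which keeps u_xx bounded) and all names are choices of this illustration, not the paper's:

```python
import math

def residual_error(n, num=4000):
    """Squared residual of the heat equation, E(n) = integral over [0, delta]
    of (u_t - u_xx)^2 dx, evaluated at t = 1 for u = (1 - x/delta)^n with
    delta(t) = sqrt(2n(n+1)t). Trapezoidal rule in s = x/delta."""
    delta = math.sqrt(2.0 * n * (n + 1.0))
    total = 0.0
    for i in range(num + 1):
        s = i / num
        u_t = 0.5 * n * s * (1.0 - s) ** (n - 1.0)     # uses delta'/delta = 1/(2t)
        u_xx = (n - 1.0) * (1.0 - s) ** (n - 2.0) / (2.0 * (n + 1.0))
        weight = 0.5 if i in (0, num) else 1.0
        total += weight * (u_t - u_xx) ** 2
    return delta * total / num

# Grid search over n in [2.0, 3.5] for the residual-minimising exponent;
# note no exact solution is needed anywhere in the computation.
n_opt = min((2.0 + 0.01 * k for k in range(151)), key=residual_error)
```

This mirrors the abstract's key point: the exponent is fixed by minimising an error functional built from the governing equation itself, not by comparison with a known solution.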
Abstract:
While much of the literature on immigrants' assimilation has focused on countries with a long tradition of receiving immigrants and with flexible labor markets, very little is known about how immigrants adjust to other types of host economies. With its severe dual labor market and an unprecedented immigration boom, Spain presents a rather unique setting in which to analyze immigrants' assimilation process. Using data from the 2000 to 2008 Labor Force Survey, we find that immigrants are more occupationally mobile than natives, and that much of this greater flexibility is explained by immigrants' assimilation process soon after arrival. However, we find little evidence of convergence, especially among women and high-skilled immigrants. This suggests that instead of integrating, immigrants occupationally segregate, providing evidence consistent with both imperfect substitutability and immigrants' human capital being undervalued. Additional evidence on the assimilation of earnings and the incidence of permanent employment at different skill levels also supports the hypothesis of segmented labor markets.