993 results for code analysis
Abstract:
This paper analyzes the convergence behavior of the least mean square (LMS) filter when used in an adaptive code division multiple access (CDMA) detector consisting of a tapped delay line with adjustable tap weights. The sampling rate may be equal to or higher than the chip rate, and these correspond to chip-spaced (CS) and fractionally spaced (FS) detection, respectively. It is shown that CS and FS detectors with the same time-span exhibit identical convergence behavior if the baseband received signal is strictly bandlimited to half the chip rate. Even in the practical case when this condition is not met, deviations from this observation are imperceptible unless the initial tap-weight vector gives an extremely large mean squared error (MSE). This phenomenon is carefully explained with reference to the eigenvalues of the correlation matrix when the input signal is not perfectly bandlimited. The inadequacy of the eigenvalue spread of the tap-input correlation matrix as an indicator of the transient behavior and the influence of the initial tap-weight vector on convergence speed are highlighted. Specifically, initialization within the signal subspace or at the origin leads to much faster convergence compared with initialization in the noise subspace.
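The effect of the starting point can be illustrated with a generic LMS learning-curve experiment. The NumPy sketch below is only schematic: the filter length, step size and white-noise signal model are our own assumptions, not the paper's CS/FS CDMA setup; it merely shows how the initial tap-weight vector shifts the transient MSE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (our assumptions, not the paper's CDMA setup).
n_taps, n_iter, mu = 8, 2000, 0.01
w_opt = rng.standard_normal(n_taps)            # unknown optimum tap weights

def lms_mse(w0, n_runs=50):
    """Ensemble-averaged squared-error learning curve of LMS started at w0."""
    curve = np.zeros(n_iter)
    for _ in range(n_runs):
        w = w0.copy()
        x = rng.standard_normal(n_iter + n_taps)          # white tap inputs
        for k in range(n_iter):
            u = x[k:k + n_taps]                           # tap-input vector
            d = w_opt @ u + 0.01 * rng.standard_normal()  # desired response
            e = d - w @ u                                 # a priori error
            w += mu * e * u                               # LMS update
            curve[k] += e ** 2
    return curve / n_runs

mse_origin = lms_mse(np.zeros(n_taps))                 # start at the origin
mse_bad = lms_mse(10.0 * rng.standard_normal(n_taps))  # start far away
print(mse_origin[:5].mean(), mse_bad[:5].mean())       # early-transient MSE
```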
Abstract:
Adaptive filters used in code division multiple access (CDMA) receivers to counter interference have been formulated both with and without the assumption that training symbols are transmitted. They are known as training-based and blind detectors, respectively. We show that the convergence behaviour of the blind minimum-output-energy (MOE) detector can be derived quite easily, unlike what was implied by the procedure outlined in a previous paper. The simplification results from the observation that the correlation matrix determining convergence performance can be made symmetric, after which many standard results from the literature on least mean square (LMS) filters apply immediately.
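The symmetrization observation can be sketched in generic notation (ours, not necessarily the papers'): write the constrained MOE detector as a fixed signature component plus an adaptive component orthogonal to it.

```latex
% Generic sketch of the symmetrization step (our notation, not the paper's).
% Blind MOE: minimize E[(w^T r_k)^2] subject to w^T s = 1, writing
% w_k = s/(s^T s) + x_k with x_k constrained to be orthogonal to s.
\[
  x_{k+1} = x_k - \mu \,\big(w_k^{\top} r_k\big)\, P\, r_k,
  \qquad
  P = I - \frac{s\,s^{\top}}{s^{\top} s} .
\]
% Taking expectations, the mean tap-error trajectory is governed by P R,
% which is not symmetric in general; but since x_k stays in range(P) and
% P^2 = P, the operator acting on x_k equals the symmetric matrix P R P:
\[
  \mathbb{E}[x_{k+1}]
    = \big(I - \mu\, P R P\big)\,\mathbb{E}[x_k]
      - \mu\, P R \frac{s}{s^{\top} s},
  \qquad R = \mathbb{E}\big[r_k r_k^{\top}\big],
\]
% so standard LMS convergence results (step-size bounds, time constants
% from the eigenvalues of P R P) carry over directly.
```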
Abstract:
Objective: To investigate the sociodemographic determinants of diet quality of the elderly in four EU countries. Design: Cross-sectional study. For each country, a regression was performed of a multidimensional index of dietary quality v. sociodemographic variables. Setting: In Finland, the Finnish Household Budget Survey (1998 and 2006); in Sweden, SNAC-K (2001–2004); in the UK, the Expenditure & Food Survey (2006–07); in Italy, the Multi-purpose Survey of Daily Life (2009). Subjects: One- and two-person households of over-50s (Finland, n 2994; UK, n 4749); over-50s living alone or in two-person households (Italy, n 7564); over-60s (Sweden, n 2023). Results: Diet quality among the EU elderly is both low on average and heterogeneous across individuals. The regression models explained a small but significant part of the observed heterogeneity in diet quality. Resource availability was associated with diet quality either negatively (Finland and UK) or in a non-linear or non-statistically significant manner (Italy and Sweden), as was the preference-for-food parameter. Education, not living alone and female gender were characteristics positively associated with diet quality consistently across the four countries, unlike socio-professional status, age and seasonality. Regional differences within countries persisted even after controlling for the other sociodemographic variables. Conclusions: Poor dietary choices among the EU elderly were not caused by insufficient resources, and informational measures could be successful in promoting healthy eating for healthy ageing. On the other hand, food habits appeared largely set in the latter part of life, with age and retirement having little influence on the healthiness of dietary choices.
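For concreteness, each country-level analysis amounts to regressing a diet-quality index on sociodemographic covariates. A minimal sketch with hypothetical column names and file (the surveys' actual variables differ) might look as follows:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical layout: one row per household, a multidimensional
# diet-quality index plus sociodemographic covariates. All column
# names (and the file) are illustrative, not the surveys' own.
df = pd.read_csv("households.csv")

model = smf.ols(
    "diet_quality ~ log_expenditure + education + female + lives_alone"
    " + age + C(region) + C(season)",
    data=df,
).fit(cov_type="HC1")          # heteroscedasticity-robust standard errors
print(model.summary())
```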
Abstract:
The present paper explores, theoretically and empirically, whether compliance with the International Code of Marketing of Breast-milk Substitutes impacts financial performance as measured by stock markets. The empirical analysis, which considers a 20-year period, shows that stock markets are indifferent to the level of compliance by manufacturers with the International Code. Two important issues emerge from this result. Based on our finding that financial performance as measured by stock markets cannot explain the level of compliance, the first issue concerns what alternative mechanisms drive manufacturers who comply the least with voluntary codes such as the International Code. Conversely, from our finding that stock markets do not reward the most compliant, the second issue raised is the inherent weakness of stock markets in fully incorporating social and environmental values.
Abstract:
This paper explores the linguistic practice of digital code plays in an online discussion forum used by the community of English-speaking Germans living in Britain. By adopting a qualitative approach to Computer-Mediated Discourse Analysis, the article examines the ways in which these bilinguals deploy linguistic and other semiotic resources on the forum to co-construct humorous code plays. These performances occur in the context of negotiating language norms and are based on conscious manipulations of both codes, English and German. They involve play with codes at three levels: play with forms, meanings, and frames. Although, at first sight, such alternations appear to be used mainly for comic effect, there is more to this than just humour. By mixing both codes at all levels, the participants deliberately produce aberrant German ‘polluted’ with English and, in so doing, dismantle the ideology of language purity upheld by the purist movement. The deliberate character of this type of code alternation demonstrates heightened metalinguistic awareness as well as creativity and criticality. By exploring the practice of digital code plays, the current study contributes to the growing body of research on networked multilingualism as well as to practices associated with translanguaging, poly- and metrolingualism.
Abstract:
Aspect-oriented programming (AOP) is a promising technology that supports separation of crosscutting concerns (i.e., functionality that tends to be tangled with, and scattered through, the rest of the system). In AOP, a method-like construct named advice is applied to join points in the system through a special construct named pointcut. This mechanism supports the modularization of crosscutting behavior; however, since the added interactions are not explicit in the source code, it is hard to ensure their correctness. To tackle this problem, this paper presents a rigorous coverage analysis approach to ensure that the logic of each advice - statements, branches, and def-use pairs - is exercised at each affected join point. To make this analysis possible, a structural model based on Java bytecode - called PointCut-based Def-Use Graph (PCDU) - is proposed, along with three integration testing criteria. Theoretical, empirical, and exploratory studies involving 12 aspect-oriented programs and several fault examples provide evidence of the feasibility and effectiveness of the proposed approach. (C) 2010 Elsevier Inc. All rights reserved.
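As a toy illustration of the def-use facts such a model records (the PCDU itself is built on Java bytecode and handles branches as well, unlike this line-based sketch), consider enumerating def-use pairs over a straight-line advice body:

```python
# Toy def-use pair enumeration over a straight-line advice body.
# Statement encoding and variable names are illustrative assumptions.
def def_use_pairs(stmts):
    """Yield (def_line, use_line, var) with no intervening redefinition."""
    last_def = {}
    for line, defs, uses in stmts:
        for v in sorted(uses):
            if v in last_def:
                yield (last_def[v], line, v)
        for v in defs:
            last_def[v] = line

stmts = [  # (line, defs, uses) of a hypothetical advice body
    (1, {"x"}, set()),         # x = joinPoint.getArgs()
    (2, {"y"}, {"x"}),         # y = transform(x)
    (3, set(), {"x", "y"}),    # log(x, y)
]
print(list(def_use_pairs(stmts)))   # [(1, 2, 'x'), (1, 3, 'x'), (2, 3, 'y')]
```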
Abstract:
MCNP has stood so far as one of the main Monte Carlo radiation transport codes. Its use, as with any other Monte Carlo based code, has increased as computers have become faster and more affordable over time. However, using the Monte Carlo method to tally events in volumes that represent a small fraction of the whole system may prove unfeasible if a straight analogue transport procedure (with no variance reduction techniques) is employed and precise results are demanded. Calculation of reaction rates in activation foils placed in critical systems is one such case. The present work takes advantage of the fixed-source representation in MCNP to perform the above-mentioned task with more effective sampling (characterizing the neutron population in the vicinity of the tallying region and using it in a geometrically reduced coupled simulation). An extended analysis of source-dependent parameters is presented in order to understand their influence on simulation performance and on the validity of the results. Although discrepant results were observed for small enveloping regions, the procedure proves very efficient, giving adequate and precise results in shorter times than the standard analogue procedure. (C) 2007 Elsevier Ltd. All rights reserved.
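The two-stage idea can be caricatured in a few lines: a first analogue pass characterizes the population reaching the vicinity of the tally region, and a second pass reuses that recorded population as a fixed source in the reduced geometry. The 1-D sketch below is purely schematic (our own toy physics, not MCNP input):

```python
import numpy as np

rng = np.random.default_rng(1)

def stage1(n, p_absorb=0.3, plane=5.0):
    """Analogue 1-D transport; bank particles that cross the recording plane."""
    bank = []
    for _ in range(n):
        x, alive = 0.0, True
        while alive and x < plane:
            x += rng.exponential(1.0)                 # free flight
            if x < plane:
                alive = rng.random() > p_absorb       # absorbed at collision?
        if alive:
            bank.append(x - plane)                    # overshoot past plane
    return np.array(bank)

def stage2(bank, reps=100, foil=(0.0, 0.1)):
    """Reuse the banked population as a fixed source near the small foil."""
    hits = 0
    for _ in range(reps):
        xs = rng.choice(bank, size=bank.size)         # resample the bank
        hits += np.sum((xs >= foil[0]) & (xs < foil[1]))
    return hits / (reps * bank.size)

bank = stage1(50_000)
print(f"banked {bank.size} particles; foil hit fraction ~ {stage2(bank):.4f}")
```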
Abstract:
The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs "radio-hybrid" measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated in the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights and features of the radio extension implemented in the Auger Offline framework. Its functionality has achieved a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, a transparent and efficient handling of FFTs, a very detailed simulation of detector effects, and the read-in of multiple data formats including data from various radio simulation codes. The source code of this radio functionality can be made available to interested parties on request. (C) 2011 Elsevier B.V. All rights reserved.
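As an illustration of the kind of transparent FFT handling described, the following NumPy stand-in (not the Offline C++ API; the class and its methods are our own invention) models a three-component electric-field trace filtered to AERA's 30-80 MHz band in the frequency domain:

```python
import numpy as np

# A NumPy stand-in (not the Offline C++ API) for a three-component
# electric-field trace with transparent frequency-domain filtering.
class FieldTrace:
    def __init__(self, efield, fs):
        self.efield = np.asarray(efield)   # shape (3, n): Ex, Ey, Ez
        self.fs = fs                       # sampling rate in Hz

    def spectrum(self):
        return np.fft.rfft(self.efield, axis=1)

    def bandpass(self, f_lo, f_hi):
        """Zero all bins outside [f_lo, f_hi] and transform back."""
        spec = self.spectrum()
        f = np.fft.rfftfreq(self.efield.shape[1], d=1.0 / self.fs)
        spec[:, (f < f_lo) | (f > f_hi)] = 0.0
        self.efield = np.fft.irfft(spec, n=self.efield.shape[1], axis=1)

trace = FieldTrace(np.random.default_rng(0).standard_normal((3, 1024)), fs=200e6)
trace.bandpass(30e6, 80e6)     # AERA's 30-80 MHz observation band
print(trace.efield.shape)      # (3, 1024)
```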
Abstract:
In all applications of clone detection it is important to have precise and efficient clone identification algorithms. This paper proposes and outlines a new clone detection algorithm, KClone, that incorporates a novel combination of lexical and local dependence analysis to achieve precision while retaining speed. The paper also reports the initial results of a case study using an implementation of KClone with which we have been experimenting. The results indicate the ability of KClone to find type-1, type-2, and type-3 clones compared to token-based and PDG-based techniques. The paper also reports results of an initial empirical study of the performance of KClone compared to CCFinderX.
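For contrast with KClone's combined approach, a purely lexical pass of the token-based kind it is compared against can be sketched in a few lines; normalizing identifiers and literals is what makes type-2 clones collide (a toy: keywords are normalized too, which a real tool would avoid):

```python
import re
from collections import defaultdict

TOKEN = re.compile(r"[A-Za-z_]\w*|\d+|\S")

def normalize(line):
    out = []
    for tok in TOKEN.findall(line):
        if tok[0].isdigit():
            out.append("LIT")              # numeric literal
        elif tok[0].isalpha() or tok[0] == "_":
            out.append("ID")               # identifier
        else:
            out.append(tok)                # operator / punctuation
    return " ".join(out)

def clone_windows(lines, width=3):
    """Map each normalized window to the start lines where it occurs."""
    norm = [normalize(l) for l in lines]
    buckets = defaultdict(list)
    for i in range(len(norm) - width + 1):
        buckets["\n".join(norm[i:i + width])].append(i)
    return {k: v for k, v in buckets.items() if len(v) > 1}

src = ["a = b + 1", "c = a * 2", "print(c)",
       "x = y + 9", "z = x * 7", "print(z)"]
print(clone_windows(src))                  # windows 0 and 3 collide
```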
Abstract:
The object of this work is to understand the financing of companies in distress, more specifically financing granted after the filing for judicial reorganization (recuperação judicial), as a way of allowing the company to emerge from the crisis and return to normal conditions. To that end, borrowing the term coined by American legal scholarship to refer to the injection of funds into distressed companies, we use the term DIP financing. An adequate understanding of the subject requires understanding the origin of DIP financing in the United States and how the matter is currently regulated there. The second step is to assess the possibility of applying the same funding structure in Brazil. In studying the origin of this mechanism in the United States, we examine the problems that arose over the years and how they were overcome by courts and scholars, so that DIP financing became established as one of the forms of capital injection into companies in crisis, culminating in the development of a veritable credit industry for distressed companies. An analysis of the problems faced by the American bankruptcy system leads us to the hypothesis that the development of a DIP financing market in Brazil will only be possible if mechanisms are established that assure lenders extending financing after the filing for judicial reorganization a super-priority of repayment after the reorganization.
Abstract:
In this work, two formulations of the boundary element method (BEM) for the linear bending analysis of plates reinforced by beams are discussed. Both formulations are based on Kirchhoff's hypothesis, and they are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation where the collocation points are defined along the beam skeleton instead of being placed on the interfaces. In these formulations no approximation of the generalized forces along the interface is required. Moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require fewer approximations and the total number of degrees of freedom is reduced. The numerical examples discuss the differences between these two BEM formulations and compare the results to those of a well-known finite element code.
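For reference, each sub-region equation is of the familiar direct boundary-integral form for Kirchhoff plate bending, sketched below in our simplified notation (corner terms and the zoned-plate coupling terms are omitted):

```latex
% Generic direct boundary-integral statement for Kirchhoff plate bending
% (simplified notation; corner terms and zone coupling omitted).
\[
  c(\xi)\, w(\xi)
  + \int_{\Gamma} \Big[ V_n^{*}(\xi,q)\, w(q)
      - M_n^{*}(\xi,q)\, \frac{\partial w}{\partial n}(q) \Big]\, d\Gamma(q)
  = \int_{\Gamma} \Big[ w^{*}(\xi,q)\, V_n(q)
      - \frac{\partial w^{*}}{\partial n}(\xi,q)\, M_n(q) \Big]\, d\Gamma(q)
  + \int_{\Omega} w^{*}(\xi,q)\, g(q)\, d\Omega(q),
\]
% where w is the deflection, M_n and V_n the normal bending moment and
% equivalent shear on the boundary, the starred kernels the fundamental
% solution, and c(\xi) the free term. Summing such equations over the
% plate and beam sub-regions enforces interface equilibrium and
% compatibility automatically.
```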
Abstract:
This article reports a theoretical study based on experimental results for barium zirconate, BaZrO3 (BZ), thin films, using periodic quantum-mechanical calculations to analyze the symmetry change in a structural order-disorder simulation. Four periodic models were simulated using the CRYSTAL98 code to represent the ordered and disordered BZ structures. The results were analyzed in terms of energy level diagrams and atomic orbital distributions to explain and understand the BZ photoluminescence (PL) properties at room temperature for the disordered structure, based on structural deformation and symmetry changes. (C) 2009 Wiley Periodicals, Inc. Int J Quantum Chem 111: 694-701, 2011
Abstract:
In this work, the plate bending formulation of the boundary element method (BEM) based on Reissner's hypothesis is extended to the analysis of plates reinforced by rectangular beams. This composed structure is modelled as a zoned plate, with the beams represented by narrow sub-regions of larger thickness. The integral equations are derived by applying the weighted residual method to each sub-region and summing them to obtain the equation for the whole plate. Equilibrium and compatibility conditions are automatically imposed by the integral equations, which treat the composed structure as a single body. In order to decrease the number of degrees of freedom, some approximations are considered for both displacements and tractions along the beam width. The accuracy of the proposed model is illustrated by simple examples whose exact solutions are known, as well as by more complex examples whose numerical results are compared with those of a well-known finite element code.
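In generic notation (ours, not the paper's), the Reissner model carries three independent boundary fields per sub-region rather than Kirchhoff's two, governed by the plate equilibrium equations:

```latex
% Generic Reissner-plate equilibrium (our notation): bending moments
% M_{ab} and shear forces Q_a under a transverse load q,
\[
  M_{\alpha\beta,\beta} - Q_{\alpha} = 0, \qquad
  Q_{\alpha,\alpha} + q = 0, \qquad
  \alpha, \beta \in \{1, 2\},
\]
% together with a shear-deformable constitutive law linking Q_alpha to
% the rotations. Each sub-region (plate or thicker beam strip) thus
% contributes integral equations in the deflection w, the two rotations,
% and their conjugate tractions, which are summed to couple the zones.
```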
Abstract:
Cold-formed steel shapes have been widely employed in steel construction, where they frequently offer a lower-cost solution than traditional hot-rolled shapes. A classic application of cold-formed steel shapes is purlins in the roofs of industrial buildings, connected to the roof panel by means of screws. The combined effect of these two elements has been the subject of investigation in several countries, and design criteria were included in the AISI Code in 1991 and 1996. This paper presents and discusses the results obtained from bending tests carried out on shapes commonly used in Brazil, i.e., the channel and the simple lipped channel. Tests were carried out on double shapes with 4.5 and 6.0 meter spans, which were subjected to concentrated loads and braced against each other at the supports and at intermediate points in three different load situations. The panel shape was also analyzed experimentally, simulating the action of wind by means of a vacuum box designed specifically for this purpose. The test results were then compared to those obtained through theoretical analysis, enabling us to extract important information upon which to base proposed design criteria for the new Brazilian code.