863 results for critical path methods
Abstract:
This paper develops cycle-level FPGA circuits of an organization for a fast path-based neural branch predictor. Our results suggest that practical prediction-table sizes are limited to around 32 KB to 64 KB in current FPGA technology, mainly because of the FPGA logic area required to maintain the tables. However, the predictor scales well in terms of prediction speed. Table size alone should not be used as the only metric of hardware budget when comparing neural predictors to predictors with entirely different organizations. The paper also gives early evidence that attention should shift to misprediction-recovery latency, rather than prediction latency, as the most critical factor affecting prediction accuracy for this class of branch predictors.
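As background for readers unfamiliar with neural branch prediction, the sketch below shows the simpler per-branch perceptron predictor from which path-based designs are derived; it is not the paper's pipelined FPGA organization, and the history length, table size, and threshold are illustrative assumptions.

```python
# Minimal perceptron-style branch predictor sketch (illustrative only; the paper's
# path-based, cycle-level FPGA organization is not reproduced here).
HIST_LEN = 16          # global history length (assumed)
NUM_ROWS = 256         # number of weight vectors (assumed table size)
THETA = int(1.93 * HIST_LEN + 14)  # training threshold used in the perceptron-predictor literature

weights = [[0] * (HIST_LEN + 1) for _ in range(NUM_ROWS)]  # +1 for the bias weight
history = [1] * HIST_LEN                                   # +1 = taken, -1 = not taken

def predict(pc):
    w = weights[pc % NUM_ROWS]
    y = w[0] + sum(wi * hi for wi, hi in zip(w[1:], history))
    return y >= 0, y                     # predict taken if the dot product is non-negative

def update(pc, taken, y):
    w = weights[pc % NUM_ROWS]
    t = 1 if taken else -1
    if (y >= 0) != taken or abs(y) <= THETA:   # train on misprediction or low confidence
        w[0] += t
        for i in range(HIST_LEN):
            w[i + 1] += t * history[i]
    history.pop(0)
    history.append(t)

# Hypothetical usage for one branch at an assumed PC value:
taken_pred, y = predict(0x4004F0)
update(0x4004F0, True, y)
```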
Abstract:
Dual Carrier Modulation (DCM) was chosen as the higher-data-rate modulation scheme for MB-OFDM (Multiband Orthogonal Frequency Division Multiplexing) in the UWB (Ultra-Wideband) radio platform ECMA-368. ECMA-368 has been chosen as the physical implementation for high-data-rate Wireless USB (W-USB) and Bluetooth 3.0. In this paper, different demapping methods for the DCM demapper are presented: Soft Bit, Maximum Likelihood (ML) Soft Bit, and Log-Likelihood Ratio (LLR). Frequency diversity and Channel State Information (CSI) are further techniques that enhance these demapping methods. The system performance of these DCM demapping methods, simulated in realistic multipath environments, is provided and compared.
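The abstract names the demapping rules without spelling them out, so the sketch below illustrates the general max-log LLR soft-demapping principle for a single QPSK-modulated tone. This is only a generic illustration: the actual ECMA-368 DCM mapping spreads four coded bits over two 16-point constellations on two tones, and the constellation, channel gain, and noise variance used here are assumptions.

```python
import numpy as np

# Max-log LLR soft demapping sketch for one received tone (illustrative, not the DCM demapper).
QPSK = {(0, 0): 1 + 1j, (0, 1): 1 - 1j, (1, 0): -1 + 1j, (1, 1): -1 - 1j}

def llr_per_bit(y, h, noise_var):
    """Return max-log LLRs for the two bits carried by one QPSK symbol."""
    llrs = []
    for bit_pos in range(2):
        d0 = min(abs(y - h * s) ** 2 for b, s in QPSK.items() if b[bit_pos] == 0)
        d1 = min(abs(y - h * s) ** 2 for b, s in QPSK.items() if b[bit_pos] == 1)
        llrs.append((d1 - d0) / noise_var)   # positive -> bit value 0 more likely
    return llrs

# Example: noisy observation of symbol (0, 1) through an assumed fading channel h.
h, sigma2 = 0.8 * np.exp(1j * 0.3), 0.1
y = h * QPSK[(0, 1)] + np.sqrt(sigma2 / 2) * (np.random.randn() + 1j * np.random.randn())
print(llr_per_bit(y, h, sigma2))
```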
Abstract:
This article summarises recent revisions to the investment development path (IDP) as postulated by Narula and Dunning (2010). The IDP provides a framework to understand the dynamic interaction between foreign direct investment (FDI) and economic development. The revisions take into account some recent changes in the global economic environment. This paper argues that studies based on the IDP should adopt a broader perspective, encompassing the idiosyncratic economic structure of countries as well as the heterogeneous nature of FDI. It is critical to understand the complex forces and interactions that determine the turning points in a country’s IDP, and to more explicitly acknowledge the role of historical, social and political circumstances in hindering or promoting FDI. We discuss some of the implications for Eastern European countries and provide some guidelines for future research.
Abstract:
The paper reviews the leading diagramming methods employed in system dynamics to communicate the contents of models. The main ideas and historical development of the field are first outlined. Two diagramming methods—causal loop diagrams (CLDs) and stock/flow diagrams (SFDs)—are then described and their advantages and limitations discussed. A set of broad research directions is then outlined. These concern: the abilities of different diagrams to communicate different ideas, the role that diagrams have in group model building, and the question of whether diagrams can be an adequate substitute for simulation modelling. The paper closes by suggesting that although diagrams alone are insufficient, they have many benefits. However, since these benefits have emerged only as ‘craft wisdom’, a more rigorous programme of research into the diagrams' respective attributes is called for.
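Since the abstract asks whether diagrams can substitute for simulation, a minimal sketch of what a stock/flow diagram is ultimately turned into may help: a stock updated by its flows over a fixed time step. The one-stock exponential-growth model and Euler step below are illustrative assumptions, not taken from the reviewed paper.

```python
# Minimal stock/flow simulation sketch: a single stock with an inflow proportional
# to the stock itself, integrated with Euler's method (the scheme behind typical SFD tools).
def simulate(initial_stock=100.0, growth_rate=0.03, dt=0.25, horizon=20.0):
    stock, t, trajectory = initial_stock, 0.0, []
    while t <= horizon:
        trajectory.append((t, stock))
        inflow = growth_rate * stock      # flow defined by the diagram's causal link
        stock += inflow * dt              # the stock accumulates its net flow
        t += dt
    return trajectory

for t, s in simulate()[::8]:
    print(f"t={t:5.2f}  stock={s:8.2f}")
```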
Abstract:
Simultaneous scintillometer measurements at multiple wavelengths (pairing visible or infrared with millimetre or radio waves) have the potential to provide estimates of path-averaged surface fluxes of sensible and latent heat. Traditionally, the equations to deduce fluxes from measurements of the refractive index structure parameter at the two wavelengths have been formulated in terms of absolute humidity. Here, it is shown that formulation in terms of specific humidity has several advantages. Specific humidity satisfies the requirement for a conserved variable in similarity theory and inherently accounts for density effects misapportioned through the use of absolute humidity. The validity and interpretation of both formulations are assessed and the analogy with open-path infrared gas analyser density corrections is discussed. Original derivations using absolute humidity to represent the influence of water vapour are shown to misrepresent the latent heat flux. The errors in the flux, which depend on the Bowen ratio (larger for drier conditions), may be of the order of 10%. The sensible heat flux is shown to remain unchanged. It is also verified that use of a single scintillometer at optical wavelengths is essentially unaffected by these new formulations. Where it may not be possible to reprocess two-wavelength results, a density correction to the latent heat flux is proposed for scintillometry, which can be applied retrospectively to reduce the error.
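As background for readers unfamiliar with two-wavelength scintillometry, the textbook decomposition below (written with specific humidity, in generic notation rather than the paper's) shows how the measured refractive-index structure parameters combine temperature and humidity contributions.

```latex
% Generic two-wavelength decomposition (textbook form; the coefficients A_{T,i}, A_{q,i}
% are wavelength dependent and the notation is illustrative):
\[
  C_{n}^{2}(\lambda_i) =
      \frac{A_{T,i}^{2}}{\overline{T}^{2}}\, C_{T}^{2}
    + \frac{2\, A_{T,i} A_{q,i}}{\overline{T}\,\overline{q}}\, C_{Tq}
    + \frac{A_{q,i}^{2}}{\overline{q}^{2}}\, C_{q}^{2},
  \qquad i = 1, 2.
\]
% Solving this pair of equations for C_T^2 and C_q^2 (with an assumed temperature-humidity
% correlation) yields the sensible and latent heat fluxes via similarity theory.
```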
Abstract:
From Milsom's equations, which describe the geometry of ray-path hops reflected from the ionospheric F-layer, algorithms for the simplified estimation of mirror-reflection height are developed. These allow for hop length and the effects of variations in underlying ionisation (via the ratio of the F2- and E-layer critical frequencies) and F2-layer peak height (via the M(3000)F2-factor). Separate algorithms are presented which are applicable to a range of signal frequencies about the FOT and to propagation at the MUF. The accuracies and complexities of the algorithms are compared with those inherent in the use of a procedure based on an equation developed by Shimazaki.
Abstract:
Dominant paradigms of causal explanation for why and how Western liberal-democracies go to war in the post-Cold War era remain versions of the 'liberal peace' or 'democratic peace' thesis. Yet such explanations have been shown to rest upon deeply problematic epistemological and methodological assumptions. Of equal importance, however, is the failure of these dominant paradigms to account for the 'neoliberal revolution' that has gripped Western liberal-democracies since the 1970s. The transition from liberalism to neoliberalism remains neglected in analyses of the contemporary Western security constellation. Arguing that neoliberalism can be understood simultaneously through the Marxian concept of ideology and the Foucauldian concept of governmentality – that is, as a complementary set of 'ways of seeing' and 'ways of being' – the thesis goes on to analyse British security in policy and practice, considering it as an instantiation of a wider neoliberal way of war. In so doing, the thesis draws upon, but also challenges and develops, established critical discourse analytic methods, incorporating within its purview not only the textual data that is usually considered by discourse analysts, but also material practices of security. This analysis finds that contemporary British security policy is predicated on a neoliberal social ontology, morphology and morality – an ideology or 'way of seeing' – focused on the notion of a globalised 'network-market', and is aimed at rendering circulations through this network-market amenable to neoliberal techniques of government. It is further argued that security practices shaped by this ideology imperfectly and unevenly achieve the realisation of neoliberal 'ways of being' – especially modes of governing self and other or the 'conduct of conduct' – and the re-articulation of subjectivities in line with neoliberal principles of individualism, risk, responsibility and flexibility. The policy and practice of contemporary British 'security' is thus recontextualised as a component of a broader 'neoliberal way of war'.
Abstract:
Elucidating the biological and biochemical roles of proteins, and subsequently determining their interacting partners, can be difficult and time consuming using in vitro and/or in vivo methods, and consequently the majority of newly sequenced proteins will have unknown structures and functions. However, in silico methods for predicting protein–ligand binding sites and protein biochemical functions offer an alternative practical solution. The characterisation of protein–ligand binding sites is essential for investigating new functional roles, which can impact the major biological research spheres of health, food, and energy security. In this review we discuss the role in silico methods play in 3D modelling of protein–ligand binding sites, along with their role in predicting biochemical functionality. In addition, we describe in detail some of the key alternative in silico prediction approaches that are available, as well as discussing the Critical Assessment of Techniques for Protein Structure Prediction (CASP) and the Continuous Automated Model EvaluatiOn (CAMEO) projects, and their impact on developments in the field. Furthermore, we discuss the importance of protein function prediction methods for tackling 21st century problems.
Abstract:
In this paper we establish the existence of standing wave solutions for quasilinear Schrödinger equations involving critical growth. By using a change of variables, the quasilinear equations are reduced to semilinear ones whose associated functionals are well defined in the usual Sobolev space and satisfy the geometric conditions of the mountain pass theorem. Using this fact, we obtain a Cerami sequence converging weakly to a solution v. In the proof that v is nontrivial, the main tool is the concentration-compactness principle due to P.L. Lions, together with some classical arguments used by H. Brezis and L. Nirenberg (1983) in [9]. (C) 2009 Elsevier Inc. All rights reserved.
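For context, a model problem of the type typically treated in this literature, together with the standard change of variables that reduces it to a semilinear equation, is sketched below; the precise hypotheses on V and g in the paper may differ.

```latex
% Model quasilinear Schrodinger equation (illustrative form):
\[
  -\Delta u + V(x)\,u - \Delta\!\left(u^{2}\right)u = g(u), \qquad x \in \mathbb{R}^{N}.
\]
% Setting u = f(v), where f solves
\[
  f'(t) = \frac{1}{\sqrt{1 + 2 f(t)^{2}}}, \qquad f(0) = 0,
\]
% turns the problem into a semilinear equation for v whose energy functional is
% well defined on the usual Sobolev space H^{1}(\mathbb{R}^{N}).
```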
Abstract:
This paper addresses the independent multi-plant, multi-period, and multi-item capacitated lot sizing problem in which transfers between the plants are allowed. This is an NP-hard combinatorial optimization problem, and few solution methods have been proposed to solve it. We develop a GRASP (Greedy Randomized Adaptive Search Procedure) heuristic as well as a path-relinking intensification procedure to find cost-effective solutions for this problem. In addition, the proposed heuristic is used to solve some instances of the capacitated lot sizing problem with parallel machines. The results of the computational tests show that the proposed heuristics outperform other heuristics previously described in the literature. The results are confirmed by statistical tests. (C) 2009 Elsevier B.V. All rights reserved.
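To illustrate the metaheuristic framework named in the abstract, here is a generic GRASP with path-relinking skeleton applied to a toy binary problem; the lot-sizing-specific construction, costs, and neighbourhoods from the paper are not reproduced, and the target vector, elite-set size, and randomization parameter are assumptions.

```python
import random

# GRASP + path-relinking sketch on a toy binary minimization (illustrative only).
N = 12
TARGET = [random.randint(0, 1) for _ in range(N)]          # hypothetical optimum
cost = lambda s: sum(a != b for a, b in zip(s, TARGET))    # distance to the target

def construct(alpha=0.3):
    # Randomized construction: a greedy guess perturbed bitwise with probability alpha.
    return [b if random.random() > alpha else 1 - b for b in TARGET]

def local_search(s):
    # First-improvement single-bit flips.
    improved = True
    while improved:
        improved = False
        for i in range(N):
            t = s[:i] + [1 - s[i]] + s[i + 1:]
            if cost(t) < cost(s):
                s, improved = t, True
    return s

def path_relink(s, guide):
    # Walk from s towards the guiding solution, keeping the best intermediate point.
    best, cur = s, list(s)
    for i in range(N):
        if cur[i] != guide[i]:
            cur[i] = guide[i]
            if cost(cur) < cost(best):
                best = list(cur)
    return best

best, elite = None, []
for _ in range(50):
    s = local_search(construct())
    for g in elite:
        s = min(s, path_relink(s, g), key=cost)            # intensification step
    elite = sorted(elite + [s], key=cost)[:5]              # small elite set
    best = s if best is None or cost(s) < cost(best) else best
print("best cost:", cost(best))
```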
Abstract:
A classical theorem of H. Hopf asserts that a closed connected smooth manifold admits a nowhere vanishing vector field if and only if its Euler characteristic is zero. R. Brown generalized Hopf's result to topological manifolds, replacing vector fields with path fields. In this note, we give an equivariant analog of Brown's theorem for locally smooth G-manifolds where G is a finite group.
Abstract:
A potentially useful steady-state fluorimetric technique was used to determine the critical micellar concentrations (CMC(1) and CMC(2)) for two micellar media, one formed by SDS and the other by SDS/Brij 30. A comparative study based on conductimetric and surface tension measurements suggests that the CMC(1) estimated by the fluorimetric method is lower than the value estimated by these other techniques. Equivalent values were observed for SDS micelles without the Brij 30 neutral co-surfactant. The use of acridine orange as a fluorescent probe permitted the determination of both CMC(1) and CMC(2). Based on these results, an explanation of aspects of the micelle formation mechanism is presented, particularly in terms of spherical and rod-like structures.
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
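As a concrete illustration of the two-stage semi-parametric strategy the review discusses, the sketch below removes mean effects first and then tests whether the residual spread depends on genotype, in the spirit of a Brown-Forsythe test. It is a hedged sketch, not any specific published method; the function names, toy data, and effect sizes are assumptions.

```python
import numpy as np
from scipy import stats

def two_stage_vqtl(phenotype, genotype, covariates=None):
    """Stage 1: remove mean effects; Stage 2: test whether residual spread varies by genotype."""
    X = np.column_stack([np.ones_like(phenotype), genotype] +
                        ([covariates] if covariates is not None else []))
    beta, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
    resid = phenotype - X @ beta
    # Brown-Forsythe style: compare absolute deviations from group medians across genotypes.
    groups = [np.abs(resid[genotype == g] - np.median(resid[genotype == g]))
              for g in np.unique(genotype)]
    return stats.f_oneway(*groups)

# Toy data: genotype 2 inflates the variance of the phenotype, not its mean structure alone.
rng = np.random.default_rng(0)
g = rng.integers(0, 3, size=600)
y = 0.2 * g + rng.normal(scale=np.where(g == 2, 2.0, 1.0))
print(two_stage_vqtl(y, g))
```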
Agenda setting and public policy formulation in Brazil: the case of the Fundo Social do Pré-Sal (Pre-Salt Social Fund)
Abstract:
Understanding the process that gave rise to the Fundo Social do Pré-Sal requires understanding the new petroleum regulation approved in 2010. This thesis analyses the trajectory of petroleum regulation in Brazil, starting from the approval of the Petroleum Law (1997), and assesses how ideas and interests interact with existing institutions, political action, and economic conditions to generate a new sectoral configuration. Combining the broad perspective of historical institutionalism with the robust methods of rational-choice institutionalism for determining actors' preferences, this work points to the historical determination of agents' choices. Empirically, it shows how economic conditions were decisive in making Brazilian economic policy permeable to the interests represented by multilateral institutions, clarifies the importance of the volume of oil reserves in shaping the new regulatory framework, and offers a reason for the rise in Petrobras's value after 1997.
Abstract:
Researchers often rely on the t-statistic to make inference on parameters in statistical models. It is common practice to obtain critical values by simulation techniques. This paper proposes a novel numerical method to obtain an approximately similar test. This test rejects the null hypothesis when the test statistic is larger than a critical value function (CVF) of the data. We illustrate this procedure when regressors are highly persistent, a case in which commonly used simulation methods encounter difficulties controlling size uniformly. Our approach works satisfactorily, controls size, and yields a test which outperforms the two other known similar tests.
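To illustrate why size control is hard with highly persistent regressors, the Monte Carlo sketch below simulates the null distribution of the t-statistic in a predictive regression with correlated innovations: the simulated quantile shifts with the AR coefficient, so no single critical value works uniformly. This is a generic illustration under assumed parameter values, not the paper's CVF construction.

```python
import numpy as np

def simulated_quantile(rho, delta=-0.9, n=200, reps=2000, level=0.95, seed=0):
    """95% quantile of |t| under the null, for a persistent regressor with AR coefficient rho."""
    rng = np.random.default_rng(seed)
    tstats = np.empty(reps)
    for r in range(reps):
        e = rng.standard_normal(n)
        u = delta * e + np.sqrt(1 - delta ** 2) * rng.standard_normal(n)  # correlated innovations
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = rho * x[t - 1] + e[t]          # highly persistent AR(1) regressor
        yy, xx = u[1:], x[:-1]                    # null: y_t = u_t, regressed on x_{t-1}
        xd = xx - xx.mean()
        beta = xd @ yy / (xd @ xd)
        resid = yy - yy.mean() - beta * xd
        se = np.sqrt(resid @ resid / (len(yy) - 2) / (xd @ xd))
        tstats[r] = beta / se
    return np.quantile(np.abs(tstats), level)

for rho in (0.5, 0.95, 0.999):
    print(rho, round(simulated_quantile(rho), 3))
```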