978 results for connection weight approach


Relevance: 30.00%

Abstract:

This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate statistical study of the system load through accumulated calculations, since probabilistic results for the bandwidth allocation can be obtained. Nevertheless, it has a high cost in terms of calculation and storage requirements. This makes real-time calculation difficult, so many authors disregard the approach. With the aim of reducing the cost, we propose using the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the probabilities associated with the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
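The accumulation and deconvolution steps can be sketched for the simplest case of homogeneous on/off sources; the ECA's multinomial formulation generalises this to several traffic classes. All function names and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def convolve_source(dist, p, b):
    """Convolve the aggregate bandwidth distribution with one on/off
    source that is active with probability p and needs b units when on."""
    out = np.zeros(len(dist) + b)
    out[:len(dist)] += (1 - p) * dist   # source idle
    out[b:] += p * dist                 # source active, shifts demand by b
    return out

def deconvolve_source(dist, p, b):
    """Remove one source from the aggregate (inverse of convolve_source),
    e.g. when a connection is released."""
    out = np.zeros(len(dist) - b)
    for i in range(len(out)):
        out[i] = (dist[i] - p * (out[i - b] if i >= b else 0.0)) / (1 - p)
    return out

# Aggregate distribution for 3 identical sources (p = 0.2, b = 2 units each)
dist = np.array([1.0])
for _ in range(3):
    dist = convolve_source(dist, 0.2, 2)

# Probability that instantaneous demand exceeds 4 units (an acceptance test)
p_exceed = dist[5:].sum()
```

The acceptance decision then reduces to comparing `p_exceed` against a target overflow probability, and deconvolution keeps the aggregate up to date without recomputing it from scratch.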

Relevance: 30.00%

Abstract:

A Bayesian approach to analysing data from family-based association studies is developed. This permits direct assessment of the range of possible values of model parameters, such as the recombination frequency and allelic associations, in the light of the data. In addition, sophisticated comparisons of different models may be handled easily, even when such models are not nested. The methodology is developed in such a way as to allow separate inferences to be made about linkage and association by including theta, the recombination fraction between the marker and disease susceptibility locus under study, explicitly in the model. The method is illustrated by application to a previously published data set. The data analysis raises some interesting issues, notably with regard to the weight of evidence necessary to convince us of linkage between a candidate locus and disease.
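As an illustration of the kind of direct posterior assessment described, here is a minimal sketch of a Bayesian analysis of allele-transmission counts on a parameter grid. The counts, the uniform prior, and all names are hypothetical; the paper's actual model, which includes the recombination fraction theta explicitly, is considerably richer.

```python
import numpy as np

# Beta-binomial posterior for the probability tau that the marker allele
# is transmitted from heterozygous parents to affected offspring;
# tau = 0.5 corresponds to no linkage. Counts are illustrative only.
transmitted, not_transmitted = 62, 38
a, b = 1 + transmitted, 1 + not_transmitted   # uniform Beta(1,1) prior

grid = np.linspace(1e-6, 1 - 1e-6, 10001)
dx = grid[1] - grid[0]
log_post = (a - 1) * np.log(grid) + (b - 1) * np.log(1 - grid)
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx                       # normalise the density

posterior_mean = np.sum(post * grid) * dx
p_linkage = np.sum(post[grid > 0.5]) * dx     # P(tau > 0.5 | data)
```

The whole posterior is available, so ranges of plausible values and probabilities of hypotheses such as `tau > 0.5` are read off directly rather than via a significance test.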

Relevance: 30.00%

Abstract:

A novel diazirine functionalised aniline derivative, 3-(3-aminophenyl)-3-methyldiazirine 1, was prepared and employed as an AB(2)-type monomer in the synthesis of hyperbranched polymers, thus providing the first instance in which polyamines have been prepared via carbene insertion polymerisation. Photolysis of the monomer 1 in bulk and in solution resulted in the formation of hyperbranched poly(aryl amine)s with degrees of polymerisation (DP) varying from 9 to 26 as determined by gel permeation chromatography (GPC). In solution, an increase in the initial monomer concentration was generally found to result in a decrease in the molecular weight characteristics of the resulting poly(aryl amine)s. Subsequent thermal treatment of the poly(aryl amine)s caused a further increase in the DP values up to a maximum of 31. Nuclear magnetic resonance (NMR) spectroscopic analysis revealed that the increase in molecular weight upon thermal treatment resulted from hydroamination of styrenic species formed in the initial photopolymerisation or activation of diazirine moieties.

Relevance: 30.00%

Abstract:

This article reports on part of a larger study of the impact of strategy training in listening on learners of French aged 16 to 17. One aim of the project was to investigate whether such training might have a positive effect on the self-efficacy of learners, by helping them see the relationship between the strategies they employed and what they achieved. One group of learners, as well as receiving strategy training, received detailed feedback on their listening strategy use and on the reflective diaries they were asked to keep, in order to draw their attention to the relationship between strategies and learning outcomes. Another group received strategy training without feedback or reflective diaries, while a comparison group received neither strategy training nor feedback. As a result of the training, there was some evidence that students who had received feedback made the biggest gains in certain aspects of self-efficacy for listening, although their gains compared with the non-feedback group were not as great as had been anticipated. Reasons for this are discussed. The article concludes by suggesting changes in how teachers approach listening comprehension that may improve learners' view of themselves as listeners.

Relevance: 30.00%

Abstract:

This study presents the findings of applying a Discrete Demand Side Control (DDSC) approach to the space heating of two case-study buildings. High- and low-tolerance scenarios are implemented on the space-heating controller to assess the impact of DDSC on buildings with different thermal capacitances: light-weight and heavy-weight construction. Space heating is provided by an electric heat pump powered from a wind turbine, with a back-up electrical network connection in the event of insufficient wind being available when a demand occurs. The findings highlight that thermal comfort is maintained within an acceptable range while the DDSC controller maintains the demand/supply balance. Although energy demand increases slightly, it is mostly supplied from the wind turbine, so the increase is of little significance and a reduction in operating costs and carbon emissions is still attained.
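A minimal sketch of what a tolerance-band space-heating controller of this kind might look like: heat is demanded only when temperature drifts below the lower tolerance bound and released once the upper bound is reached, so a wider tolerance gives the supply side more flexibility. The setpoint, tolerance values, and function names are illustrative assumptions, not taken from the study.

```python
def ddsc_heating(temp, setpoint, tolerance, heating_on):
    """Hysteresis-style discrete demand side control step: request heat
    when temperature falls below the lower tolerance bound, release the
    demand once the upper bound is reached, otherwise hold state."""
    if temp < setpoint - tolerance:
        return True          # issue a demand to the supply side
    if temp > setpoint + tolerance:
        return False         # release capacity back to the network
    return heating_on        # inside the band: keep current state

# Walk a short temperature trace through the controller
state = False
trace = []
for t in [19.0, 18.4, 18.6, 19.8, 21.2, 20.5]:
    state = ddsc_heating(t, setpoint=20.0, tolerance=1.0, heating_on=state)
    trace.append(state)
```

A high-tolerance scenario would simply widen `tolerance`, trading tighter comfort for fewer, more schedulable demand events.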

Relevance: 30.00%

Abstract:

New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem strongly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We then explore the choice of proposal density in a particle filter and show how the 'curse of dimensionality' might be beaten. In the standard particle filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In large-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further ensuring almost equal weights we avoid performing model runs that turn out to be useless. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
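For context, here is a minimal sketch of the standard (bootstrap) particle filter that the abstract contrasts against: pure Monte Carlo propagation followed by likelihood weighting and resampling, with no steering of the runs. The toy scalar system, noise levels, and names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf_step(particles, obs, model, obs_var, proc_var):
    """One cycle of a standard bootstrap particle filter:
    propagate, weight by the observation likelihood, resample."""
    # Propagate each particle through the model with process noise
    particles = model(particles) + rng.normal(0.0, np.sqrt(proc_var),
                                              particles.shape)
    # Importance weights from a Gaussian observation likelihood
    logw = -0.5 * (particles - obs) ** 2 / obs_var
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling restores equal weights
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy scalar system: slow decay toward zero, observed directly
particles = rng.normal(5.0, 1.0, 500)
for obs in [4.0, 3.2, 2.6]:
    particles = bootstrap_pf_step(particles, obs, lambda x: 0.9 * x,
                                  obs_var=0.1, proc_var=0.05)
estimate = particles.mean()
```

In high dimensions the weights of such a filter degenerate (one particle takes nearly all the weight), which is exactly the inefficiency the proposal-density steering and almost-equal-weights ideas in the abstract address.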

Relevance: 30.00%

Abstract:

This paper argues for the relevance of paying attention to structuring participation processes across scales as one of the ways in which the participation of multi-organisational partnerships involving conflicting interests might be managed. The paper deals with problems connected with land mobilisation for road widening in complex and concentrated high-value urban settings. It discusses a case study of plan implementation involving individual landowners, the land development market, the local government, other governmental and non-governmental organisations and the state government, which together achieved objectives that seemed impossible at first sight. In theoretical terms, the paper engages with Jessop's (2001) Strategic-Relational Approach (SRA), arguing for its potential to inform action in a way that is capable of achieving steering outputs. The claim for SRA is demonstrated by re-examining the case study. The factors that emerge as SRA is applied are drawn out, and it is suggested that the theory, though non-deterministic, helps guide action by highlighting certain dynamics of systems that can be used for institutional intervention. These dynamics point to the importance of paying attention to scale and to the way in which participation and negotiation processes are structured so as to favour certain outcomes over others.

Relevance: 30.00%

Abstract:

A successful innovation diffusion process may well take the form of a knowledge transfer process. Therefore, the primary objectives of this paper are: first, to evaluate the interrelations between the transfer of knowledge and the diffusion of innovation; and second, to develop a model that establishes a connection between the two. This has been achieved using a four-step approach. The first step is to assess and discuss the theories relating to knowledge transfer (KT) and innovation diffusion (ID). The second step focuses on developing basic models for KT and ID, based on the key theories surrounding these areas. A considerable amount of literature has been written on the association between knowledge management and innovation, the respective fields of KT and ID. The third step therefore explores the relationship between innovation and knowledge management in order to identify the connections between KT and ID. Finally, step four proposes and develops an integrated model for KT and ID. As the developed model suggests, the sub-processes of knowledge transfer can be connected to the innovation diffusion process in several instances, as discussed and illustrated in the paper.

Relevance: 30.00%

Abstract:

The warm event which spread across the tropical Atlantic during spring-summer 1984 is assumed to have been partially initiated by atmospheric disturbances, themselves related to the major 1982–1983 El Niño which occurred a year earlier in the Pacific. This paper tests this hypothesis. For that purpose, an atmospheric general circulation model (AGCM) is forced by different conditions of climatic and observed sea surface temperature (SST), and an Atlantic ocean general circulation model (OGCM) is subsequently forced by the outputs of the AGCM. It is first shown that both the AGCM and the OGCM behave correctly when globally observed SSTs are used: the strengthening of the trades over the tropical Atlantic during 1983 and their subsequent weakening at the beginning of 1984 are well captured by the AGCM, as is the spring 1984 deepening of the thermocline in the eastern equatorial Atlantic, simulated by the OGCM. As assumed, the SST anomalies located in the El Niño Pacific area are partly responsible for the wind-signal anomaly in the tropical Atlantic. Though this remotely forced atmospheric signal has a small amplitude, it can generate, in the OGCM run, an anomalous sub-surface signal leading to a flattening of the thermocline in the equatorial Atlantic. This forced oceanic experiment cannot explain the amplitude and phase of the observed sub-surface oceanic anomaly: part of the Atlantic ocean response, due to local interaction between ocean and atmosphere, requires a coupled approach. Nevertheless, this experiment shows that anomalous conditions in the Pacific during 1982–1983 created favourable conditions for anomaly development in the Atlantic.

Relevance: 30.00%

Abstract:

This review essay discusses two recent attempts to reform the framework in which issues of international and global justice are discussed: Iris Marion Young’s ‘social connection’ model and the practice-dependent approach, here exemplified by Ayelet Banai, Miriam Ronzoni and Christian Schemmel’s edited collection. I argue that while Young’s model may fit some issues of international or global justice, it misconceives the problems that many of them pose. Indeed, its difficulties point precisely in the direction of practice dependence as it is presented by Banai et al. I go on to discuss what seem to be the strengths of that method, and particularly Banai et al.’s defence of it against the common claim that it is biased towards the status quo. I also discuss Andrea Sangiovanni and Kate MacDonald’s contributions to the collection.

Relevance: 30.00%

Abstract:

This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, which are all linear. With data available in an online fashion, the performance of all candidate sub-models is monitored based on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-models are adapted via the recursive least squares (RLS) algorithm, while the coefficients of the remaining sub-models are unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step the model prediction is chosen from either the resultant multiple model or the best sub-model, whichever is better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and time consumption.
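The sum-to-one constrained combination step admits a closed form via the KKT system of the equality-constrained least squares problem. The sketch below assumes the window objective is ordinary least squares over the recent data; the two toy sub-models and all names are hypothetical, not from the paper.

```python
import numpy as np

def combine_predictions(Y, y):
    """Closed-form combination weights minimising the window MSE of the
    combined prediction Y @ alpha subject to sum(alpha) == 1.
    Y: (window, M) sub-model predictions; y: (window,) targets."""
    M = Y.shape[1]
    G = Y.T @ Y                      # Gram matrix of the predictions
    b = Y.T @ y
    ones = np.ones(M)
    # KKT system: [G 1; 1^T 0] [alpha; lambda] = [b; 1]
    K = np.block([[G, ones[:, None]],
                  [ones[None, :], np.zeros((1, 1))]])
    rhs = np.concatenate([b, [1.0]])
    return np.linalg.solve(K, rhs)[:M]

# Toy example: two linear sub-models for the target y = 2x,
# one under-estimating and one over-estimating
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x
Y = np.column_stack([1.5 * x, 2.5 * x])
alpha = combine_predictions(Y, y)
```

Solving one small (M+1)-dimensional linear system per window is what makes the combination cheap enough for online use.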

Relevance: 30.00%

Abstract:

Increasing efforts are being made to integrate different levels of detail in models of the cardiovascular system. For instance, one-dimensional representations are employed to model the systemic circulation. In this context, effective, black-box-type decomposition strategies for one-dimensional networks are needed in order to: (i) employ domain decomposition strategies for large systemic models (1D-1D coupling) and (ii) provide the conceptual basis for dimensionally-heterogeneous representations (1D-3D coupling, among various possibilities). The strategy proposed in this article works for both of these scenarios, though the several applications shown to illustrate its performance focus on the 1D-1D coupling case. A one-dimensional network is decomposed in such a way that each coupling point connects two (and not more) of the sub-networks. At each of the M connection points two unknowns are defined: the flow rate and the pressure. These 2M unknowns are determined by 2M equations, since each sub-network provides one (non-linear) equation per coupling point. It is shown how to build the 2M x 2M non-linear system with an arbitrary and independent choice of boundary conditions for each of the sub-networks. The idea is then to solve this non-linear system until convergence, which guarantees strong coupling of the complete network. In other words, if the non-linear solver converges at each time step, the solution coincides with what would be obtained by monolithically modeling the whole network. The decomposition thus imposes no stability restriction on the choice of the time step size. Effective iterative strategies for the non-linear system that preserve the black-box character of the decomposition are then explored. Several variants of matrix-free Broyden's and Newton-GMRES algorithms are assessed as numerical solvers by comparing their performance on sub-critical wave propagation problems which range from academic test cases to realistic cardiovascular applications. A specific variant of Broyden's algorithm is identified and recommended on the basis of its computational cost and reliability. (C) 2010 Elsevier B.V. All rights reserved.
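A sketch of a matrix-free "good" Broyden iteration of the general kind assessed above, where the residual function is only ever evaluated as a black box; a generic two-unknown toy system stands in for the interface equations, and the particular variant the authors recommend is not reproduced here.

```python
import numpy as np

def broyden_solve(F, x0, tol=1e-10, max_iter=50):
    """Matrix-free 'good' Broyden iteration: maintain an approximate
    inverse Jacobian H and update it from (step, residual-change) pairs,
    so F is treated purely as a black box."""
    x = x0.copy()
    f = F(x)
    H = np.eye(len(x))               # initial inverse-Jacobian guess
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        dx = -H @ f                  # quasi-Newton step
        x_new = x + dx
        f_new = F(x_new)
        df = f_new - f
        # Sherman-Morrison rank-one update of the inverse Jacobian
        Hdf = H @ df
        H += np.outer(dx - Hdf, dx @ H) / (dx @ Hdf)
        x, f = x_new, f_new
    return x

# Toy 2x2 coupling system standing in for the interface equations
def F(x):
    return np.array([x[0] + 0.1 * x[1] ** 2 - 1.0,
                     0.2 * x[0] ** 2 + x[1] - 2.0])

x = broyden_solve(F, np.zeros(2))
```

Because only residual evaluations are needed, each sub-network solver can remain a closed code that simply maps interface values to residuals.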

Relevance: 30.00%

Abstract:

We discuss the connection between information and copula theories by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum-entropy distribution. The idea of marginal invariant dependence measures is also discussed and used to show that empirical linear correlation underestimates the amplitude of the actual correlation in the case of non-Gaussian marginals. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework with a financial data set. Copyright (C) EPLA, 2009
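A small numerical sketch of the claim that empirical linear (Pearson) correlation underestimates dependence under non-Gaussian marginals, assuming a bivariate Gaussian copula. The heavy-tailed transform, sample size, and the standard Spearman-to-Gaussian identity rho = 2 sin(pi * rho_S / 6) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample from a bivariate Gaussian copula (rho = 0.8), then distort one
# marginal with a heavy-tailed monotone transform
rho = 0.8
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], 20000)
x, y = z[:, 0], np.expm1(2.0 * z[:, 1])

# Pearson correlation is not invariant under the marginal transform ...
pearson = np.corrcoef(x, y)[0, 1]

# ... but rank (Spearman) correlation is, since the copula is unchanged
def spearman(a, b):
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

rho_hat = 2.0 * np.sin(np.pi * spearman(x, y) / 6.0)  # back to Gaussian rho

# Mutual information of a bivariate Gaussian copula (in nats): the
# dependence component of the information decomposition
mi = -0.5 * np.log(1.0 - rho_hat ** 2)
```

The rank-based estimate recovers the copula correlation (and hence the mutual information) that the raw Pearson coefficient misses.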

Relevance: 30.00%

Abstract:

Detecting both the major genes that control the phenotypic mean and those that control the phenotypic variance is an emerging problem in quantitative trait loci (QTL) analysis. In order to map both kinds of genes, we applied the idea of the classic Haley-Knott regression to double generalized linear models. We performed both kinds of QTL detection for a Red Jungle Fowl x White Leghorn F2 intercross using double generalized linear models. It is shown that the double generalized linear model is a proper and efficient approach for localizing variance-controlling genes. We compared models with and without a fixed sex effect and prefer including the sex effect in order to reduce the residual variance. We also found that different genes may affect body weight at different times as the chicken grows.

Relevance: 30.00%

Abstract:

Background: Weight loss reduces blood pressure, and the Dietary Approaches to Stop Hypertension (DASH) diet has also been shown to lower blood pressure.

Objective: Our goal was to assess the effect on blood pressure of 2 weight-reduction diets: a low-fat diet (LF diet) and a moderate-sodium, high-potassium, high-calcium, low-fat DASH diet (WELL diet).

Design: After baseline measurements, 63 men were randomly assigned to either the WELL or the LF diet for 12 wk, and both diet groups undertook 0.5 h of moderate physical activity on most days of the week.

Results: Fifty-four men completed the study. Their mean (±SD) age was 47.9 ± 9.3 y (WELL diet, n = 27; LF diet, n = 27), and their mean baseline home systolic and diastolic blood pressures were 129.4 ± 11.3 and 80.6 ± 8.6 mm Hg, respectively. Body weight decreased by 4.9 ± 0.6 kg (mean ± SEM) in the WELL group and by 4.6 ± 0.6 kg in the LF group (P < 0.001 for both). There was a greater decrease in blood pressure in the WELL group than in the LF group [between-group difference (week 12 − baseline) in both SBP (5.5 ± 1.9 mm Hg; P = 0.006) and DBP (4.4 ± 1.2 mm Hg; P = 0.001)].

Conclusions: For a comparable 5-kg weight loss, a diet high in low-fat dairy products, vegetables, and fruit (the WELL diet) resulted in a greater decrease in blood pressure than did the LF diet. This dietary approach to achieving weight reduction may confer an additional benefit in reducing blood pressure in those who are overweight.