891 results for Linear and multilinear programming
Abstract:
Includes bibliographies.
Abstract:
This paper is concerned with assessing the interference rejection capabilities of linear and circular arrays of dipoles that can form part of a base station of a code-division multiple-access cellular communication system. The performance criterion for signal-to-interference ratio (SIR) improvement employed in this paper is the spatial interference suppression coefficient. We first derive an expression for this figure of merit and then analyze and compare the SIR performance of the two types of arrays. For a linear array, we quantitatively assess the degradation in SIR performance as we move from the array broadside to the array end-fire direction. In addition, the effect of mutual coupling is taken into account.
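The direction-dependent degradation described above can be sketched with the array factor of a uniform linear array. The element count, half-wavelength spacing and offset angles below are illustrative assumptions, not the paper's configuration, and mutual coupling is omitted:

```python
import math, cmath

def array_factor(n, d_lambda, theta, theta0):
    """Normalized array factor magnitude of an n-element uniform linear
    array with spacing d_lambda (in wavelengths), steered to theta0.
    Angles are measured from the array axis (end-fire = 0, broadside = 90 deg)."""
    psi = 2 * math.pi * d_lambda * (math.cos(theta) - math.cos(theta0))
    af = sum(cmath.exp(1j * k * psi) for k in range(n))
    return abs(af) / n

# Beam broadening toward end-fire: compare the response to a signal
# arriving 10 degrees off the steering direction, at broadside vs end-fire.
broadside = array_factor(8, 0.5, math.radians(100), math.radians(90))
endfire   = array_factor(8, 0.5, math.radians(10),  math.radians(0))
```

The end-fire beam is broader, so the off-axis interferer is suppressed far less than at broadside, which is the SIR degradation the abstract quantifies.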
Abstract:
The boundary element method (BEM) was used to study galvanic corrosion using linear and logarithmic boundary conditions. The linear boundary condition was implemented using both the linear approach and the piecewise linear approach; the logarithmic boundary condition was implemented using the piecewise linear approach. The calculated potential and current density distributions were compared with prior analytical results. For the linear boundary condition, the BEASY program using the linear approach and the piecewise linear approach gave accurate predictions of the potential and galvanic current density distributions for various electrolyte conditions, film thicknesses, electrolyte conductivities and anode/cathode area ratios. The 50-point piecewise linear method could be used with both linear and logarithmic polarization curves.
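A minimal sketch of the 50-point piecewise linear treatment of a logarithmic polarization curve; the Tafel-style constants `E0`, `beta` and `i0` below are invented illustrative values, not parameters from the study or from BEASY:

```python
import math

def make_piecewise(xs, ys):
    """Piecewise linear interpolant through sample points (xs, ys),
    clamped to the end values outside the sampled range."""
    def f(x):
        if x <= xs[0]:
            return ys[0]
        if x >= xs[-1]:
            return ys[-1]
        for i in range(len(xs) - 1):
            if xs[i] <= x <= xs[i + 1]:
                t = (x - xs[i]) / (xs[i + 1] - xs[i])
                return ys[i] + t * (ys[i + 1] - ys[i])
    return f

# Logarithmic polarization curve E = E0 + beta * log10(i / i0),
# discretised at 50 current-density samples spanning four decades.
E0, beta, i0 = -0.6, 0.12, 1e-6
xs = [i0 * 10 ** (k * 4 / 49) for k in range(50)]
ys = [E0 + beta * math.log10(x / i0) for x in xs]
pot = make_piecewise(xs, ys)
```

With 50 points the piecewise approximation tracks the logarithmic curve closely, which is why the same machinery serves both linear and logarithmic boundary conditions.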
Abstract:
In this paper, the exchange rate forecasting performance of neural network models is evaluated against random walk, autoregressive moving average and generalised autoregressive conditional heteroskedasticity models. There are no guidelines available for choosing the parameters of neural network models, so the parameters are chosen according to what the researcher considers best. Such an approach, however, implies that the risk of making bad decisions is extremely high, which could explain why, in many studies, neural network models do not consistently perform better than their time series counterparts. In this paper, through extensive experimentation, the level of subjectivity in building neural network models is considerably reduced, giving them a better chance of performing well. The results show that, in general, neural network models perform better than the traditionally used time series models in forecasting exchange rates.
Abstract:
Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple-criteria decision-making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, a goal programming (GP) model incorporating the constraints of system, resource, and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
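A sketch of the AHP priority step, assuming a single criterion and a hypothetical three-warehouse pairwise comparison matrix; power iteration here stands in for the principal-eigenvector calculation that Expert Choice performs in the paper:

```python
def ahp_priorities(matrix, iters=100):
    """Principal-eigenvector priority weights of a positive pairwise
    comparison matrix, computed by power iteration (pure Python)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]   # renormalise so the weights sum to 1
    return w

# Hypothetical judgements: warehouse A is moderately preferred (3x) to B
# and strongly preferred (5x) to C on one criterion.
m = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights = ahp_priorities(m)
```

The resulting weights would then enter the GP model as priority coefficients on the warehouse-selection goals.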
Abstract:
We compare the Q parameter obtained from scalar, semi-analytical and full vector models for realistic transmission systems. One set of systems is operated in the linear regime, while another uses solitons at high peak power. We report in detail on the different results obtained for the same system using different models. Polarisation mode dispersion is also taken into account, and a novel method to average Q parameters over several independent simulation runs is described. © 2006 Elsevier B.V. All rights reserved.
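For reference, the Q parameter is conventionally estimated from the mark/space level statistics at the decision point; a minimal sketch with invented sample levels, not data from the paper:

```python
import math, statistics

def q_parameter(ones, zeros):
    """Q factor from sampled mark ('1') and space ('0') levels:
    Q = (mu1 - mu0) / (sigma1 + sigma0)."""
    mu1, mu0 = statistics.mean(ones), statistics.mean(zeros)
    s1, s0 = statistics.stdev(ones), statistics.stdev(zeros)
    return (mu1 - mu0) / (s1 + s0)

def ber_from_q(q):
    """Bit error rate implied by Q under Gaussian noise statistics."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# Invented received levels for illustration only.
ones  = [1.0, 1.1, 0.9, 1.05, 0.95]
zeros = [0.0, 0.1, -0.1, 0.05, -0.05]
q = q_parameter(ones, zeros)
```

Q ≈ 6 corresponds to a BER near 1e-9, which is why Q is the standard figure of merit compared across the scalar, semi-analytical and vector models here.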
Abstract:
We compare the Q parameter obtained from the semi-analytical model with scalar and vector models for two realistic transmission systems: first, a linear system with a compensated dispersion map, and second, a soliton transmission system.
Abstract:
In this paper, the exchange rate forecasting performance of neural network models is evaluated against a random walk and a range of time series models. There are no guidelines available for choosing the parameters of neural network models, so the parameters are chosen according to what the researcher considers best. Such an approach, however, implies that the risk of making bad decisions is extremely high, which could explain why, in many studies, neural network models do not consistently perform better than their time series counterparts. In this paper, through extensive experimentation, the level of subjectivity in building neural network models is considerably reduced, giving them a better chance of performing well. Our results show that, in general, neural network models perform better than traditionally used time series models in forecasting exchange rates.
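The random walk benchmark that the neural networks are judged against amounts to a naive one-step-ahead forecast (tomorrow's rate equals today's); a minimal sketch on an invented toy series, not the paper's data:

```python
import math

def rmse(actual, predicted):
    """Root mean squared forecast error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def random_walk_forecast(series):
    """One-step-ahead naive forecast: the prediction for series[t+1]
    is simply series[t]."""
    return series[:-1]

# Toy exchange-rate series (illustrative numbers only).
rates = [1.30, 1.31, 1.29, 1.32, 1.33, 1.31, 1.34]
err = rmse(rates[1:], random_walk_forecast(rates))
```

Any candidate model, neural or otherwise, has to beat this error figure out of sample to justify its extra complexity.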
Abstract:
Objective: This study aimed to explore methods of assessing interactions between neuronal sources using MEG beamformers. However, beamformer methodology is based on the assumption of no linear long-term source interdependencies [VanVeen BD, vanDrongelen W, Yuchtman M, Suzuki A. Localization of brain electrical activity via linearly constrained minimum variance spatial filtering. IEEE Trans Biomed Eng 1997;44:867-80; Robinson SE, Vrba J. Functional neuroimaging by synthetic aperture magnetometry (SAM). In: Recent advances in Biomagnetism. Sendai: Tohoku University Press; 1999. p. 302-5]. Although such long-term correlations are not efficient and should not be anticipated in a healthy brain [Friston KJ. The labile brain. I. Neuronal transients and nonlinear coupling. Philos Trans R Soc Lond B Biol Sci 2000;355:215-36], transient correlations seem to underlie functional cortical coordination [Singer W. Neuronal synchrony: a versatile code for the definition of relations? Neuron 1999;49-65; Rodriguez E, George N, Lachaux J, Martinerie J, Renault B, Varela F. Perception's shadow: long-distance synchronization of human brain activity. Nature 1999;397:430-3; Bressler SL, Kelso J. Cortical coordination dynamics and cognition. Trends Cogn Sci 2001;5:26-36]. Methods: Two periodic sources were simulated and the effects of transient source correlation on the spatial and temporal performance of the MEG beamformer were examined. Subsequently, the interdependencies of the reconstructed sources were investigated using coherence and phase synchronization analysis based on Mutual Information. Finally, two interacting nonlinear systems served as neuronal sources and their phase interdependencies were studied under realistic measurement conditions. Results: Both the spatial and the temporal beamformer source reconstructions were accurate as long as the transient source correlation did not exceed 30-40 percent of the duration of beamformer analysis. 
In addition, the interdependencies of periodic sources were preserved by the beamformer and phase synchronization of interacting nonlinear sources could be detected. Conclusions: MEG beamformer methods in conjunction with analysis of source interdependencies could provide accurate spatial and temporal descriptions of interactions between linear and nonlinear neuronal sources. Significance: The proposed methods can be used for the study of interactions between neuronal sources. © 2005 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
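Phase synchronization between reconstructed sources, as analysed above, is commonly quantified by a phase-locking value; a minimal sketch with illustrative constant-lag test signals, not MEG data:

```python
import cmath, math

def plv(phases_a, phases_b):
    """Phase-locking value between two instantaneous-phase time series:
    |mean of exp(i * (phi_a - phi_b))|. 1 = perfect locking, near 0 = none."""
    n = len(phases_a)
    acc = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(acc) / n

# Two sources locked with a constant phase lag give PLV = 1.
t = [2 * math.pi * k / 100 for k in range(100)]
locked = plv(t, [x + 0.7 for x in t])
```

Applied to beamformer source time courses, a high PLV over a window flags transient coordination of the kind the simulations above were designed to detect.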
Abstract:
Hydrogels are a unique class of polymer which swell in water but do not dissolve. A range of 2-hydroxyethyl methacrylate based copolymer hydrogels containing both cyclic and linear polyethers have been synthesised and are described in this thesis. Initially, cyclic polyethers were occluded within the polymer matrix and the transport properties investigated. The results indicated that the presence of an ionophore can be used to modulate ion transport and that ion transport is described by a dual-sorption mechanism. However, these studies were limited by ionophore loss during hydration. Hence, the synthesis of a range of acrylate based crown ether monomers was considered. A pure sample of 4-acryloylaminobenzo-15-crown-5 was obtained and a terpolymer containing this monomer was prepared. Transport studies illustrated that the presence of a `bound' ionophore modulates ion transport in a similar way to the occluded systems. The transport properties of a series of terpolymers containing linear polyethers were then investigated. The results indicated that the dual-sorption mechanism is observed for these systems with group II metal cations, while the transport of group I metal cations, with the exception of sodium, is enhanced. Finally, the equilibrium water content (EWC), surface and mechanical properties of these terpolymers containing linear polyethers were examined. Although subtle variations in EWC are observed as the structure of the polyether side chain varies, EWC is generally enhanced owing to the hydrophilicity of the polyether side chain. The macroscopic surface properties were investigated using a sessile drop technique and FTIR spectroscopy. At a molecular level, surface properties were probed using an in vitro ocular spoilation model and preliminary cell adhesion studies. The results indicate that the polyethylene oxide side chains are expressed at the polymer surface, thus reducing the adhesion of biological species.
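For context, the dual-mode (dual-sorption) model referred to above is conventionally written as follows; the symbols are the standard ones from the membrane transport literature, not taken from the thesis:

```latex
C = k_D\,p + \frac{C'_H\, b\, p}{1 + b\, p}
```

where $C$ is the total sorbed concentration, $k_D$ the Henry's-law (dissolution) coefficient, $C'_H$ the capacity of the second sorption mode, $b$ the affinity constant and $p$ the penetrant pressure or activity; the first term represents simple dissolution and the second a saturable, site-bound population.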
Abstract:
Exploratory analysis of data seeks to find common patterns to gain insights into the structure and distribution of the data. In geochemistry it is a valuable means of gaining insight into the complicated processes making up a petroleum system. Typically, linear visualisation methods such as principal components analysis, linked plots, or brushing are used. These methods cannot be employed directly when dealing with missing data, and they struggle to capture global non-linear structures in the data, although they can do so locally. This thesis discusses a complementary approach based on a non-linear probabilistic model. The generative topographic mapping (GTM) enables the visualisation of the effects of very many variables on a single plot, which is able to incorporate more structure than a two-dimensional principal components plot. The model can deal with uncertainty and missing data, and allows for the exploration of the non-linear structure in the data. In this thesis a novel approach to initialising the GTM with arbitrary projections is developed. This makes it possible to combine GTM with algorithms like Isomap and to fit complex non-linear structures like the Swiss roll. Another novel extension is the incorporation of prior knowledge about the structure of the covariance matrix. This extension greatly enhances the modelling capabilities of the algorithm, resulting in a better fit to the data and better imputation capabilities for missing data. Additionally, an extensive benchmark study of the missing-data imputation capabilities of GTM is performed. Further, a novel approach based on missing data is introduced to benchmark the fit of probabilistic visualisation algorithms on unlabelled data. Finally, the work is complemented by evaluating the algorithms on real-life datasets from geochemical projects.
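At the heart of GTM's E-step, each data point receives a posterior responsibility over the latent grid centres; a minimal sketch for one point, with an invented one-dimensional grid and noise precision `beta` (the thesis's extensions to the covariance structure are not reproduced here):

```python
import math

def responsibilities(centres, x, beta):
    """GTM E-step for a single data point x: posterior probability of
    each latent-grid centre y_k, proportional to exp(-beta/2 * ||x - y_k||^2)
    under equal mixing weights."""
    logp = [-0.5 * beta * sum((xi - yi) ** 2 for xi, yi in zip(x, c))
            for c in centres]
    m = max(logp)                       # log-sum-exp for numerical stability
    p = [math.exp(l - m) for l in logp]
    s = sum(p)
    return [v / s for v in p]

# Three centres on a line; a data point nearest the middle one.
r = responsibilities([[0.0], [1.0], [2.0]], [0.9], beta=4.0)
```

Posterior means of these responsibilities over the latent space are what place each sample on the GTM visualisation plot, and dimensions with missing values can simply be dropped from the squared-distance sum.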
Abstract:
We present a new class of multi-channel Fiber Bragg grating, which provides the characteristics of channelized dispersion but does so with only a single reflection band. Such gratings can provide pure phase control of optical pulses without introducing any deleterious insertion-loss-variation. © 2006 Optical Society of America.
Abstract:
The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with the problem size, to an extent where it can become unmanageable by traditional analytical optimization techniques within reasonable limits. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required between preventive maintenance, rehabilitation and reconstruction, present yet another factor that contributes to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm.
A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output from this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network, in terms of the performance levels of the recommended optimal M&R strategy.
The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-off between various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated, in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems.
It is recommended that, for large networks, some form of decomposition technique should be applied to aggregate sections which exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space.
The study concludes that the robust search characteristics of SCE are well suited to solving the combinatorial problems in long-term network-level pavement M&R programming, and that this provides a rich area for future research.
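The cost-effectiveness trade-off that the SCE search optimizes can be illustrated with a deliberately simple greedy baseline. This is not the SCE algorithm, and the segments, treatments, costs and benefits below are invented for illustration:

```python
def select_treatments(candidates, budget):
    """Greedy budget-constrained selection by cost-effectiveness
    (benefit per unit cost), at most one treatment per segment.
    A simple baseline, not the SCE evolutionary search."""
    ranked = sorted(candidates, key=lambda c: c["benefit"] / c["cost"], reverse=True)
    chosen, spent, done = [], 0.0, set()
    for c in ranked:
        if c["segment"] in done or spent + c["cost"] > budget:
            continue
        chosen.append(c)
        spent += c["cost"]
        done.add(c["segment"])
    return chosen

# Hypothetical M&R candidates: segment, treatment, cost, performance benefit.
cands = [
    {"segment": "S1", "treatment": "preventive",  "cost": 10, "benefit": 8},
    {"segment": "S1", "treatment": "rehab",       "cost": 40, "benefit": 20},
    {"segment": "S2", "treatment": "reconstruct", "cost": 60, "benefit": 50},
    {"segment": "S2", "treatment": "preventive",  "cost": 12, "benefit": 6},
]
plan = select_treatments(cands, budget=75)
```

Greedy ranking like this ignores interactions across years and segments; the evolutionary search in the study exists precisely to handle those couplings while keeping each segment's identity.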
Abstract:
Postprint