930 results for Linear and nonlinear correlation


Relevance:

100.00%

Publisher:

Abstract:

This paper introduces a new neurofuzzy model construction and parameter estimation algorithm for observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace, to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, where it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
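
The extended Gram-Schmidt procedure described above builds on the classical Gram-Schmidt orthogonal least squares idea: orthogonalise the regressors, estimate an auxiliary parameter for each orthogonal direction, and read off each direction's contribution to the output energy. The Python sketch below shows that classical building block only, assuming a generic linear-in-the-parameters regression matrix rather than the paper's rule-based subspaces; all names and data are illustrative.

```python
import numpy as np

def classical_gram_schmidt_ols(P, y):
    """Orthogonal least squares via classical Gram-Schmidt.

    P : (N, M) regression matrix, y : (N,) output vector.
    Returns parameter estimates theta and the error-reduction ratio
    (an 'energy' contribution) of each regressor.
    """
    N, M = P.shape
    W = np.zeros((N, M))      # orthogonalised regressors
    A = np.eye(M)             # unit upper-triangular coefficients, P = W @ A
    g = np.zeros(M)           # auxiliary parameters in the orthogonal basis
    err = np.zeros(M)         # error reduction ratio per regressor

    for k in range(M):
        w = P[:, k].copy()
        for j in range(k):
            A[j, k] = W[:, j] @ P[:, k] / (W[:, j] @ W[:, j])
            w -= A[j, k] * W[:, j]
        W[:, k] = w
        g[k] = (w @ y) / (w @ w)
        err[k] = g[k] ** 2 * (w @ w) / (y @ y)

    theta = np.linalg.solve(A, g)   # back-substitute to the original basis
    return theta, err

# Toy usage: recover known parameters from noisy data
rng = np.random.default_rng(0)
P = rng.normal(size=(200, 3))
y = P @ np.array([1.5, -0.7, 0.3]) + 0.01 * rng.normal(size=200)
theta, err = classical_gram_schmidt_ols(P, y)
print(theta, err)
```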

Relevance:

100.00%

Publisher:

Abstract:

We examined Na+–H+ exchanger isoform 1 (NHE-1) mRNA expression in ventricular myocardium and its correlation with sarcolemmal NHE activity in isolated ventricular myocytes, during postnatal development in the rat. The expression of glyceraldehyde-3-phosphate dehydrogenase (GAPDH) mRNA did not change in ventricular myocardium between 2 and 42 days after birth. Therefore, at seven time points within that age range, GAPDH expression was used to normalize NHE-1 mRNA levels, as determined by reverse transcription polymerase chain reaction analysis. There was a progressive five-fold reduction in NHE-1 mRNA expression in ventricular myocardium from 2 days to 42 days of age. As an index of NHE activity, acid efflux rates (JH) were determined in single neonatal (2–4-day-old) and adult (42-day-old) ventricular myocytes (n=16/group) loaded with the pH fluoroprobe carboxy-seminaphthorhodafluor-1. In HEPES-buffered medium, basal intracellular pH (pHi) was similar at 7.28±0.02 in neonatal and 7.31±0.02 in adult myocytes, but intrinsic buffering power was lower in the former age group. The rate at which pHi recovered from a similar acid load was significantly greater in neonatal than in adult myocytes (0.36±0.07 vs 0.16±0.02 pH units/min at pHi=6.8). This was reflected by a significantly greater JH (22±4 vs 9±1 pmol/cm2/s at pHi=6.8), indicating greater sarcolemmal NHE activity in neonatal myocytes. The concomitant reductions in tissue NHE-1 mRNA expression and sarcolemmal NHE activity suggest that myocardial NHE-1 is subject to regulation at the mRNA level during postnatal development.

Relevance:

100.00%

Publisher:

Abstract:

The conformation of a model peptide AAKLVFF, based on a fragment of the amyloid beta peptide, A beta 16-20 (KLVFF), is investigated in methanol and water via solution NMR experiments and molecular dynamics computer simulations. In previous work, we have shown that AAKLVFF forms peptide nanotubes in methanol and twisted fibrils in water. Chemical shift measurements were used to investigate the solubility of the peptide as a function of concentration in methanol and water. This enabled the determination of critical aggregation concentrations; the solubility was lower in water. In dilute methanol solution, diffusion coefficients revealed the presence of intermediate aggregates, which in concentrated solution coexist with NMR-silent larger aggregates, presumed to be beta-sheets. In water, diffusion coefficients did not change appreciably with concentration, indicating the presence mainly of monomers, coexisting with larger aggregates in more concentrated solution. Concentration-dependent chemical shift measurements indicated a folded conformation for the monomers/intermediate aggregates in dilute methanol, with unfolding at higher concentration. In water, an antiparallel arrangement of strands was indicated by certain ROESY peak correlations. The temperature-dependent solubility of AAKLVFF in methanol was well described by a van't Hoff analysis, providing a solubilization enthalpy and entropy. This pointed to the importance of solvophobic interactions in the self-assembly process. Molecular dynamics simulations constrained by NOE values from NMR suggested disordered reverse turn structures for the monomer, with an antiparallel twisted conformation for dimers. To model the beta-sheet structures formed at higher concentration, possible arrangements of strands into beta-sheets with parallel and antiparallel configurations and different stacking sequences were used as the basis for MD simulations; two particular arrangements of antiparallel beta-sheets were found to be stable, one being linear and twisted and the other twisted in two directions. These structures were used to simulate circular dichroism spectra. The roles of aromatic stacking interactions and charge transfer effects were also examined. Simulated spectra were found to be similar to those observed experimentally (in water or methanol), which show a maximum at 215 or 218 nm due to pi-pi* interactions, when allowance is made for a 15-18 nm red-shift that may be due to light scattering effects.
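
The van't Hoff analysis mentioned above amounts to a straight-line fit of the logarithm of the solubility against inverse temperature, with the slope giving the solubilization enthalpy and the intercept the entropy. The short sketch below illustrates the calculation with made-up solubility values, not the paper's measurements.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Illustrative (not measured) data: temperature in K, solubility in mM
T = np.array([278.0, 288.0, 298.0, 308.0, 318.0])
S = np.array([0.8, 1.3, 2.1, 3.2, 4.8])

# van't Hoff: ln S = -dH/(R T) + dS/R, i.e. a straight line in 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(S), 1)
dH = -slope * R          # solubilization enthalpy, J/mol
dS = intercept * R       # solubilization entropy, J/(mol K)
print(f"dH = {dH / 1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol K)")
```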

Relevance:

100.00%

Publisher:

Abstract:

Theoretical models suggest that decisions about diet, weight and health status are endogenous within a utility maximization framework. In this article, we model these behavioural relationships in a fixed-effect panel setting using a simultaneous equation system, with a view to determining whether economic variables can explain the trends in calorie consumption, obesity and health in Organization for Economic Cooperation and Development (OECD) countries and the large differences among the countries. The empirical model shows that progress in medical treatment and health expenditure mitigates mortality from diet-related diseases, despite rising obesity rates. While the model accounts for endogeneity and serial correlation, results are affected by data limitations.
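
As a rough illustration of the panel setting, the sketch below applies the fixed-effects (within) transformation to a synthetic country-year panel and recovers the coefficients by ordinary least squares. It deliberately ignores the simultaneity handled in the paper's equation system (which would additionally require instruments), so it is only a starting-point sketch with invented variable names.

```python
import numpy as np

rng = np.random.default_rng(1)
n_countries, n_years = 20, 30
country = np.repeat(np.arange(n_countries), n_years)

# Synthetic panel: calories depend on price, income and a country fixed effect
alpha = rng.normal(size=n_countries)                  # unobserved country effects
price = rng.normal(size=n_countries * n_years)
income = rng.normal(size=n_countries * n_years)
calories = (2.0 + alpha[country] - 0.5 * price + 0.8 * income
            + 0.1 * rng.normal(size=n_countries * n_years))

def within_transform(x, groups):
    """Subtract the group (country) mean from each observation."""
    sums = np.zeros(groups.max() + 1)
    np.add.at(sums, groups, x)
    counts = np.bincount(groups)
    return x - (sums / counts)[groups]

X = np.column_stack([within_transform(price, country),
                     within_transform(income, country)])
y = within_transform(calories, country)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("within estimates:", beta)   # close to (-0.5, 0.8)
```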

Relevance:

100.00%

Publisher:

Abstract:

A nonlinear general predictive controller (NLGPC) is described which is based on the use of a Hammerstein model within a recursive control algorithm. A key contribution of the paper is the use of a novel, one-step simple root solving procedure for the Hammerstein model, this being a fundamental part of the overall tuning algorithm. A comparison is made between NLGPC and nonlinear deadbeat control (NLDBC) using the same one-step nonlinear components, in order to investigate NLGPC advantages and disadvantages.
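
The root-solving idea can be illustrated on the static part of a Hammerstein model: once the linear predictive stage has produced the required intermediate signal, the plant input is recovered by solving the polynomial nonlinearity for a real root. The sketch below is a generic illustration with made-up polynomial coefficients, not the paper's tuning algorithm.

```python
import numpy as np

# Hammerstein static nonlinearity: x = f(u) = c0 + c1*u + c2*u**2 + c3*u**3
c = np.array([0.0, 1.0, 0.5, 0.1])   # illustrative coefficients c0..c3

def invert_nonlinearity(x_desired, coeffs):
    """Recover the plant input u such that f(u) = x_desired by root solving.

    Among the real roots, the one closest to zero (smallest control effort)
    is returned - one simple way to pick a unique solution.
    """
    poly = coeffs[::-1].copy()        # numpy.roots wants highest order first
    poly[-1] -= x_desired             # solve f(u) - x_desired = 0
    roots = np.roots(poly)
    real_roots = roots[np.abs(roots.imag) < 1e-9].real
    if real_roots.size == 0:
        raise ValueError("no real root: desired value not reachable")
    return real_roots[np.argmin(np.abs(real_roots))]

# The linear GPC stage asks for an intermediate signal x = 2.0; find u
u = invert_nonlinearity(2.0, c)
print(u, np.polyval(c[::-1], u))      # f(u) should equal 2.0
```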

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
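
Optimal Brain Damage ranks each weight by a saliency computed from a diagonal approximation of the loss Hessian, s_i = h_ii * w_i^2 / 2, and removes the least salient weights before retraining. The sketch below applies that ranking to a tiny least-squares model with a Gauss-Newton Hessian approximation; it is a simplified stand-in for the neural model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny linear model y = X w fitted by least squares (stand-in for a trained net)
X = rng.normal(size=(100, 6))
w_true = np.array([2.0, 0.0, -1.5, 0.0, 0.7, 0.0])
y = X @ w_true + 0.05 * rng.normal(size=100)
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# OBD saliency with a diagonal (Gauss-Newton) Hessian of the squared-error loss:
# H_ii ~ sum_n x_ni^2, saliency_i = 0.5 * H_ii * w_i^2
h_diag = np.sum(X ** 2, axis=0)
saliency = 0.5 * h_diag * w ** 2

# Prune the weights whose removal is predicted to increase the loss the least
n_prune = 3
prune_idx = np.argsort(saliency)[:n_prune]
w_pruned = w.copy()
w_pruned[prune_idx] = 0.0
print("pruned:", prune_idx, "remaining:", np.nonzero(w_pruned)[0])
```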

Relevance:

100.00%

Publisher:

Abstract:

Huntington’s disease (HD) is a fatal, neurodegenerative disease for which there is no known cure. Proxy evaluation is relevant for HD as its manifestation might limit the ability of persons to report their health-related quality of life (HrQoL). This study explored patient–proxy ratings of HrQoL of persons at different stages of HD, and examined factors that may affect proxy ratings. A total of 105 patient–proxy pairs completed the Huntington’s disease health-related quality of life questionnaire (HDQoL) and other established HrQoL measures (EQ-5D and SF-12v2). Proxy–patient agreement was assessed in terms of absolute level (mean ratings) and intraclass correlation. Proxies’ ratings were at a similar level to patients’ self-ratings on an overall Summary Score and on most of the six Specific Scales of the HDQoL. On the Specific Hopes and Worries Scale, proxies on average rated HrQoL as better than patients’ self-ratings, while on both the Specific Cognitive Scale and Specific Physical and Functional Scale proxies tended to rate HrQoL more poorly than patients themselves. The patient’s disease stage and mental wellbeing (SF-12 Mental Component scale) were the two factors that primarily affected proxy assessment. Proxy scores were strongly correlated with patients’ self-ratings of HrQoL, on the Summary Scale and all Specific Scales. The patient–proxy correlation was lower for patients at moderate stages of HD compared to patients at early and advanced stages. The proxy report version of the HDQoL is a useful complementary tool to self-assessment, and a promising alternative when individual patients with advanced HD are unable to self-report.

Relevance:

100.00%

Publisher:

Abstract:

A direct method is presented for determining the uncertainty in reservoir pressure, flow, and net present value (NPV) using the time-dependent, one phase, two- or three-dimensional equations of flow through a porous medium. The uncertainty in the solution is modelled as a probability distribution function and is computed from given statistical data for input parameters such as permeability. The method generates an expansion for the mean of the pressure about a deterministic solution to the system equations using a perturbation to the mean of the input parameters. Hierarchical equations that define approximations to the mean solution at each point and to the field covariance of the pressure are developed and solved numerically. The procedure is then used to find the statistics of the flow and the risked value of the field, defined by the NPV, for a given development scenario. This method involves only one (albeit complicated) solution of the equations and contrasts with the more usual Monte-Carlo approach where many such solutions are required. The procedure is applied easily to other physical systems modelled by linear or nonlinear partial differential equations with uncertain data.
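
The contrast with Monte Carlo can be seen on a much smaller problem: propagate the uncertainty of a single input parameter through a model by a first-order expansion about its mean, and compare with repeated sampling. The sketch below does this for a one-dimensional Darcy pressure drop with an invented permeability distribution; the paper's hierarchical equations for the field covariance are of course far more general.

```python
import numpy as np

rng = np.random.default_rng(3)

# Steady 1D Darcy flow: pressure drop dP = q * mu * L / (k * A)
q, A, mu, L = 1.0e-3, 1.0, 1.0e-3, 100.0
k_mean, k_std = 1.0e-12, 2.0e-13        # uncertain permeability (illustrative)

def pressure_drop(k):
    return q * mu * L / (k * A)

# First-order (perturbation) moments about the mean permeability
dP_mean_pert = pressure_drop(k_mean)
sensitivity = -q * mu * L / (k_mean ** 2 * A)    # d(dP)/dk at k_mean
dP_std_pert = abs(sensitivity) * k_std

# Monte Carlo reference: many forward solves instead of one expansion
k_samples = rng.normal(k_mean, k_std, size=200_000)
k_samples = k_samples[k_samples > 0]
dP_samples = pressure_drop(k_samples)
print(f"perturbation: mean={dP_mean_pert:.3e}, std={dP_std_pert:.3e}")
print(f"Monte Carlo : mean={dP_samples.mean():.3e}, std={dP_samples.std():.3e}")
```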

Relevance:

100.00%

Publisher:

Abstract:

This work presents two schemes for measuring the linear and angular kinematics of a rigid body using a kinematically redundant array of triple-axis accelerometers, with potential applications in biomechanics. A novel angular velocity estimation algorithm is proposed and evaluated that can compensate for angular velocity errors using measurements of the direction of gravity. Analysis and discussion of optimal sensor array characteristics are provided. A damped two-axis pendulum was used to excite all 6 DoF of a suspended accelerometer array through determined complex motion, and is the basis of both simulation and experimental studies. The relationship between accuracy and sensor redundancy is investigated for arrays of up to 100 triple-axis accelerometers (300 accelerometer axes) in simulation and 10 equivalent sensors (30 accelerometer axes) in the laboratory test rig. The paper also reports on the sensor calibration techniques and hardware implementation.
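
If the centripetal term is neglected (or handled separately), each accelerometer at body-fixed position r_i measures approximately a_o + alpha x r_i, which is linear in the frame acceleration a_o and the angular acceleration alpha, so a redundant array can be solved by least squares. The sketch below illustrates that simplified formulation only; it is not the paper's gravity-compensated angular velocity algorithm.

```python
import numpy as np

def skew(v):
    """Cross-product matrix: skew(v) @ x == np.cross(v, x)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_kinematics(positions, readings):
    """Least-squares estimate of frame acceleration a_o and angular
    acceleration alpha from a redundant accelerometer array, neglecting
    the centripetal (omega x (omega x r)) term for simplicity:
        a_i = a_o + alpha x r_i = a_o - skew(r_i) @ alpha
    """
    rows, rhs = [], []
    for r, a in zip(positions, readings):
        rows.append(np.hstack([np.eye(3), -skew(r)]))
        rhs.append(a)
    H = np.vstack(rows)                    # (3*n, 6) design matrix
    z = np.concatenate(rhs)
    x, *_ = np.linalg.lstsq(H, z, rcond=None)
    return x[:3], x[3:]                    # a_o, alpha

# Simulate a 10-sensor array undergoing known motion
rng = np.random.default_rng(4)
positions = rng.uniform(-0.1, 0.1, size=(10, 3))
a_o_true = np.array([0.2, -0.1, 9.81])
alpha_true = np.array([1.0, -2.0, 0.5])
readings = [a_o_true + np.cross(alpha_true, r) + 1e-3 * rng.normal(size=3)
            for r in positions]
a_o_est, alpha_est = estimate_kinematics(positions, readings)
print(a_o_est, alpha_est)
```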

Relevance:

100.00%

Publisher:

Abstract:

Ensemble-based data assimilation is rapidly proving itself as a computationally-efficient and skilful assimilation method for numerical weather prediction, which can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. In this paper, a data selection strategy that aims to assimilate only the observational components that matter most and that can be used with both stochastic and deterministic ensemble filters is presented. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias in the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data-selection procedure allows the use of larger localization domains that may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear and a nonlinear advection model using both in situ and remote sounding observations are discussed.
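
For reference, the analysis step of a stochastic ensemble Kalman filter with perturbed observations, to which a data-selection step could be prepended, looks roughly as follows. This is a textbook sketch with a linear observation operator, not the selection scheme proposed in the paper.

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, obs_err_std, rng):
    """Stochastic EnKF analysis step with perturbed observations.

    ensemble : (n_state, n_members) prior ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    """
    n_state, n_members = ensemble.shape
    X_mean = ensemble.mean(axis=1, keepdims=True)
    Xp = ensemble - X_mean                           # state perturbations
    Yp = H @ Xp                                      # observation-space perturbations

    Pxy = Xp @ Yp.T / (n_members - 1)
    Pyy = Yp @ Yp.T / (n_members - 1) + obs_err_std ** 2 * np.eye(len(obs))
    K = Pxy @ np.linalg.solve(Pyy, np.eye(len(obs)))  # Kalman gain

    # Perturb the observations for each member (stochastic filter)
    obs_pert = obs[:, None] + obs_err_std * rng.normal(size=(len(obs), n_members))
    return ensemble + K @ (obs_pert - H @ ensemble)

# Toy usage: 3-variable state, 2 observations, 20 members
rng = np.random.default_rng(5)
prior = rng.normal(size=(3, 20))
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
obs = np.array([0.5, -0.3])
posterior = enkf_analysis(prior, obs, H, obs_err_std=0.1, rng=rng)
print(posterior.mean(axis=1))
```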

Relevance:

100.00%

Publisher:

Abstract:

Cross-layer design is a generic designation for a set of efficient adaptive transmission schemes, across multiple layers of the protocol stack, that are aimed at enhancing the spectral efficiency and increasing the transmission reliability of wireless communication systems. In this paper, one such cross-layer design scheme that combines physical layer adaptive modulation and coding (AMC) with link layer truncated automatic repeat request (T-ARQ) is proposed for multiple-input multiple-output (MIMO) systems employing orthogonal space-time block coding (OSTBC). The performance of the proposed cross-layer design is evaluated in terms of achievable average spectral efficiency (ASE), average packet loss rate (PLR) and outage probability, for which analytical expressions are derived, considering transmission over two types of MIMO fading channels, namely, spatially correlated Nakagami-m fading channels and keyhole Nakagami-m fading channels. Furthermore, the effects of the maximum number of ARQ retransmissions, numbers of transmit and receive antennas, Nakagami fading parameter and spatial correlation parameters are studied and discussed based on numerical results and comparisons. Copyright © 2009 John Wiley & Sons, Ltd.

Relevance:

100.00%

Publisher:

Abstract:

NO2 measurements during 1990–2007, obtained from a zenith-sky spectrometer in the Antarctic, are analysed to determine the long-term changes in NO2. An atmospheric photochemical box model and a radiative transfer model are used to improve the accuracy of determination of the vertical columns from the slant column measurements, and to deduce the amount of NOy from NO2. We find that the NO2 and NOy columns in midsummer have large inter-annual variability superimposed on a broad maximum in 2000, with little or no overall trend over the full time period. These changes are robust to a variety of alternative settings when determining vertical columns from slant columns or determining NOy from NO2. They may signify similar changes in speed of the Brewer-Dobson circulation but with opposite sign, i.e. a broad minimum around 2000. Multiple regressions show significant correlation with solar and quasi-biennial-oscillation indices, and weak correlation with El Niño, but no significant overall trend, corresponding to an increase in Brewer-Dobson circulation of 1.4±3.5%/decade. There remains an unexplained cycle with an amplitude of at least 15% and a period of at least 17 years, with minimum speed in about 2000.
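
The multiple regression itself is a standard least-squares fit of the column amounts on a linear trend plus solar and quasi-biennial-oscillation indices. The sketch below shows the mechanics with synthetic placeholder indices and data; the real analysis uses the measured 1990-2007 record and observed proxy indices.

```python
import numpy as np

rng = np.random.default_rng(6)

years = np.arange(1990, 2008)
t = years - years.mean()

# Placeholder proxy indices (a real analysis would use observed solar/QBO records)
solar = np.sin(2 * np.pi * (years - 1990) / 11.0)     # ~11-year solar cycle
qbo = np.sin(2 * np.pi * (years - 1990) / 2.3)        # ~28-month QBO

# Synthetic midsummer NO2 column: solar + QBO signal, no trend, noise
no2 = 3.0 + 0.25 * solar + 0.15 * qbo + 0.05 * rng.normal(size=t.size)

# Design matrix: intercept, linear trend, solar index, QBO index
X = np.column_stack([np.ones_like(t), t, solar, qbo])
coef, *_ = np.linalg.lstsq(X, no2, rcond=None)

# Standard errors from the residual variance and (X^T X)^-1
resid = no2 - X @ coef
dof = t.size - X.shape[1]
cov = np.linalg.inv(X.T @ X) * (resid @ resid / dof)
se = np.sqrt(np.diag(cov))
for name, c, s in zip(["const", "trend/yr", "solar", "QBO"], coef, se):
    print(f"{name:>9}: {c:+.3f} +/- {s:.3f}")
```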

Relevance:

100.00%

Publisher:

Abstract:

The objective of this paper is to apply the mis-specification (M-S) encompassing perspective to the problem of choosing between linear and log-linear unit-root models. A simple M-S encompassing test, based on an auxiliary regression stemming from the conditional second moment, is proposed and its empirical size and power are investigated using Monte Carlo simulations. It is shown that by focusing on the conditional process the sampling distributions of the relevant statistics are well behaved under both the null and alternative hypotheses. The proposed M-S encompassing test is illustrated using US total disposable income quarterly data.

Relevance:

100.00%

Publisher:

Abstract:

We explore the mutual dependencies and interactions among different groups of species of the plankton population, based on an analysis of the long-term field observations carried out by our group on the north-west coast of the Bay of Bengal. The plankton community is structured into three groups of species, namely, non-toxic phytoplankton (NTP), toxic phytoplankton (TPP) and zooplankton. To find the pair-wise dependencies among the three groups of plankton, Pearson and partial correlation coefficients are calculated. To explore the simultaneous interaction among all three groups, a time series analysis is performed. Data points missing due to irregularities in sampling are estimated with an Expectation-Maximization (E-M) algorithm, and a Vector Auto-Regressive (VAR) model is fitted to the completed data set. The overall analysis demonstrates that toxin-producing phytoplankton play two distinct roles: the toxic substances inhibit consumption by zooplankton, reducing zooplankton abundance, and the toxic materials released by TPP significantly compensate for the competitive disadvantages among phytoplankton species. Our study suggests that the presence of TPP might be a possible cause for the generation of a complex interaction among the large number of phytoplankton and zooplankton species that might be responsible for the prolonged coexistence of the plankton species in a fluctuating biomass.
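
The VAR step can be sketched with statsmodels; in the illustration below, synthetic abundances stand in for the field observations, and simple interpolation stands in for the E-M imputation of missing samples.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)

# Synthetic weekly abundances for the three groups (stand-ins for field data)
n = 200
ntp = 10 + np.cumsum(rng.normal(0, 0.3, n))
tpp = 5 + 0.3 * ntp + rng.normal(0, 0.5, n)
zoo = 8 - 0.2 * tpp + 0.1 * ntp + rng.normal(0, 0.5, n)
data = pd.DataFrame({"NTP": ntp, "TPP": tpp, "Zooplankton": zoo})

# Knock out some observations to mimic irregular sampling, then impute.
# Linear interpolation is used here as a simple stand-in for the E-M step.
missing = rng.choice(n, size=15, replace=False)
data.iloc[missing] = np.nan
data = data.interpolate().bfill().ffill()

# Fit a VAR and inspect the estimated lag structure
model = VAR(data)
results = model.fit(maxlags=2, ic="aic")
print(results.summary())
```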

Relevance:

100.00%

Publisher:

Abstract:

Rhythms are manifested ubiquitously in dynamical biological processes. These fundamental processes, which are necessary for the survival of living organisms, include metabolism, breathing, heartbeat and, above all, the circadian rhythm coupled to the diurnal cycle. Thus, in mathematical biology, biological processes are often represented as linear or nonlinear oscillators. In the framework of nonlinear and dissipative systems (i.e. the flow of energy, substances, or sensory information), they generate stable internal oscillations as a response to environmental input and, in turn, utilise such output as a means of coupling with the environment.
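
A standard textbook example of such a nonlinear, dissipative oscillator is the van der Pol equation, which relaxes onto a stable limit cycle from very different initial conditions. The short simulation below is a generic illustration, not tied to any particular biological system discussed here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def van_der_pol(t, state, mu=2.0):
    """Nonlinear, dissipative oscillator with a stable limit cycle."""
    x, v = state
    return [v, mu * (1.0 - x ** 2) * v - x]

# Two very different initial conditions converge to the same rhythm
sol_a = solve_ivp(van_der_pol, (0.0, 40.0), [0.1, 0.0], max_step=0.01)
sol_b = solve_ivp(van_der_pol, (0.0, 40.0), [4.0, -2.0], max_step=0.01)

# Compare the late-time oscillation amplitudes (both approach ~2)
print(np.max(np.abs(sol_a.y[0][sol_a.t > 30])),
      np.max(np.abs(sol_b.y[0][sol_b.t > 30])))
```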