28 results for Computational Intelligence in data-driven and hybrid Models and Data Analysis

in Aston University Research Archive


Relevance: 100.00%

Abstract:

A dry matrix application for matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI MSI) was used to profile the distribution of 4-bromophenyl-1,4-diazabicyclo(3.2.2)nonane-4-carboxylate, monohydrochloride (BDNC, SSR180711) in rat brain tissue sections. Matrix application involved applying layers of finely ground dry alpha-cyano-4-hydroxycinnamic acid (CHCA) to the surface of tissue sections thaw-mounted onto MALDI targets. It was not possible to detect the drug when the matrix was applied in a standard aqueous-organic solvent solution. The drug was detected at higher concentrations in specific regions of the brain, particularly the white matter of the cerebellum. Pseudomultiple reaction monitoring imaging was used to validate that the observed distribution was the target compound. The semiquantitative data obtained from signal intensities in the imaging were confirmed by laser microdissection of specific regions of the brain, directed by the imaging, followed by hydrophilic interaction chromatography in combination with a quantitative high-resolution mass spectrometry method. This study illustrates that a dry matrix coating is a valuable and complementary matrix application method for the analysis of small polar drugs and metabolites and can be used for semiquantitative analysis.

Relevance: 100.00%

Abstract:

Signal integration determines cell fate at the cellular level, affects cognitive processes and affective responses at the behavioural level, and is likely to be involved in the psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subject to time effects. Time-dependencies of interactions between stimuli typically lead to complex cell responses and complex responses at the behavioural level. We show that both three-factor models and time series models can be used to uncover such time-dependencies. However, we argue that for short longitudinal data the three-factor modelling approach is more suitable. To illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney 293 (HEK293) cells the interaction effect in the regulation of extracellular signal-regulated kinase (ERK) 1 signalling activation by insulin and epidermal growth factor is subject to a time effect and decays dramatically at peak values of ERK activation. In contrast, we found that the interaction effect induced by hypoxia and tumour necrosis factor-alpha on the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time-invariant, at least in the first 12-h window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies. In these studies, memory storage was found to be subject to an interaction effect of the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor / glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered in a critical time window.
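The three-factor approach referred to above can be illustrated with a short simulation. The Python sketch below fits a linear model with a stimulus-by-stimulus-by-time interaction to synthetic data in which the interaction decays over time; the variable names (stim_a, stim_b, time, response), the simulated design and the use of statsmodels are assumptions for illustration, not the authors' implementation.

```python
# A minimal sketch of a three-factor analysis in which the interaction between two
# stimuli is allowed to depend on time. All names and data are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Fully crossed 2 x 2 x 2 design: stimulus A, stimulus B, early/late time point.
design = pd.DataFrame(
    [(a, b, t) for a in (0, 1) for b in (0, 1) for t in (0, 1) for _ in range(20)],
    columns=["stim_a", "stim_b", "time"],
)

# Simulated response: the A x B interaction is present early (time=0) and decays later.
interaction = 2.0 * design.stim_a * design.stim_b * (1 - design.time)
design["response"] = (
    design.stim_a + design.stim_b + interaction + rng.normal(scale=0.5, size=len(design))
)

# The three-way term stim_a:stim_b:time tests whether the interaction changes over time.
model = smf.ols("response ~ stim_a * stim_b * time", data=design).fit()
print(model.summary().tables[1])
```

The p-value of the stim_a:stim_b:time coefficient is what indicates whether the two-stimulus interaction is time-dependent.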

Relevance: 100.00%

Abstract:

This paper explains some drawbacks of previous approaches to detecting influential observations in deterministic nonparametric data envelopment analysis models, as developed by Yang et al. (Annals of Operations Research 173:89-103, 2010). For example, the efficiency scores and relative entropies obtained in that model are uninformative for outlier detection, and the empirical distribution of all estimated relative entropies is not a Monte Carlo approximation. In this paper we develop a new method to detect whether a specific decision-making unit (DMU) is truly influential, and a statistical test is applied to determine the significance level. An application measuring the efficiency of hospitals is used to show the superiority of this method, which leads to significant advancements in outlier detection. © 2014 Springer Science+Business Media New York.
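For readers unfamiliar with DEA, the sketch below shows the standard input-oriented CCR efficiency score that influence-detection methods of this kind build on. It is a generic illustration with made-up data, not the method proposed in the paper; the function name and the use of scipy's linprog are assumptions.

```python
# A minimal sketch of the input-oriented CCR multiplier model solved as a linear
# programme. The five DMUs and their inputs/outputs are hypothetical.
import numpy as np
from scipy.optimize import linprog


def ccr_efficiency(X, Y, o):
    """CCR efficiency of DMU o. X: (n, m) inputs, Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s) followed by input weights v (length m).
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u . y_o
    A_ub = np.hstack([Y, -X])                          # u . y_j - v . x_j <= 0 for every DMU j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v . x_o = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                    # efficiency score in (0, 1]


# Hypothetical data: 5 DMUs (e.g. hospitals), 2 inputs, 2 outputs.
X = np.array([[20, 150], [19, 130], [25, 160], [27, 168], [22, 158]], dtype=float)
Y = np.array([[100, 90], [150, 50], [160, 55], [180, 72], [94, 66]], dtype=float)
for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```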

Relevance: 100.00%

Abstract:

The profusion of performance measurement models suggested by the management accounting literature in the 1990s is one illustration of the substantial changes in management accounting teaching materials since the publication of “Relevance Lost” in 1987. At the same time, in the general context of increasing competition and globalisation, it is widely thought that national cultural differences are tending to disappear, meaning that the management techniques used in large companies, including performance measurement and management systems (PMS), tend to be the same irrespective of the company's nationality or location. North American management practice is traditionally described as a contractually based model, mainly focused on financial performance information and measures (FPMs) and more shareholder-focused than that of French companies. Within France, the literature has historically defined performance as broadly multidimensional, driven by the idea that there are no universal rules of management and that efficient management takes local culture and traditions into account. As opposed to their North American counterparts, French companies are pressured more by the financial institutions that fund them than by capital markets. They therefore pay greater attention to the long term because they are not subject to quarterly capital market objectives. Hence, management in France should rely on longer-term, more qualitative, less financial and more multidimensional information to assess performance than their North American counterparts. The objective of this research is to investigate whether the practices of large French and US companies have changed in the way the textbooks have changed with regard to performance measurement and management, or whether cultural differences still drive differences in performance measurement and management between them. The research findings support the idea that large US and French companies share the same PMS features, influenced by ‘universal’ PM models.

Relevance: 100.00%

Abstract:

Recent experimental studies have shown that development towards adult performance levels in configural processing in object recognition is delayed through middle childhood. Whilst part changes to animal and artefact stimuli are processed with close-to-adult levels of accuracy from 7 years of age, relative-size changes to stimuli result in a significant decrease in relative performance for participants aged between 7 and 10. Two sets of computational experiments were run using adult and 'immature' versions of the JIM3 artificial neural network to simulate these results. One set progressively decreased the number of neurons involved in the representation of view-independent metric relations within multi-geon objects. A second set of computational experiments decreased the number of neurons that represent view-dependent (nonrelational) object attributes in JIM3's Surface Map. The simulation results showing the best qualitative match to the empirical data occurred when the artificial neurons representing metric-precision relations were entirely eliminated. These results therefore provide further evidence for the late development of relational processing in object recognition and suggest that children in middle childhood may recognise objects without forming structural-description representations.

Relevance: 100.00%

Abstract:

Data visualization algorithms and feature selection techniques are both widely used in bioinformatics but as distinct analytical approaches. Until now there has been no method of measuring feature saliency while training a data visualization model. We derive a generative topographic mapping (GTM) based data visualization approach which estimates feature saliency simultaneously with the training of the visualization model. The approach not only provides a better projection by modeling irrelevant features with a separate noise model but also gives feature saliency values which help the user to assess the significance of each feature. We compare the quality of projection obtained using the new approach with the projections from traditional GTM and self-organizing maps (SOM) algorithms. The results obtained on a synthetic and a real-life chemoinformatics dataset demonstrate that the proposed approach successfully identifies feature significance and provides coherent (compact) projections. © 2006 IEEE.

Relevance: 100.00%

Abstract:

Theory suggests that people fear the unknown and that, no matter how experienced one is, feelings of anxiety and uncertainty, if not managed well, affect how we view ourselves and how others view us. Hence, it is in human nature to engage in activities that help decipher behaviours that seem contrary to one's beliefs and hinder the smooth flow of work and daily activities. Building on these arguments, this research investigates the two types of support provided by multinational corporations (MNCs) and host country nationals (HCNs) to expatriates and their family members whilst on international assignments in Malaysia, as antecedents of their adjustment and performance in the host country. To complement the support provided, cultural intelligence (CQ) is investigated to explain the influence of cultural elements in facilitating the adjustment and performance of the relocating families, especially their social integration into the host country. This research aims to investigate the influence of support and CQ on the adjustment and performance of expatriates in Malaysia. Path analyses are used to test the hypothesised relationships. The findings substantiate the pivotal roles that MNCs and HCNs play in helping expatriates and their families acclimatise to the host country. This corroborates the norm of reciprocity, whereby assistance or support rendered, especially at times when it is crucially needed, is reciprocated with positive behaviour deemed of equal value. Additionally, CQ is significantly positive in enhancing adjustment to the host country, which highlights the vital role that cultural awareness and knowledge play in enhancing effective intercultural communication and better execution of contextual performance. The research highlights the interdependence of the expatriates' multiple stakeholders (i.e. MNCs, HCNs, family members) in supporting the expatriates whilst on assignment. Finally, the findings reveal that the expatriate families influence how the locals view them and can be a great asset in initiating future communication between the expatriates and HCNs. The research contributes to the fields of intercultural adjustment and communication and also has key messages for policy makers.

Relevance: 100.00%

Abstract:

The investigation of insulation debris transport, sedimentation, penetration into the reactor core and head-loss build-up is becoming important to reactor safety research for PWRs and BWRs when considering the long-term behaviour of emergency core cooling systems during loss-of-coolant accidents. Research projects are being performed in cooperation between the University of Applied Sciences Zittau/Görlitz and the Helmholtz-Zentrum Dresden-Rossendorf. The projects include experimental investigations of different processes and phenomena of insulation debris in coolant flow and the development of CFD models. Generic complex experiments serve to build up a database for the validation of models of single effects and their coupling in CFD codes. This paper includes a description of the experimental facility for complex generic experiments (ZSW), an overview of the experimental boundary conditions, and results for upstream and downstream phenomena as well as for the long-term behaviour due to corrosive processes. © Carl Hanser Verlag, München.

Relevance: 100.00%

Abstract:

Visualization of high-dimensional data has always been a challenging task. Here we discuss and propose variants of non-linear data projection methods (Generative Topographic Mapping (GTM) and GTM with simultaneous feature saliency (GTM-FS)) that are adapted to be effective on very high-dimensional data. The adaptations use log-space values at certain steps of the Expectation Maximization (EM) algorithm and during the visualization process. We have tested the proposed algorithms by visualizing electrostatic potential data for Major Histocompatibility Complex (MHC) class-I proteins. The experiments show that the variants of the original GTM and GTM-FS worked successfully with data of more than 2000 dimensions, and we compare the results with other linear/nonlinear projection methods: Principal Component Analysis (PCA), Neuroscale (NSC) and the Gaussian Process Latent Variable Model (GPLVM).
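The log-space adaptation mentioned above can be illustrated with a minimal sketch of an E-step computed entirely in log space, using scipy's logsumexp to normalise the responsibilities without underflow. The function and variable names are illustrative; this is not the authors' GTM code.

```python
# A minimal sketch of computing E-step responsibilities of a Gaussian-mixture-like
# latent model (as in GTM) in log space, so that very high-dimensional data do not
# underflow. Names and data are illustrative.
import numpy as np
from scipy.special import logsumexp


def log_responsibilities(data, centres, beta):
    """data: (N, D) points, centres: (K, D) mixture centres, beta: inverse variance."""
    # Squared distances between every data point and every centre: shape (N, K).
    sq_dist = ((data[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    # Unnormalised log component densities (constant terms cancel in the normalisation).
    log_p = -0.5 * beta * sq_dist
    # Normalise in log space; exponentiating first would underflow for large D.
    return log_p - logsumexp(log_p, axis=1, keepdims=True)


rng = np.random.default_rng(0)
data = rng.normal(size=(5, 2500))       # e.g. >2000-dimensional data, as in the abstract
centres = rng.normal(size=(10, 2500))
R = np.exp(log_responsibilities(data, centres, beta=1.0))
print(R.sum(axis=1))                    # each row sums to 1
```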

Relevance: 100.00%

Abstract:

Levels of lignin and hydroxycinnamic acid wall components in three genera of forage grasses (Lolium, Festuca and Dactylis) have been accurately predicted by Fourier-transform infrared spectroscopy using partial least squares models correlated to analytical measurements. Different models were derived that predicted the concentrations of acid detergent lignin, total hydroxycinnamic acids, total ferulate monomers plus dimers, p-coumarate and ferulate dimers in independent spectral test data from methanol-extracted samples of perennial forage grass with accuracies of 92.8%, 86.5%, 86.1%, 59.7% and 84.7% respectively, and analysis of model projection scores showed that the models relied generally on spectral features that are known absorptions of these compounds. Acid detergent lignin was predicted in samples of two species of energy grass (Phalaris arundinacea and Panicum virgatum) with an accuracy of 84.5%.
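As a generic illustration of the modelling approach described above (not the published calibration), the sketch below fits a partial least squares model to synthetic "spectra" whose band intensities are tied to a synthetic concentration, using scikit-learn's PLSRegression; all data and parameters are made up.

```python
# A minimal sketch of predicting a wall-component concentration from spectra with a
# partial least squares model. The spectra and concentrations are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic spectra: 200 samples x 600 wavenumbers, with two absorption bands whose
# intensities scale with the (synthetic) lignin concentration.
n_samples, n_wavenumbers = 200, 600
concentration = rng.uniform(1.0, 10.0, size=n_samples)
wavenumbers = np.arange(n_wavenumbers)
bands = (np.exp(-0.5 * ((wavenumbers - 150) / 10.0) ** 2)
         + np.exp(-0.5 * ((wavenumbers - 420) / 15.0) ** 2))
spectra = concentration[:, None] * bands[None, :] \
          + rng.normal(scale=0.05, size=(n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, concentration, random_state=0)
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
print(f"R^2 on held-out spectra: {pls.score(X_test, y_test):.3f}")
```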

Relevance: 100.00%

Abstract:

In some studies, the data are not measurements but comprise counts or frequencies of particular events. In such cases, an investigator may be interested in whether one specific event happens more frequently than another or whether an event occurs with a frequency predicted by a scientific model.
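A minimal worked example of this situation is a chi-square goodness-of-fit test, which compares observed event counts with the counts predicted by a model; the counts and model probabilities below are made up.

```python
# A minimal sketch: do observed event counts depart from model-predicted frequencies?
from scipy.stats import chisquare

observed = [48, 35, 17]                        # how often each of three events occurred
expected = [0.5 * 100, 0.3 * 100, 0.2 * 100]   # counts predicted by a hypothetical model

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
```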

Relevance: 100.00%

Abstract:

PCA/FA is a method of analyzing complex data sets in which there are no clearly defined X or Y variables. It has multiple uses, including the study of the pattern of variation between individual entities such as patients with particular disorders, and the detailed study of descriptive variables. In most applications, variables are related to a smaller number of ‘factors’ or PCs that account for the maximum variance in the data and hence may explain important trends among the variables. An increasingly important application of the method is in the ‘validation’ of questionnaires that attempt to relate subjective aspects of a patient's experience to more objective measures of vision.
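As a generic sketch of this use of PCA (not tied to any particular questionnaire), the example below reduces eight correlated, simulated questionnaire items to a few components and reports the variance each component accounts for.

```python
# A minimal sketch of PCA on simulated questionnaire-style data driven by two
# underlying factors. All data are illustrative.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# 100 "patients" answering 8 items generated from 2 underlying factors plus noise.
factors = rng.normal(size=(100, 2))
loadings = rng.normal(size=(2, 8))
responses = factors @ loadings + rng.normal(scale=0.3, size=(100, 8))

pca = PCA(n_components=3)
scores = pca.fit_transform(responses)
print("Variance explained by each component:", np.round(pca.explained_variance_ratio_, 3))
```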

Relevance: 100.00%

Abstract:

This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable, or independent, variables in the cost expressions that are minimised. The four systems considered are referred to as (Q,R), (nQ,R,T), (M,T) and (M,R,T). With (Q,R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus stock on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ,R,T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M,T), each order increases the order cover to M. Finally, in (M,R,T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q,R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models can be compared with the increases in computational costs. Since the exact model was preferable for the (Q,R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to follow a probability distribution. All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
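The (Q,R) policy described above can be illustrated with a small discrete-time simulation that estimates the average cost per period. The thesis develops analytical cost models rather than simulation, so the sketch below, including its Poisson demand and cost parameters, is only an assumed illustration of how the policy operates.

```python
# A minimal sketch of the (Q,R) policy: order a fixed quantity Q whenever order cover
# (stock on hand plus on order) falls to R or below; unmet demand is backordered.
# Demand, lead time and cost parameters are illustrative assumptions.
import numpy as np


def simulate_qr(Q, R, periods=10_000, mean_demand=5, lead_time=3,
                order_cost=50.0, holding_cost=1.0, backorder_cost=10.0, seed=0):
    rng = np.random.default_rng(seed)
    inventory = Q             # net stock (negative values represent backorders)
    pipeline = []             # outstanding orders as (arrival_period, quantity)
    total_cost = 0.0
    for t in range(periods):
        # Receive any orders due this period.
        arrived = sum(q for due, q in pipeline if due == t)
        pipeline = [(due, q) for due, q in pipeline if due != t]
        inventory += arrived
        # Demand is backordered if stock runs out.
        inventory -= rng.poisson(mean_demand)
        # Reorder on the order cover, not just on-hand stock.
        cover = inventory + sum(q for _, q in pipeline)
        if cover <= R:
            pipeline.append((t + lead_time, Q))
            total_cost += order_cost
        total_cost += holding_cost * max(inventory, 0) + backorder_cost * max(-inventory, 0)
    return total_cost / periods


print(f"Average cost per period: {simulate_qr(Q=40, R=25):.2f}")
```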

Relevance: 100.00%

Abstract:

This thesis focused on theoretical models of synchronization applied to cortical dynamics as measured by magnetoencephalography (MEG). Dynamical systems theory was used both in identifying relevant variables for brain coordination and in devising methods for their quantification. We presented a method for studying interactions of linear and chaotic neuronal sources using MEG beamforming techniques. We showed that such sources can be accurately reconstructed in terms of their location, temporal dynamics and possible interactions. Synchronization in low-dimensional nonlinear systems was studied to explore specific correlates of functional integration and segregation. In the case of interacting dissimilar systems, relevant coordination phenomena involved generalized and phase synchronization, which were often intermittent. Spatially extended systems were then studied. For locally coupled dissimilar systems, as in the case of cortical columns, clustering behaviour occurred. Synchronized clusters emerged at different frequencies and their boundaries were marked by oscillation death. The macroscopic mean field revealed sharp spectral peaks at the frequencies of the clusters and broader spectral drops at their boundaries. These results question existing models of Event Related Synchronization and Desynchronization. We re-examined the concept of the steady-state evoked response following an AM stimulus. We showed that very little variability in the AM following response could be accounted for by system noise. We presented a methodology for detecting local and global nonlinear interactions from MEG data in order to account for residual variability. We found cross-hemispheric nonlinear interactions of ongoing cortical rhythms concurrent with the stimulus, and interactions of these rhythms with the following AM responses. Finally, we hypothesized that holistic spatial stimuli would be accompanied by the emergence of clusters in primary visual cortex, resulting in frequency-specific MEG oscillations. Indeed, we found different frequency distributions in induced gamma oscillations for different spatial stimuli, which is suggestive of temporal coding of these spatial stimuli. Further, we addressed the bursting character of these oscillations, which is suggestive of intermittent nonlinear dynamics. However, we did not observe the characteristic -3/2 power-law scaling in the distribution of interburst intervals, and this distribution was only seldom significantly different from the one obtained in surrogate data in which nonlinear structure was destroyed. In conclusion, the work presented in this thesis suggests that advances in dynamical systems theory, in conjunction with developments in magnetoencephalography, may facilitate a mapping between levels of description in the brain. This may potentially represent a major advancement in neuroscience.
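One synchronization measure relevant to this kind of analysis, phase synchronization, can be illustrated with a short sketch that estimates the phase-locking value between two oscillatory signals via the Hilbert transform. The signals and parameters below are assumptions standing in for band-limited MEG source time courses, not the thesis data or methods.

```python
# A minimal sketch of the phase-locking value (PLV) between two synthetic oscillations.
import numpy as np
from scipy.signal import hilbert

fs = 500.0                               # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Two 10 Hz oscillations with a consistent phase offset plus independent noise.
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.normal(size=t.size)

# Instantaneous phases from the analytic signals.
phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))

# PLV close to 1 means a consistently locked phase difference; close to 0 means none.
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"Phase-locking value: {plv:.3f}")
```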

Relevance: 100.00%

Abstract:

This paper introduces a new mathematical method for improving the discrimination power of data envelopment analysis and for completely ranking the efficient decision-making units (DMUs). A fuzzy concept is utilised. For this purpose, all DMUs are first evaluated with the CCR model. Thereafter, the resulting weights for each output are considered as fuzzy sets and are then converted to fuzzy numbers. The introduced model is a multi-objective linear model whose endpoints are the highest and lowest of the weighted values. An added advantage of the model is its ability to handle the infeasibility situation sometimes faced by previously introduced models.
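As a rough, assumed illustration of the fuzzy-number step mentioned above (and not the paper's multi-objective model), the sketch below summarises the CCR output weights observed across DMUs as triangular fuzzy numbers and compares hypothetical DMUs by a simple centroid defuzzification; every value and function name here is made up for illustration.

```python
# A minimal sketch of converting observed CCR output weights into triangular fuzzy
# numbers and ranking DMUs by a centroid-defuzzified weighted-output score.
import numpy as np


def triangular(values):
    """Summarise observed values as a triangular fuzzy number (low, mode, high)."""
    v = np.asarray(values, dtype=float)
    return v.min(), np.median(v), v.max()


def centroid(tfn):
    """Centroid (defuzzified value) of a triangular fuzzy number."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0


# Hypothetical CCR weights for two outputs, one value per evaluated DMU.
weights_output_1 = triangular([0.12, 0.18, 0.15, 0.20])
weights_output_2 = triangular([0.05, 0.09, 0.07, 0.08])

# Hypothetical output levels of three efficient DMUs to be ranked.
outputs = {"DMU 1": (10.0, 30.0), "DMU 2": (14.0, 22.0), "DMU 3": (9.0, 35.0)}

for name, (y1, y2) in outputs.items():
    # Weighted output as a triangular fuzzy number (componentwise, since weights >= 0).
    score = tuple(y1 * w1 + y2 * w2 for w1, w2 in zip(weights_output_1, weights_output_2))
    print(f"{name}: fuzzy score {tuple(round(s, 2) for s in score)}, centroid {centroid(score):.2f}")
```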