945 results for mean field independent component analysis


Relevance:

100.00%

Publisher:

Abstract:

A novel methodology for damage detection and location in structures is proposed. The methodology is based on strain measurements and consists of the development of strain field pattern recognition techniques. These techniques are based on principal component analysis (PCA) and damage indices (T² and Q). We propose the use of fiber Bragg gratings (FBGs) as strain sensors.
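The PCA-with-damage-indices scheme described in this abstract can be sketched as follows; this is a minimal illustration with made-up strain data and a hypothetical sensor count, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical strain-field data: rows are load cases, columns are FBG sensors
# (made-up values; the paper's actual measurements are not reproduced here).
baseline = rng.normal(0.0, 1.0, size=(200, 8))            # healthy state
damaged = baseline[:5].copy()
damaged[:, 3] += 3.0                                      # simulated local damage

# PCA model of the healthy strain field.
mean = baseline.mean(axis=0)
Xc = baseline - mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                                     # retained components
P = Vt[:k].T                                              # loading matrix
lam = (s[:k] ** 2) / (len(baseline) - 1)                  # component variances

def damage_indices(x):
    """Hotelling T^2 (in-model) and Q (residual) damage indices."""
    xc = x - mean
    t = xc @ P                                            # scores on retained PCs
    T2 = np.sum(t ** 2 / lam, axis=-1)
    Q = np.sum((xc - t @ P.T) ** 2, axis=-1)              # squared residual norm
    return T2, Q

T2_h, Q_h = damage_indices(baseline)
T2_d, Q_d = damage_indices(damaged)   # Q typically inflates under damage
```

A sample whose Q index exceeds a threshold fitted on the healthy baseline would be flagged as damaged.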

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE Epilepsy is increasingly considered as the dysfunction of a pathologic neuronal network (epileptic network) rather than of a single focal source. We aimed to assess the interactions between the regions that comprise the epileptic network and to investigate their dependence on the occurrence of interictal epileptiform discharges (IEDs). METHODS We analyzed resting-state simultaneous electroencephalography-functional magnetic resonance imaging (EEG-fMRI) recordings in 10 patients with drug-resistant focal epilepsy with multifocal IED-related blood oxygen level-dependent (BOLD) responses and a maximum t-value in the IED field. We computed functional connectivity (FC) maps of the epileptic network using two types of seed: (1) a 10-mm diameter sphere centered on the global maximum of the IED-related BOLD map, and (2) the independent component with the highest correlation to the IED-related BOLD map, named the epileptic component. For both approaches, we compared FC maps before and after regressing out the effect of IEDs in terms of maximum and mean t-values and percentage of map overlap. RESULTS Maximum and mean FC map t-values were significantly lower after regressing out IEDs at the group level (p < 0.01). Overlap extent was 85% ± 12% and 87% ± 12% when the seed was the 10-mm diameter sphere and the epileptic component, respectively. SIGNIFICANCE Regions involved in a specific epileptic network show coherent BOLD fluctuations independent of scalp EEG IEDs. FC topography and strength are largely preserved after removing the IED effect. This could represent the signature of a sustained pathologic network with contributions from epileptic activity invisible to the scalp EEG.
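A seed-based FC map of the kind described here can be sketched as follows; the data, voxel counts and seed choice are all made up for illustration and stand in for real preprocessed BOLD time series.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy BOLD matrix (time points x voxels) with made-up values: the first ten
# voxels share a common fluctuation standing in for the epileptic network.
n_t, n_vox = 300, 50
common = rng.normal(size=n_t)
data = rng.normal(size=(n_t, n_vox))
data[:, :10] += 2.0 * common[:, None]

# Seed time course: mean signal over a small voxel set, standing in for the
# 10-mm sphere centred on the global maximum of the IED-related BOLD map.
seed = data[:, :5].mean(axis=1)

def fc_map(ts, seed_ts):
    """Pearson correlation of every voxel with the seed time course."""
    tsz = (ts - ts.mean(0)) / ts.std(0)
    sz = (seed_ts - seed_ts.mean()) / seed_ts.std()
    return (tsz * sz[:, None]).mean(0)

r = fc_map(data, seed)   # network voxels correlate strongly with the seed
```

Regressing the IED event regressor out of each time course before calling `fc_map` would mimic the paper's before/after comparison.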

Relevance:

100.00%

Publisher:

Abstract:

Changes in surface water hydrography in the Southern Ocean (eastern Atlantic sector) were reconstructed on the basis of isotope-geochemical and micropaleontological studies. A total of 75 high-quality multicorer sediment surface samples from the southern South Atlantic Ocean and three Quaternary sediment cores, taken on a meridional transect across the Antarctic Circumpolar Current, have been investigated. The stable oxygen isotope compositions of 24 foraminiferal species and morphotypes were compared to the near-surface hydrography. The different foraminifera have been divided into four groups living at different depths in the upper water column. The δ18O differences between shallow-living (e.g. G. bulloides, N. pachyderma) and deeper-dwelling (e.g. G. inflata) species reflect the measured temperature gradient of the upper 250 m of the water column. Thus, the δ18O difference between shallow-living and deeper-living foraminifera can be used as an indicator of the vertical temperature gradient in the surface water of the Antarctic Circumpolar Current that is independent of ice volume. All planktonic foraminifera in the surface sediment samples have been counted, and 27 species and morphotypes were selected to form a reference data set for statistical purposes. Using R- and Q-mode principal component analysis, these planktonic foraminifera have been divided into four and five assemblages, respectively. The geographic distribution of these assemblages is mainly linked to the temperature of sea-surface waters. The five assemblages (factors) of the Q-mode principal component analysis account for 97.1% of the variance of the original data. Following the transfer-function technique, a multiple regression between the Q-mode factors and the modern mean sea-surface environmental parameters resulted in a set of equations.
The new transfer function can be used to estimate past seasonal sea-surface temperatures for paleoassemblages of planktonic foraminifera with a precision of approximately ±1.2°C. This transfer function, F75-27-5, encompasses in particular the environmental conditions in the Atlantic sector of the Antarctic Circumpolar Current. During the last 140,000 years, reconstructed sea-surface temperatures fluctuated in the present northern Subantarctic Zone (PS2076-1/3) with an amplitude of up to 7.5°C in summer and up to 8.5°C in winter. In the present Polar Frontal Zone (PS1754-1), these glacial-interglacial fluctuations were smaller, ranging from 2.5 to 8.5°C in summer and from 1.0 to 5.0°C in winter. Compared to today, calculated oxygen isotope temperature gradients in the present Subantarctic Zone were lower during the last 140,000 years, indicating good mixing of the upper water column. In the Polar Frontal Zone, lower oxygen isotope temperature gradients were also found for glacials 6, 4 and 2, but gradients similar to today's were found during interglacial stages 5 and 3 and the Holocene, implying upper-water-column mixing comparable to the present. Paleosalinities were reconstructed by combining δ18O data with the transfer-function paleotemperatures. Especially in the present Polar Frontal Zone (PS1754-1) and in the Antarctic Zone (PS1768-8), a short-term salinity reduction of up to 4 ‰ could be detected. This significant reduction in sea-surface salinity indicates an increased influx of meltwater at the beginning of deglaciation in the Southern Hemisphere at the end of the last glacial, approximately 16,500-13,000 years ago. The reconstruction of environmental parameters indicates only small changes in the position of the frontal systems in the eastern sector of the Antarctic Circumpolar Current during the last 140,000 years.
The average positions of the Subtropical Front and the Subantarctic Front shifted by approximately three degrees of latitude between interglacials and glacials, and the Antarctic Polar Front by approximately four degrees. However, the reconstruction of cold sea-surface temperatures at 41°S during oxygen isotope stages 16 and 14 to 12 implies substantial departures from this scenario: during these times the Subtropical Front was probably shifted up to seven degrees of latitude northwards.
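The Q-mode principal component / transfer-function workflow summarized above can be sketched as follows; the counts and SST values are invented, and the plain SVD factoring is an illustrative stand-in for the classical procedure (which additionally rotates the factors).

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for the core-top reference data set: 75 samples x 27
# species relative abundances, plus a modern summer SST per sample (made-up).
n_samples, n_species, n_factors = 75, 27, 5
counts = rng.random((n_samples, n_species))
sst = rng.uniform(0.0, 20.0, n_samples)

# Q-mode analysis: scale each sample (row) to unit length, then factor the
# sample-by-sample structure via SVD.
rows = counts / np.linalg.norm(counts, axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(rows, full_matrices=False)
loadings = U[:, :n_factors] * s[:n_factors]          # sample loadings on 5 factors
explained = (s[:n_factors] ** 2).sum() / (s ** 2).sum()

# Transfer-function step: multiple regression of the environmental parameter
# on the factor loadings, giving an equation applicable to fossil assemblages.
Xr = np.column_stack([np.ones(n_samples), loadings])
coef, *_ = np.linalg.lstsq(Xr, sst, rcond=None)
rmse = np.sqrt(np.mean((sst - Xr @ coef) ** 2))
```

A downcore (fossil) assemblage would be projected onto the same factors and pushed through `coef` to obtain a paleotemperature estimate.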

Relevance:

100.00%

Publisher:

Abstract:

Adsorption of pure nitrogen, argon, acetone, chloroform and an acetone-chloroform mixture on graphitized thermal carbon black is considered at sub-critical conditions by means of molecular layer structure theory (MLST). In the present version of the MLST, an adsorbed fluid is considered as a sequence of 2D molecular layers, whose Helmholtz free energies are obtained directly from the analysis of experimental adsorption isotherms of the pure components. The interaction of the nearest layers is accounted for in the framework of a mean field approximation. This approach allows quantitative correlation of experimental nitrogen and argon adsorption isotherms both in the monolayer region and in the range of multi-layer coverage up to 10 molecular layers. In the case of acetone and chloroform the approach also leads to excellent quantitative correlation of adsorption isotherms, while molecular approaches such as the non-local density functional theory (NLDFT) fail to describe those isotherms. We extend our new method to calculate the Helmholtz free energy of an adsorbed mixture using a simple mixing rule, which allows us to predict mixture adsorption isotherms from pure-component adsorption isotherms. The approach, which accounts for the difference in composition between molecular layers, is tested against experimental data for acetone-chloroform mixture (a non-ideal mixture) adsorption on graphitized thermal carbon black at 50 °C. (C) 2005 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Guest editorial. Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and is co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at the Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector. This special issue focuses on holistic, applied research on performance measurement in energy sector management, aiming to bridge the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates, and to provide bias-corrected estimations and confidence intervals for the point estimates. The author reveals from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of the oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple criteria analysis weighting approach to evaluate energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated into the evaluation process through an interactive means with verbal, numerical, and visual representation of their preferences. A total of 14 evaluation criteria were considered and classified into four objectives: climate change mitigation, energy effectiveness, socioeconomics, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant changes in a firm's efficiency following an acquisition, and only weak evidence for efficiency improvements caused by the new shareholder. The author also discovers that parent companies appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
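The DEA efficiency scores used throughout these papers reduce, in the single-input/single-output case, to normalizing each unit's output/input ratio by the best observed ratio; here is a toy sketch with made-up numbers (the general multi-input, multi-output case is instead solved as one linear program per unit).

```python
import numpy as np

# Hypothetical single-input, single-output example (illustrative numbers only):
# CCR efficiency in this simple case is the unit's output/input ratio divided
# by the best observed ratio, so efficient units score exactly 1.
inputs = np.array([2.0, 4.0, 3.0, 6.0, 5.0])    # e.g. fuel consumed
outputs = np.array([4.0, 6.0, 3.0, 6.0, 8.0])   # e.g. electricity generated

ratio = outputs / inputs
efficiency = ratio / ratio.max()                # 1.0 marks the frontier
```

Unit 1 here defines the frontier (ratio 2.0), while unit 3 scores 0.5, meaning it would need to halve its input (or double its output) to become efficient.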

Relevance:

100.00%

Publisher:

Abstract:

Glucagon-like peptide-1 (GLP-1) receptor agonists improve islet function and delay gastric emptying in patients with type 2 diabetes mellitus (T2DM). This meta-analysis aimed to investigate the effects of the once-daily prandial GLP-1 receptor agonist lixisenatide on postprandial plasma glucose (PPG), glucagon and insulin levels. Methods: Six randomized, placebo-controlled studies of lixisenatide 20 μg once daily were included in this analysis: lixisenatide as monotherapy (GetGoal-Mono), as add-on to oral antidiabetic drugs (OADs; GetGoal-M, GetGoal-S) or in combination with basal insulin (GetGoal-L, GetGoal-Duo-1 and GetGoal-L-Asia). Change in 2-h PPG and glucose excursion were evaluated across all six studies; change in 2-h glucagon and postprandial insulin were evaluated across two studies. A meta-analysis was performed on least squares (LS) mean estimates obtained from analysis of covariance (ANCOVA)-based linear regression. Results: Lixisenatide significantly reduced 2-h PPG from baseline (LS mean difference vs. placebo: -4.9 mmol/l, p<0.001) and glucose excursion (LS mean difference vs. placebo: -4.5 mmol/l, p<0.001). As measured in two studies, lixisenatide also reduced postprandial glucagon (LS mean difference vs. placebo: -19.0 ng/l, p<0.001) and insulin (LS mean difference vs. placebo: -64.8 pmol/l, p<0.001). There was a stronger correlation between 2-h postprandial glucagon and 2-h PPG with lixisenatide than with placebo. Conclusions: Lixisenatide significantly reduced 2-h PPG and glucose excursion together with a marked reduction in postprandial glucagon and insulin; thus, lixisenatide appears to have biological effects on blood glucose that are independent of increased insulin secretion. These effects may be attributed, in part, to reduced glucagon secretion. © 2014 John Wiley and Sons Ltd.
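The pooling of per-study LS mean differences can be sketched with a fixed-effect inverse-variance meta-analysis; the per-study numbers below are invented for illustration and are not the GetGoal results.

```python
import math

# Hypothetical per-study LS mean differences (vs. placebo) and standard errors
# for 2-h PPG in mmol/l; made-up values, not the published study estimates.
diffs = [-5.2, -4.6, -4.8, -5.1, -4.5, -5.0]
ses = [0.45, 0.50, 0.40, 0.55, 0.60, 0.50]

# Fixed-effect (inverse-variance) pooling: weight each study by 1/SE^2.
weights = [1.0 / se ** 2 for se in ses]
pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
```

A random-effects model (e.g. DerSimonian-Laird) would additionally inflate the weights' denominators by a between-study variance term.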

Relevance:

100.00%

Publisher:

Abstract:

Based on dynamic renormalization group techniques, this letter analyzes the effects of external stochastic perturbations on the dynamical properties of cholesteric liquid crystals studied in the presence of a random magnetic field. Our analysis quantifies the nature of the temperature dependence of the dynamics; the results also highlight a hitherto unexplored regime in cholesteric liquid crystal dynamics. We show that stochastic fluctuations drive the system to a second-order Kosterlitz-Thouless phase transition point, eventually leading to a Kardar-Parisi-Zhang (KPZ) universality class. The results go beyond quasi-first-order mean-field theories and provide the first theoretical understanding of a KPZ phase in distorted nematic liquid crystal dynamics.

Relevance:

100.00%

Publisher:

Abstract:

We investigated controls on the water chemistry of a South Ecuadorian cloud forest catchment which is partly pristine and partly converted to extensive pasture. From April 2007 to May 2008, water samples were taken weekly to biweekly at nine different subcatchments and screened for differences in electrical conductivity, pH, anion and element composition. A principal component analysis was conducted to reduce the dimensionality of the data set and to define the major factors explaining variation in the data. Three main factors were isolated by a subset of 10 elements (Ca2+, Ce, Gd, K+, Mg2+, Na+, Nd, Rb, Sr, Y), explaining around 90% of the data variation. Land use was the major factor controlling and changing the water chemistry of the subcatchments. A second factor was associated with the concentration of rare earth elements in water, presumably highlighting other anthropogenic influences such as gravel excavation or road construction. Around 12% of the variation was explained by the third component, which was defined by the occurrence of Rb and K and represents the influence of vegetation dynamics on element accumulation and wash-out. Comparison of base-flow and fast-flow concentrations suggested that a significant portion of soil water from around 30 cm depth contributes to storm flow, as revealed by increased rare earth element concentrations in fast-flow samples. Our findings demonstrate the utility of multi-tracer principal component analysis for studying tropical headwater streams, and emphasize the need for effective land management in cloud forest catchments.
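The variance-decomposition step of such a principal component analysis can be sketched as follows, with a made-up standardized concentration matrix standing in for the real element data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in for the water-chemistry table: samples x element
# concentrations driven by three latent factors (loosely mimicking the
# land-use, rare-earth and vegetation components described above).
scores = rng.normal(size=(120, 3))
load = rng.normal(size=(3, 10))
data = scores @ load + 0.3 * rng.normal(size=(120, 10))

z = (data - data.mean(0)) / data.std(0)          # standardize each element
eigvals = np.linalg.eigvalsh(np.cov(z, rowvar=False))[::-1]  # descending
explained = np.cumsum(eigvals) / eigvals.sum()   # cumulative variance fraction
n_keep = int(np.searchsorted(explained, 0.90)) + 1  # components for ~90%
```

Inspecting the eigenvectors (loadings) of the retained components is what links each component to an interpretable driver such as land use.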

Relevance:

100.00%

Publisher:

Abstract:

This study examines how well two geomagnetic index series, and series synthesized from a semi-empirical model of magnetospheric currents, explain the geomagnetic activity observed at Northern Hemisphere mid-latitude ground-based stations. We analyse data for the 2007 to 2014 period from four magnetic observatories (Coimbra, Portugal; Panagyurishte, Bulgaria; Novosibirsk, Russia; and Boulder, USA), at geomagnetic latitudes between 40° and 50° N. The quiet daily (QD) variation is first removed from the time series of the geomagnetic horizontal component (H) using natural orthogonal components (NOC) tools. We compare the resulting series with series of the storm-time disturbance (Dst) and ring current (RC) indices, and with H series synthesized from the Tsyganenko and Sitnov (2005, doi:10.1029/2004JA010798) (TS05) semi-empirical model of the storm-time geomagnetic field. In the analysis, we separate days with low and high local K-index values. Our results show that NOC models are as efficient as standard models of QD variation in preparing raw data to be compared with proxies, but with much less complexity. For the two stations in Europe, we obtain an indication that NOC models could be able to separate ionospheric and magnetospheric contributions. Dst and RC series explain the four observatory H-series successfully, with mean significant correlation coefficients of 0.5 to 0.6 during low geomagnetic activity (K less than 4) and 0.6 to 0.7 on geomagnetically active days (K greater than or equal to 4).
With regard to the performance of TS05, our results show that the four observatories separate into two groups: Coimbra and Panagyurishte, in one group, for which the magnetospheric/ionospheric ratio in the QD variation is smaller, a dominantly ionospheric QD contribution can be removed, and TS05 simulations are the best proxy; and Boulder and Novosibirsk, in the other group, for which the ionospheric and magnetospheric contributions to the QD variation cannot be differentiated and correlations with TS05 series cannot be improved. The main contributors to the magnetospheric QD signal are the Birkeland currents. The relatively good success of the TS05 model in explaining ground-based irregular geomagnetic activity at mid-latitudes makes it an effective tool for classifying storms according to their main sources. For Coimbra and Panagyurishte in particular, where the ionospheric and magnetospheric daily contributions seem easier to separate, we can aspire to use the TS05 model for ensemble generation in space weather (SW) forecasting and the interpretation of past SW events.
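Removing the QD variation as the leading natural orthogonal component can be sketched as follows; the day-by-minute matrix and amplitudes are synthetic, and real NOC processing involves considerably more care (gap handling, several components, day selection).

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical minute-resolution H series arranged as a days x minutes matrix.
# The quiet daily (QD) variation is modelled as the first natural orthogonal
# component (empirical orthogonal function) of this matrix.
n_days, n_min = 60, 1440
t = np.linspace(0.0, 2.0 * np.pi, n_min)
qd_shape = 10.0 * np.sin(t)                        # idealized daily cycle
H = qd_shape * rng.uniform(0.8, 1.2, (n_days, 1)) + rng.normal(0.0, 1.0, (n_days, n_min))

U, s, Vt = np.linalg.svd(H - H.mean(), full_matrices=False)
qd_model = np.outer(U[:, 0] * s[0], Vt[0])         # first NOC: day-varying QD
disturbance = H - H.mean() - qd_model              # residual to compare with Dst/RC
```

The residual `disturbance` series is what would then be correlated against the Dst, RC, or TS05-synthesized series.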

Relevance:

100.00%

Publisher:

Abstract:

In this paper we consider a class of scalar integral equations with a form of space-dependent delay. These non-local models arise naturally when modelling neural tissue with active axons and passive dendrites. Such systems are known to support a dynamic (oscillatory) Turing instability of the homogeneous steady state. In this paper we develop a weakly nonlinear analysis of the travelling and standing waves that form beyond the point of instability. The appropriate amplitude equations are found to be the coupled mean-field Ginzburg-Landau equations describing a Turing-Hopf bifurcation with modulation group velocity of O(1). Importantly we are able to obtain the coefficients of terms in the amplitude equations in terms of integral transforms of the spatio-temporal kernels defining the neural field equation of interest. Indeed our results cover not only models with axonal or dendritic delays but those which are described by a more general distribution of delayed spatio-temporal interactions. We illustrate the predictive power of this form of analysis with comparison against direct numerical simulations, paying particular attention to the competition between standing and travelling waves and the onset of Benjamin-Feir instabilities.
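Under the setting described above, and in our own generic notation (not the paper's), amplitude equations for the left- and right-travelling wave envelopes A and B near such a Turing-Hopf bifurcation take a coupled mean-field Ginzburg-Landau form, in which the O(1) group velocity replaces the cross-coupling term by a spatial average:

```latex
\partial_T A = \mu A + d\,\partial_\xi^2 A + a\,|A|^2 A + b\,\langle |B|^2 \rangle A,
\qquad
\partial_T B = \mu B + d\,\partial_\eta^2 B + a\,|B|^2 B + b\,\langle |A|^2 \rangle B,
```

where ξ and η are the coordinates in the two travelling frames, ⟨·⟩ denotes the spatial average, and the coefficients μ, d, a, b are the quantities the paper obtains from integral transforms of the spatio-temporal kernels. The sign of a + b relative to a is what decides the competition between travelling waves and standing waves.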

Relevance:

100.00%

Publisher:

Abstract:

High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and produce a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. the two-hybrid system); (ii) the Search module, which enables the user to search for processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and to add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather newly identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, GO biological processes and KEGG pathway enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We developed IIS by integrating diverse databases, in response to the need for appropriate tools for the systematic analysis of physical, genetic and chemical-genetic interactions.
IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.

Relevance:

100.00%

Publisher:

Abstract:

Networks of Kuramoto oscillators with a positive correlation between the oscillators' frequencies and the degrees of their corresponding vertices exhibit so-called explosive synchronization behavior, which is now under intensive investigation. Here we study and discuss explosive synchronization in a situation that has not yet been considered, namely when only a part, typically a small part, of the vertices is subjected to a degree-frequency correlation. Our results show that, in order to have explosive synchronization, it suffices to have degree-frequency correlations only for the hubs, the vertices with the highest degrees. Moreover, we show that a partial degree-frequency correlation not only promotes but also enables explosive synchronization in networks for which a full degree-frequency correlation would not allow it. We perform a mean-field analysis, and our conclusions are corroborated by exhaustive numerical experiments on synthetic networks and on the undirected and unweighted version of a typical benchmark biological network, namely the neural network of the worm Caenorhabditis elegans. The latter is an explicit example where partial degree-frequency correlation leads to explosive synchronization with hysteresis, in contrast with the fully correlated case, for which no explosive synchronization is observed.
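A hub-only degree-frequency correlation can be illustrated with a toy Kuramoto simulation; the star-like network, the scaling of the hub frequency, and the degree-normalized coupling are all simplifications chosen to keep the sketch short and numerically stable, so this does not reproduce the paper's hysteresis experiments.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy network: a star-like graph in which only the hub (vertex 0) receives
# the degree-frequency correlation, scaled to stay in a synchronizable range.
n = 50
A = np.zeros((n, n))
A[0, 1:] = A[1:, 0] = 1.0                   # hub linked to all other vertices
for i, j in rng.integers(1, n, (40, 2)):    # a few extra peripheral edges
    if i != j:
        A[i, j] = A[j, i] = 1.0

deg = A.sum(1)
omega = rng.normal(0.0, 1.0, n)             # natural frequencies
omega[0] = 0.02 * deg[0]                    # correlation on the hub only

# Degree-normalized Kuramoto model, integrated with explicit Euler:
#   dtheta_i/dt = omega_i + (K / deg_i) * sum_j A_ij sin(theta_j - theta_i)
def simulate(K, steps=3000, dt=0.01):
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(steps):
        coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(1) / deg
        theta = theta + dt * (omega + K * coupling)
    return abs(np.exp(1j * theta).mean())   # Kuramoto order parameter r

r_weak, r_strong = simulate(K=0.0), simulate(K=3.0)
```

Sweeping K up and then back down, and plotting r against K, is how the abrupt (explosive) transition and its hysteresis loop would be detected.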

Relevance:

100.00%

Publisher:

Abstract:

The establishment of the most stable structures of eight-membered rings is a challenging task for the field of conformational analysis. In this work, a series of 2-halocyclooctanones were synthesized (including the fluorine, chlorine, bromine and iodine derivatives) and submitted to conformational studies using a combination of theoretical calculations and infrared spectroscopy. For each compound, four conformations were identified as the most important ones. These conformations are derived from the chair-boat conformation of cyclooctanone. The pseudo-equatorial (with respect to the halogen) conformer is preferred in vacuum and in low-polarity solvents for the chlorine, bromine and iodine derivatives. For 2-fluorocyclooctanone, the preferred conformation in vacuum is pseudo-axial. In acetonitrile, the pseudo-axial conformer becomes the most stable for the chlorine derivative. According to NBO calculations, the conformational preference is not dictated by electron delocalization, but by classical electrostatic repulsions.

Relevance:

100.00%

Publisher:

Abstract:

Conventional near-infrared reflectance spectroscopy (NIRS) and hyperspectral imaging (HI) in the near-infrared region (1000-2500 nm) are evaluated and compared using, as the case study, the determination of relevant properties related to the quality of natural rubber. Mooney viscosity (MV) and plasticity indices (PI) (PI0 - original plasticity; PI30 - plasticity after accelerated aging; and PRI - the plasticity retention index after accelerated aging) of rubber were determined using multivariate regression models. Two hundred and eighty-six samples of rubber were measured using conventional and hyperspectral near-infrared imaging reflectance instruments in the range of 1000-2500 nm. The sample set was split into regression (n = 191) and external validation (n = 95) sub-sets. Three instruments were employed for data acquisition: a line-scanning hyperspectral camera and two conventional FT-NIR spectrometers. Sample heterogeneity was evaluated using hyperspectral images obtained with a resolution of 150 × 150 μm and principal component analysis. The probed sample area (5 cm²; 24,000 pixels) needed to achieve representativeness was found to be equivalent to the average of 6 spectra for a 1 cm diameter circular probing window of one FT-NIR instrument. The other spectrophotometer can probe the whole sample in only one measurement. The results show that the rubber properties can be determined with very similar accuracy and precision by Partial Least Squares (PLS) regression models regardless of whether HI-NIR or conventional FT-NIR produces the spectral datasets. The best Root Mean Square Errors of Prediction (RMSEPs) of external validation for MV, PI0, PI30, and PRI were 4.3, 1.8, 3.4, and 5.3%, respectively.
Though the quantitative results provided by the three instruments can be considered equivalent, the hyperspectral imaging instrument presents a number of advantages, being about 6 times faster than conventional bulk spectrometers, producing robust spectral data by ensuring sample representativeness, and minimizing the effect of the presence of contaminants.
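The PLS regression step common to both instruments can be sketched with a minimal PLS1 (NIPALS) implementation on synthetic spectra; the data dimensions, latent structure and noise levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical NIR-like data: spectra (samples x wavelengths) generated from a
# few latent spectral directions, plus a property such as Mooney viscosity.
n, p, n_lv = 120, 80, 4
T_true = rng.normal(size=(n, 3))
X = T_true @ rng.normal(size=(3, p)) + 0.05 * rng.normal(size=(n, p))
y = T_true @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=n)

def pls1_fit(X, y, n_lv):
    """Minimal PLS1 (NIPALS): returns regression vector and centering terms."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_lv):
        w = Xc.T @ yc
        w = w / np.linalg.norm(w)              # weight: covariance direction
        t = Xc @ w                             # scores
        p_ = Xc.T @ t / (t @ t)                # X loading
        q = yc @ t / (t @ t)                   # y loading
        Xc = Xc - np.outer(t, p_)              # deflate X
        yc = yc - q * t                        # deflate y
        W.append(w); P.append(p_); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)        # regression vector
    return B, X.mean(0), y.mean()

# Split into calibration and external validation sets, as in the study.
B, xm, ym = pls1_fit(X[:80], y[:80], n_lv)
pred = (X[80:] - xm) @ B + ym
rmsep = np.sqrt(np.mean((pred - y[80:]) ** 2))  # external-validation RMSEP
```

Choosing the number of latent variables by cross-validation on the calibration set, then reporting RMSEP on the untouched validation set, mirrors the protocol described in the abstract.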