79 results for multi-factor models
Abstract:
This paper examines the ability of the doubly fed induction generator (DFIG) to deliver multiple reactive power objectives during variable wind conditions. The reactive power requirement is decomposed based on various control objectives (e.g. power factor control, voltage control, loss minimisation, and flicker mitigation) defined around different time frames (i.e. seconds, minutes, and hourly), and the control reference is generated by aggregating the individual reactive power requirement for each control strategy. A novel coordinated controller is implemented for the rotor-side converter and the grid-side converter considering their capability curves and illustrating that it can effectively utilise the aggregated DFIG reactive power capability for system performance enhancement. The performance of the multi-objective strategy is examined for a range of wind and network conditions, and it is shown that for the majority of the scenarios, more than 92% of the main control objective can be achieved while introducing the integrated flicker control scheme with the main reactive power control scheme. Therefore, optimal control coordination across the different control strategies can maximise the availability of ancillary services from DFIG-based wind farms without additional dynamic reactive power devices being installed in power networks.
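The aggregation step described above (summing per-objective reactive power requirements and bounding the result by converter capability) can be sketched as follows. This is a minimal illustration, not the paper's controller: the function name, the per-objective values, and the simple symmetric clipping rule are all assumptions for the example.

```python
def aggregate_q_reference(q_requests, q_capability):
    """Sum per-objective reactive power requests (e.g. voltage control,
    loss minimisation, flicker mitigation), then clip the aggregate to a
    symmetric converter capability limit (values in per unit, illustrative)."""
    q_total = sum(q_requests.values())
    return max(-q_capability, min(q_capability, q_total))

# Illustrative requests from three control objectives, in per unit:
q_ref = aggregate_q_reference(
    {"voltage_control": 0.30, "loss_minimisation": -0.05, "flicker": 0.10},
    q_capability=0.25)
# The raw aggregate (0.35 pu) exceeds capability, so the reference saturates.
```

In the paper the limit itself varies with operating point (the capability curves of the rotor-side and grid-side converters), so a fixed `q_capability` is a simplification.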
Abstract:
Aim: To determine if serum pigment epithelium-derived factor (PEDF) levels in Type 2 diabetes are related to vascular risk factors and renal function. Methods: PEDF was quantified by ELISA in a cross-sectional study of 857 male Veterans Affairs Diabetes Trial (VADT) subjects, and associations with cardiovascular risk factors and renal function were determined. In a subset (n = 246) in whom serum was obtained early in the VADT (2.0 ± 0.3 years post-randomization), PEDF was related to longitudinal changes in renal function over 3.1 years. Results: Cross-sectional study: In multivariate regression models, PEDF was positively associated with serum triglycerides, waist-to-hip ratio, serum creatinine, use of ACE inhibitors or angiotensin receptor blockers, and use of lipid-lowering agents; it was negatively associated with HDL-C (all p < 0.05). Longitudinal study: PEDF was not associated with changes in renal function over 3.1 years (p > 0.09). Conclusions: Serum PEDF in Type 2 diabetic men was cross-sectionally associated with dyslipidemia, body habitus, use of common drugs for blood pressure and dyslipidemia, and indices of renal function; however, PEDF was not associated with renal decline over 3.1 years.
Abstract:
This paper investigates the construction of linear-in-the-parameters (LITP) models for multi-output regression problems. Most existing stepwise forward algorithms choose the regressor terms one by one, each time maximizing the model error reduction ratio. The drawback is that such procedures cannot guarantee a sparse model, especially under highly noisy learning conditions. The main objective of this paper is to improve the sparsity and generalization capability of a model for multi-output regression problems, while reducing the computational complexity. This is achieved by proposing a novel multi-output two-stage locally regularized model construction (MTLRMC) method using the extreme learning machine (ELM). In this new algorithm, the nonlinear parameters in each term, such as the width of the Gaussian function and the power of a polynomial term, are firstly determined by the ELM. An initial multi-output LITP model is then generated according to the termination criteria in the first stage. The significance of each selected regressor is checked and the insignificant ones are replaced at the second stage. The proposed method can produce an optimized compact model by using the regularized parameters. Further, to reduce the computational complexity, a proper regression context is used to allow fast implementation of the proposed method. Simulation results confirm the effectiveness of the proposed technique. © 2013 Elsevier B.V.
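As an illustration of the first stage described above — nonlinear parameters drawn at random in the ELM spirit, followed by greedy forward selection by error reduction — here is a minimal single-output sketch. The Gaussian candidate form, the toy data, and the selection count are invented for the example; the paper's second (replacement) stage, local regularisation, and multi-output handling are omitted.

```python
import math
import random

random.seed(0)

def elm_candidates(n_terms, xs):
    """ELM-style regressors: the nonlinear parameters (centre, width of a
    Gaussian) are assigned randomly rather than tuned."""
    cands = []
    for _ in range(n_terms):
        c = random.uniform(0.0, 1.0)   # centre
        w = random.uniform(0.1, 0.5)   # width
        cands.append([math.exp(-((x - c) / w) ** 2) for x in xs])
    return cands

def forward_select(y, cands, n_select):
    """Greedy stage 1: repeatedly add the candidate whose least-squares fit
    to the current residual removes the most error."""
    residual = list(y)
    chosen = []
    for _ in range(n_select):
        best, best_a, best_sse = None, 0.0, float("inf")
        for i, phi in enumerate(cands):
            if i in chosen:
                continue
            denom = sum(p * p for p in phi) or 1e-12
            a = sum(p * r for p, r in zip(phi, residual)) / denom
            sse = sum((r - a * p) ** 2 for r, p in zip(residual, phi))
            if sse < best_sse:
                best, best_a, best_sse = i, a, sse
        chosen.append(best)
        residual = [r - best_a * p for r, p in zip(residual, cands[best])]
    return chosen, residual

xs = [i / 19 for i in range(20)]
y = [math.exp(-((x - 0.5) / 0.2) ** 2) for x in xs]  # toy target
cands = elm_candidates(30, xs)
chosen, residual = forward_select(y, cands, 3)
```

As the abstract notes, a purely greedy procedure like this cannot guarantee sparsity under noise, which motivates the second-stage significance check and replacement.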
Abstract:
Rationale: Increasing epithelial repair and regeneration may hasten resolution of lung injury in patients with the Acute Respiratory Distress Syndrome (ARDS). In animal models of ARDS, Keratinocyte Growth Factor (KGF) reduces injury and increases epithelial proliferation and repair. The effect of KGF in the human alveolus is unknown.
Objectives: To test whether KGF can attenuate alveolar injury in a human model of ARDS.
Methods: Volunteers were randomized to intravenous KGF (60 μg/kg) or placebo for 3 days, before inhaling 50 μg of lipopolysaccharide. Six hours later, subjects underwent bronchoalveolar lavage (BAL) to quantify markers of alveolar inflammation and cell-specific injury.
Measurements and Main Results: KGF did not alter leukocyte infiltration or markers of permeability in response to LPS. KGF increased BAL concentrations of Surfactant Protein D (SP-D), MMP-9, IL-1Ra, GM-CSF and CRP. In vitro, BAL fluid from KGF-treated subjects (KGF BAL) inhibited pulmonary fibroblast proliferation, but increased alveolar epithelial proliferation. Active MMP-9 increased alveolar epithelial wound repair. Finally, BAL from the KGF pre-treated group enhanced macrophage phagocytic uptake of apoptotic epithelial cells and bacteria compared with BAL from the placebo-treated group. This effect was blocked by inhibiting activation of the GM-CSF receptor.
Conclusions: KGF treatment increases BAL SP-D, a marker of type II alveolar epithelial cell proliferation in a human model of ALI. Additionally KGF increases alveolar concentrations of the anti-inflammatory cytokine IL-1Ra, and mediators that drive epithelial repair (MMP-9) and enhance macrophage clearance of dead cells and bacteria (GM-CSF).
Abstract:
Accurate conceptual models of groundwater systems are essential for correct interpretation of monitoring data in catchment studies. In surface-water dominated hard rock regions, modern ground and surface water monitoring programmes often have very high resolution chemical, meteorological and hydrological observations but lack an equivalent emphasis on the subsurface environment, the properties of which exert a strong control on flow pathways and interactions with surface waters. The reasons for this disparity are the complexity of the system and the difficulty in accurately characterising the subsurface, except locally at outcrops or in boreholes. This is particularly the case in maritime north-western Europe, where a legacy of glacial activity, combined with large areas underlain by heterogeneous igneous and metamorphic bedrock, make the structure and weathering of bedrock difficult to map or model. Traditional approaches which seek to extrapolate information from borehole to field-scale are of limited application in these environments due to the high degree of spatial heterogeneity. Here we apply an integrative and multi-scale approach, optimising and combining standard geophysical techniques to generate a three-dimensional geological conceptual model of the subsurface in a catchment in NE Ireland. Available airborne LiDAR, electromagnetic and magnetic data sets were analysed for the region. At field-scale surface geophysical methods, including electrical resistivity tomography, seismic refraction, ground penetrating radar and magnetic surveys, were used and combined with field mapping of outcrops and borehole testing. The study demonstrates how combined interpretation of multiple methods at a range of scales produces robust three-dimensional conceptual models and a stronger basis for interpreting groundwater and surface water monitoring data.
Abstract:
This paper introduces hybrid address spaces as a fundamental design methodology for implementing scalable runtime systems on many-core architectures without hardware support for cache coherence. We use hybrid address spaces for an implementation of MapReduce, a programming model for large-scale data processing, and for an implementation of a remote memory access (RMA) model. Both implementations are available on the Intel SCC and are portable to similar architectures. We present the design and implementation of HyMR, a MapReduce runtime system whereby different stages and the synchronization operations between them alternate between a distributed memory address space and a shared memory address space, to improve performance and scalability. We compare HyMR to a reference implementation and find that HyMR improves performance by a factor of 1.71× over a set of representative MapReduce benchmarks. We also compare HyMR with Phoenix++, a state-of-the-art implementation for systems with hardware-managed cache coherence, in terms of scalability and sustained-to-peak data processing bandwidth, where HyMR demonstrates improvements by factors of 3.1× and 3.2×, respectively. We further evaluate our hybrid remote memory access (HyRMA) programming model and assess its performance to be superior to that of message passing.
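HyMR itself targets the Intel SCC and is implemented at the runtime level; purely as a language-level illustration of the MapReduce programming model it implements, a minimal word count (the two-phase structure below is generic, not HyMR's code):

```python
from collections import defaultdict

def map_phase(chunks):
    # map: emit (word, 1) pairs for each input chunk
    for chunk in chunks:
        for word in chunk.split():
            yield word, 1

def reduce_phase(pairs):
    # shuffle + reduce: group pairs by key and sum the counts
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

counts = reduce_phase(map_phase(["a b a", "b c"]))
# counts == {"a": 2, "b": 2, "c": 1}
```

In HyMR the interesting part is precisely what this sketch hides: whether the intermediate pairs live in a distributed or a shared memory address space at each stage.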
Abstract:
BACKGROUND: Experimental autoimmune encephalomyelitis (EAE) is an animal model of autoimmune inflammatory demyelination that is mediated by Th1 and Th17 cells. The transcription factor interferon regulatory factor 3 (IRF3) is activated by pathogen recognition receptors and induces interferon-beta production.
METHODS: To determine the role of IRF3 in autoimmune inflammation, we immunised wild-type (WT) and irf3-/- mice to induce EAE. Splenocytes from WT and irf3-/- mice were also activated in vitro in Th17-polarising conditions.
RESULTS: Clinical signs of disease were significantly lower in mice lacking IRF3, with reduced Th1 and Th17 cells in the central nervous system. Peripheral T-cell responses were also diminished, including impaired proliferation and Th17 development in irf3-/- mice. Myelin-reactive CD4+ cells lacking IRF3 completely failed to transfer EAE in Th17-polarised models as did WT cells transferred into irf3-/- recipients. Furthermore, IRF3 deficiency in non-CD4+ cells conferred impairment of Th17 development in antigen-activated cultures.
CONCLUSION: These data show that IRF3 plays a crucial role in development of Th17 responses and EAE and warrants investigation in human multiple sclerosis.
Abstract:
Traditionally, the optimization of a turbomachinery engine casing for tip clearance has involved either two-dimensional transient thermomechanical simulations or three-dimensional mechanical simulations. This paper illustrates that three-dimensional transient whole-engine thermomechanical simulations can be used within tip clearance optimizations and that the efficiency of such optimizations can be improved when a multifidelity surrogate modeling approach is employed. These simulations are employed in conjunction with a rotor suboptimization using surrogate models of rotor-dynamics performance, stress, mass and transient displacements, and an engine parameterization.
Abstract:
In this paper, a multi-level wordline driver scheme is presented to improve 6T-SRAM read and write stability. The proposed wordline driver generates a shaped pulse during the read mode and a boosted wordline during the write mode. During read, the shaped pulse is tuned at nominal voltage for a short period of time, whereas for the remaining access time, the wordline voltage is reduced to save the power consumption of the cell. This shaped wordline pulse results in improved read noise margin without any degradation in access time for small wordline load. The improvement is explained by examining the dynamic and nonlinear behavior of the SRAM cell. Furthermore, during the hold mode, for a short time (depending on the size of the boosting capacitance), the wordline voltage becomes negative and charges up to zero after a specific time, which results in a lower leakage current compared to conventional SRAM. The proposed technique results in at least 2× improvement in read noise margin while improving write margin by 3× for supply voltages below 0.7 V. The leakage power for the proposed SRAM is reduced by 2%, while the total power is improved by 3% in the worst-case scenario for an SRAM array. The main advantage of the proposed wordline driver is the improvement of dynamic noise margin with less than 2.5% penalty in area. TSMC 65 nm technology models are used for simulations.
Abstract:
Semiconductor fabrication involves several sequential processing steps, with the result that critical production variables are often affected by a superposition of effects over multiple steps. In this paper a Virtual Metrology (VM) system for early stage measurement of such variables is presented; the VM system seeks to express the contribution to the output variability that is due to a defined observable part of the production line. The outputs of the proposed system may be used for process monitoring and control purposes. A second contribution of this work is the introduction of Elastic Nets, a regularization and variable selection technique for the modelling of highly correlated datasets, as a technique for the development of VM models. Elastic Nets and the proposed VM system are illustrated using real data from a multi-stage etch process used in the fabrication of disk drive read/write heads. © 2013 IEEE.
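The elastic net combines an L1 penalty (which drives coefficients of weak predictors to zero, giving variable selection) with an L2 penalty (which stabilises estimates when predictors are highly correlated, as in multi-stage process data). A minimal coordinate-descent sketch follows; the data, hyperparameters, and implementation are invented for illustration and are not the paper's VM code.

```python
def elastic_net(X, y, alpha, l1_ratio, n_iter=200):
    """Coordinate-descent elastic net (minimal sketch). Minimises
    (1/2n)*||y - Xw||^2 + alpha*(l1_ratio*||w||_1 + 0.5*(1-l1_ratio)*||w||^2)."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # correlation of feature j with the partial residual (excluding j)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * w[k]
                      for k in range(p) if k != j)) for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n + alpha * (1 - l1_ratio)
            t = alpha * l1_ratio
            # soft-thresholding: small correlations are zeroed out (L1 part)
            w[j] = (max(abs(rho) - t, 0.0) * (1 if rho > 0 else -1)) / z
    return w

# Toy example: y = 2*x1 + 0.5*x2 with two illustrative features.
X = [[1.0, 0.0], [2.0, 1.0], [3.0, 0.0], [4.0, 1.0]]
y = [2.0, 4.5, 6.0, 8.5]
w = elastic_net(X, y, alpha=0.01, l1_ratio=0.5)
```

With the small penalty used here the fit is close to ordinary least squares; increasing `alpha` trades fit for sparsity and shrinkage.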
Abstract:
Thermal comfort is defined as “that condition of mind which expresses satisfaction with the thermal environment” [1] [2]. Field studies have been completed in order to establish the governing conditions for thermal comfort [3]. These studies showed that the internal climate of a room was the strongest factor in establishing thermal comfort. Direct manipulation of the internal climate is necessary to retain an acceptable level of thermal comfort. In order for Building Energy Management System (BEMS) strategies to be efficiently utilised, it is necessary to have the ability to predict the effect that activating a heating/cooling source (radiators, windows and doors) will have on the room. The numerical modelling of the domain can be challenging due to the necessity to capture temperature stratification and/or different heat sources (radiators, computers and human beings). Computational Fluid Dynamics (CFD) models are usually utilised for this function because they provide the level of detail required. Although they provide the necessary level of accuracy, these models tend to be highly computationally expensive, especially when transient behaviour needs to be analysed. Consequently, they cannot be integrated into BEMS. This paper presents and describes validation of a CFD-ROM method for real-time simulations of building thermal performance. The CFD-ROM method involves the automatic extraction and solution of reduced order models (ROMs) from validated CFD simulations. The test case used in this work is a room of the Environmental Research Institute (ERI) Building at University College Cork (UCC). The ROMs have been shown to be sufficiently accurate, with a total error of less than 1%, and successfully retain a satisfactory representation of the phenomena modelled. The number of zones in a ROM defines the size and complexity of that ROM. It has been observed that ROMs with a higher number of zones produce more accurate results. As each ROM has a time to solution of less than 20 seconds, they can be integrated into the BEMS of a building, which opens the potential for real-time physics-based building energy modelling.
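The abstract does not give the ROM equations. As a generic illustration of why a zone-based reduced model solves in seconds where CFD cannot, consider a lumped multi-zone thermal network integrated explicitly; every parameter below (conductances, capacities, source) is invented for the example and is not from the ERI test case.

```python
def step_zones(T, couplings, sources, capacity, dt):
    """One explicit-Euler step of a lumped multi-zone thermal model.
    couplings[i][j] is the thermal conductance (W/K) between zones i and j;
    capacity[i] is the heat capacity (J/K) of zone i."""
    n = len(T)
    T_new = list(T)
    for i in range(n):
        flux = sources[i] + sum(couplings[i][j] * (T[j] - T[i])
                                for j in range(n))
        T_new[i] = T[i] + dt * flux / capacity[i]
    return T_new

# Two zones, one heated at 100 W: the heated zone warms first,
# and heat conducts to its neighbour through the coupling.
T = [20.0, 20.0]
K = [[0.0, 5.0], [5.0, 0.0]]
for _ in range(100):
    T = step_zones(T, K, sources=[100.0, 0.0],
                   capacity=[1e4, 1e4], dt=10.0)
```

A handful of coupled ordinary differential equations like these integrates orders of magnitude faster than a full CFD transient, which is what makes BEMS integration feasible.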
Abstract:
Climate change during the last five decades has impacted significantly on natural ecosystems, and the rate of current climate change is of great concern among conservation biologists. Species Distribution Models (SDMs) have been used widely to project changes in species’ bioclimatic envelopes under future climate scenarios. Here, we aimed to advance this technique by assessing future changes in the bioclimatic envelopes of an entire mammalian order, the Lagomorpha, using a novel framework for model validation based jointly on subjective expert evaluation and objective model evaluation statistics. SDMs were built using climatic, topographical and habitat variables for all 87 lagomorph species under past and current climate scenarios. Expert evaluation and Kappa values were used to validate past and current models, and only those deemed ‘modellable’ within our framework were projected under future climate scenarios (58 species). Phylogenetically-controlled regressions were used to test whether species traits correlated with predicted responses to climate change. Climate change is likely to impact more than two-thirds of lagomorph species, with leporids (rabbits, hares and jackrabbits) likely to undertake poleward shifts with little overall change in range extent, whilst pikas are likely to show extreme shifts to higher altitudes associated with marked range declines, including the likely extinction of Kozlov’s Pika (Ochotona koslowi). Smaller-bodied species were more likely to exhibit range contractions and elevational increases while showing little poleward movement, and fecund species were more likely to shift latitudinally and elevationally. Our results suggest that species traits may be important indicators of vulnerability to future climate change, and we believe multi-species approaches, as demonstrated here, are likely to lead to more effective mitigation measures and conservation management. We strongly advocate studies minimising data gaps in our knowledge of the Order, specifically collecting more specimens for biodiversity archives and targeting data-deficient geographic regions.
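The Kappa statistic used above for model validation is computed from a 2×2 confusion matrix of predicted vs. observed presence/absence: observed agreement corrected for the agreement expected by chance. A minimal sketch (generic Cohen's kappa, not the authors' validation pipeline):

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa from a 2x2 presence/absence confusion matrix.
    tp/tn: correct presence/absence predictions; fp/fn: the two error types."""
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n                       # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)   # chance agreement on presence
    p_no = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement on absence
    p_exp = p_yes + p_no
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1 for perfect agreement and 0 for chance-level agreement, which is why it is paired here with expert evaluation to decide which species are ‘modellable’.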
Abstract:
This paper highlights the crucial role played by party-specific responsibility attributions in performance-based voting. Three models of electoral accountability, which make distinct assumptions regarding citizens' ability to attribute responsibility to distinct governing parties, are tested in the challenging Northern Ireland context - an exemplar case of multi-level multi-party government in which expectations of performance based voting are low. The paper demonstrates the operation of party-attribution based electoral accountability, using data from the 2011 Northern Ireland Assembly Election Study. However, the findings are asymmetric: accountability operates in the Protestant/unionist bloc but not in the Catholic/nationalist bloc. This asymmetry may be explained by the absence of clear ethno-national ideological distinctions between the unionist parties (hence providing political space for performance based accountability to operate) but the continued relevance in the nationalist bloc of ethno-national difference (which limits the scope for performance politics). The implications of the findings for our understanding of the role of party-specific responsibility attribution in performance based models of voting, and for our evaluation of the quality of democracy in post-conflict consociational polities, are discussed.
Abstract:
Sonoluminescence (SL) involves the conversion of mechanical [ultra]sound energy into light. Whilst the phenomenon is invariably inefficient, typically converting just 10⁻⁴ of the incident acoustic energy into photons, it is nonetheless extraordinary, as the resultant energy density of the emergent photons exceeds that of the ultrasonic driving field by a factor of some 10¹². Sonoluminescence has specific [as yet untapped] advantages in that it can be effected at remote locations in an essentially wireless format. The only [usual] requirement is energy transduction via the violent oscillation of microscopic bubbles within the propagating medium. The dependence of sonoluminescent output on the generating sound field's parameters, such as pulse duration, duty cycle, and position within the field, has been observed and measured previously, and several relevant aspects are discussed presently. We also extrapolate the logic from a recently published analysis relating to the ensuing dynamics of bubble 'clouds' that have been stimulated by ultrasound. Here, the intention was to develop a relevant [yet computationally simplistic] model that captured the essential physical qualities expected from real sonoluminescent microbubble clouds. We focused on the inferred temporal characteristics of SL light output from a population of such bubbles, subjected to intermediate [0.5-2 MPa] ultrasonic pressures. Finally, whilst direct applications for sonoluminescent light output are thought unlikely in the main, we proceed to frame the state-of-the-art against several presently existing technologies that could form adjunct approaches with distinct potential for enhancing present sonoluminescent light output that may prove useful in real-world [biomedical] applications.
Abstract:
This paper presents a multi-agent system approach to address the difficulties encountered in traditional SCADA systems deployed in critical environments such as electrical power generation, transmission and distribution. The approach models uncertainty and combines multiple sources of uncertain information to deliver robust plan selection. We examine the approach in the context of a simplified power supply/demand scenario using a residential grid-connected solar system and consider the challenges of modelling and reasoning with uncertain sensor information in this environment. We discuss examples of plans and actions required for sensing, establish and discuss the effect of uncertainty on such systems, and investigate different uncertainty theories and how they can fuse uncertain information from multiple sources for effective decision making in such a complex system.
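The abstract does not specify which uncertainty theory the system adopts. As one example of fusing uncertain evidence from two sensors, here is Dempster's rule of combination over a simple illustrative frame {fault, ok}; the mass assignments are invented for the example.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination: mass functions map frozenset focal
    elements to belief mass; conflicting (empty-intersection) products are
    discarded and the remainder renormalised."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

FAULT, OK = frozenset({"fault"}), frozenset({"ok"})
BOTH = FAULT | OK  # ignorance: mass committed to neither hypothesis

# Two illustrative sensors reporting on the same component:
m1 = {FAULT: 0.6, BOTH: 0.4}
m2 = {FAULT: 0.5, OK: 0.2, BOTH: 0.3}
fused = combine(m1, m2)
```

When both sources lean toward the same hypothesis, the fused mass on it exceeds either source's alone, which is the behaviour a plan-selection layer would exploit.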