106 results for CHEMORECEPTOR INPUTS


Relevance:

10.00%

Publisher:

Abstract:

Increasingly, semiconductor manufacturers are exploring opportunities for virtual metrology (VM) enabled process monitoring and control as a means of reducing non-value-added metrology and achieving ever more demanding wafer fabrication tolerances. However, developing robust, reliable and interpretable VM models can be very challenging due to the highly correlated input space often associated with the underpinning data sets. A particularly pertinent example is etch rate prediction of plasma etch processes from multichannel optical emission spectroscopy data. This paper proposes a novel input-clustering based forward stepwise regression methodology for VM model building in such highly correlated input spaces. Max Separation Clustering (MSC) is employed as a pre-processing step to identify a reduced set of well-conditioned, representative variables that can then be used as inputs to state-of-the-art model building techniques such as Forward Selection Regression (FSR), Ridge Regression, LASSO and Forward Selection Ridge Regression (FSRR). The methodology is validated on a benchmark semiconductor plasma etch dataset and the results obtained are compared with those achieved when the state-of-the-art approaches are applied directly to the data without the MSC pre-processing step. Significant performance improvements are observed when MSC is combined with FSR (13%) and FSRR (8.5%), but not with Ridge Regression (-1%) or LASSO (-32%). The optimal VM results are obtained using the MSC-FSR and MSC-FSRR generated models. © 2012 IEEE.
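The two-stage idea in the abstract above (cluster the correlated inputs, keep one representative per cluster, then run forward stepwise regression on the representatives) can be sketched as follows. This is a minimal illustration only: the correlation-threshold clustering stands in for the paper's Max Separation Clustering, whose exact criterion is not reproduced here.

```python
import numpy as np

def cluster_representatives(X, threshold=0.9):
    """Greedy stand-in for Max Separation Clustering: group columns whose
    absolute correlation exceeds `threshold` and keep one representative of
    each group. (Illustrative only; not the paper's MSC criterion.)"""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    reps, assigned = [], set()
    for j in range(X.shape[1]):
        if j in assigned:
            continue
        members = [k for k in range(X.shape[1]) if corr[j, k] > threshold]
        assigned.update(members)
        reps.append(j)
    return reps

def forward_selection(X, y, max_terms=3):
    """Forward stepwise regression: at each step add the input column that
    most reduces the least-squares residual."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_terms):
        best_j, best_err = None, np.inf
        for j in remaining:
            cols = X[:, selected + [j]]
            beta, *_ = np.linalg.lstsq(cols, y, rcond=None)
            err = np.sum((y - cols @ beta) ** 2)
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

In a VM setting the columns of `X` would be the optical emission channels and `y` the measured etch rate; restricting `X` to the cluster representatives before the forward pass is what conditions the selection problem.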


Objectives: The Liverpool Care Pathway for the dying patient (LCP) was designed to improve end-of-life care in generalist health care settings. Controversy has led to its withdrawal in some jurisdictions. The main objective of this research was to identify the influences that facilitated or hindered successful LCP implementation.

Method: An organisational case study using realist evaluation in one health and social care trust in Northern Ireland. Two rounds of semi-structured interviews were conducted with two policy makers and twenty-two participants with experience of and/or involvement in the management of the LCP during 2011 and 2012.

Results: Key resource inputs included facilitation with a view to maintaining LCP ‘visibility’, reducing anxiety among nurses and increasing their confidence regarding the delivery of end-of-life care; and nurse and medical education designed to increase professional self-efficacy and reduce misuse and misunderstanding of the LCP. Key enabling contexts were consistent senior management support; ongoing education and training tailored to the needs of each professional group; and an organisational cultural change in the hospital setting that encompassed end-of-life care.

Conclusion: There is a need to appreciate the organisationally complex nature of intervening to improve end-of-life care. Successful implementation of evidence-based interventions for end-of-life care requires commitment to planning, training and ongoing review that takes account of different perspectives, institutional hierarchies and relationships, and the educational needs of professional disciplines. There is also a need to recognise that medical consultants require particular support in their role as gatekeepers and as a lead communication channel with patients and their relatives.


Identifying groundwater contributions to baseflow forms an essential part of surface water body characterisation. The Gortinlieve catchment (5 km²) comprises a headwater stream network of the Carrigans River, itself a tributary of the River Foyle, NW Ireland. The bedrock comprises poorly productive metasediments that are characterised by fracture porosity. We present the findings of a multi-disciplinary study that integrates new hydrochemical and mineralogical investigations with existing hydraulic, geophysical and structural data to identify the scales of groundwater flow and the nature of groundwater/bedrock interaction (chemical denudation). At the catchment scale, the development of deep weathering profiles is controlled by NE-SW regional-scale fracture zones associated with mountain building during the Grampian orogeny. In-situ chemical denudation of mineral phases is controlled by micro- to meso-scale fractures related to Alpine compression during Palaeocene to Oligocene times. The alteration of primary muscovite, chlorite (clinochlore) and albite along the surfaces of these small-scale fractures has resulted in the precipitation of illite, montmorillonite and illite/montmorillonite clay admixtures. The interconnected but discontinuous nature of these small-scale structures highlights the role of larger scale faults and fissures in the supply and transportation of weathering solutions to/from the sites of mineral weathering. The dissolution of primary mineral phases releases the major ions Mg, Ca and HCO3, which are shown to subsequently form the chemical makeup of groundwaters. Borehole groundwater and stream baseflow hydrochemical data are used to constrain the depths of the groundwater flow pathways influencing the chemistry of surface waters throughout the stream profile.
The results show that it is predominantly the lower part of the catchment, which receives inputs from catchment- to regional-scale groundwater flow, that contributes to the maintenance of annual baseflow levels. This study identifies the importance of deep groundwater in maintaining annual baseflow levels in poorly productive bedrock systems.


We consider the local order estimation of nonlinear autoregressive systems with exogenous inputs (NARX), which may have different local dimensions at different points. By minimizing the kernel-based local information criterion introduced in this paper, strongly consistent estimates of the local orders of the NARX system at the points of interest are obtained. A modification of the criterion and a simple procedure for searching for its minimum are also discussed. The theoretical results derived here are tested by simulation examples.
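As a much-simplified analogue of the kernel-based criterion described above, the sketch below selects a local model order at a point of interest by minimising a kernel-weighted, penalised fit criterion. The local polynomial model, the BIC-like penalty and the small floor inside the logarithm are all assumptions for illustration; they are not the paper's criterion or its NARX setting.

```python
import numpy as np

def local_order_estimate(x, y, x0, max_order=4, bandwidth=0.5):
    """Toy kernel-based local order selection: fit polynomial models of
    increasing order around x0 with Gaussian kernel weights and pick the
    order minimising a penalised local fit criterion (a BIC-like stand-in
    for the paper's criterion)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    w = np.exp(-((x - x0) / bandwidth) ** 2)     # kernel weights around x0
    n_eff = w.sum()                              # effective sample size
    best_order, best_crit = 0, np.inf
    for p in range(max_order + 1):
        V = np.vander(x - x0, p + 1)             # local polynomial basis
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(sw[:, None] * V, sw * y, rcond=None)
        err = np.sum(w * (y - V @ beta) ** 2) / n_eff
        # penalised criterion: fit term plus complexity penalty
        crit = np.log(err + 1e-12) + (p + 1) * np.log(n_eff) / n_eff
        if crit < best_crit:
            best_order, best_crit = p, crit
    return best_order
```

The point is the shape of the procedure: the kernel localises the criterion at `x0`, so different points of interest can legitimately return different orders.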


We present the results of exploratory experiments using lexical valence extracted from brain activity recorded with electroencephalography (EEG) for sentiment analysis. We selected 78 English words (36 for training and 42 for testing), presented as stimuli to three native English speakers. EEG signals were recorded from the subjects while they performed a mental imaging task for each word stimulus. Wavelet decomposition was employed to extract EEG features from the time-frequency domain. The extracted features were used as inputs to a sparse multinomial logistic regression (SMLR) classifier for valence classification, after univariate ANOVA feature selection. After mapping EEG signals to sentiment valences, we exploited the lexical polarity extracted from brain data to predict the valence of 12 sentences taken from the SemEval-2007 shared task, and compared it against existing lexical resources.
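Wavelet-based time-frequency features of the kind described above can be illustrated with a Haar decomposition (the study's choice of mother wavelet is not stated here, so Haar is used as the simplest stand-in): the energy of the detail coefficients at each decomposition level serves as a feature vector for the classifier.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: returns the
    (approximation, detail) coefficient arrays."""
    s = np.asarray(signal, dtype=float)
    a = (s[0::2] + s[1::2]) / np.sqrt(2)
    d = (s[0::2] - s[1::2]) / np.sqrt(2)
    return a, d

def wavelet_band_energies(signal, levels=3):
    """Per-level detail-coefficient energies, a simple time-frequency
    feature vector of the kind fed to the SMLR classifier above."""
    energies = []
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(float(np.sum(d ** 2)))
    return energies
```

Because the Haar transform is orthogonal, the detail energies plus the final approximation energy sum exactly to the signal energy, so the features partition the signal's power across frequency bands.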


The original goals of the JET ITER-like wall included the study of the impact of an all-W divertor on plasma operation (Coenen et al 2013 Nucl. Fusion 53 073043) and fuel retention (Brezinsek et al 2013 Nucl. Fusion 53 083023). ITER has recently decided to install a full-tungsten (W) divertor from the start of operations. One of the key inputs required in support of this decision was the study of the possibility of W melting and melt splashing during transients. Damage of this type can lead to modifications of surface topology which could increase disruption frequency or compromise subsequent plasma operation. Although every effort will be made to avoid leading edges, ITER plasma stored energies are sufficient that transients can drive shallow melting on the top surfaces of components. JET is able to produce ELMs large enough to allow access to transient melting in a regime of relevance to ITER.

Transient W melt experiments were performed in JET using a dedicated divertor module and a sequence of I_P = 3.0 MA / B_T = 2.9 T H-mode pulses with an input power of P_IN = 23 MW, a stored energy of ~6 MJ and regular type I ELMs at ΔW_ELM = 0.3 MJ and f_ELM ≈ 30 Hz. By moving the outer strike point onto a dedicated leading edge in the W divertor, the base temperature was raised within ~1 s to a level allowing transient, ELM-driven melting during the subsequent 0.5 s. Such ELMs (ΔW ≈ 300 kJ per ELM) are comparable to mitigated ELMs expected in ITER (Pitts et al 2011 J. Nucl. Mater. 415 (Suppl.) S957-64).

Although significant material losses in terms of ejections into the plasma were not observed, there is indirect evidence that some small droplets (~80 μm) were released. Almost 1 mm (~6 mm³) of W was moved by ~150 ELMs within 7 subsequent discharges. The impact on the main plasma parameters was minor and no disruptions occurred. The W melt gradually moved along the leading edge towards the high-field side, driven by j × B forces. The evaporation rate determined from spectroscopy is 100 times less than expected from steady-state melting and is thus consistent only with transient melting during the individual ELMs. Analysis of IR data and spectroscopy, together with modelling using the MEMOS code (Bazylev et al 2009 J. Nucl. Mater. 390-391 810-13), points to transient melting as the main process. 3D MEMOS simulations of the consequences of multiple ELMs for damage to tungsten castellated armour have been performed.

These experiments provide the first experimental evidence for the absence of significant melt splashing at transient events resembling mitigated ELMs on ITER and establish a key experimental benchmark for the MEMOS code.


Belief revision performs belief change on an agent’s beliefs when new evidence (either in the form of a propositional formula or of a total pre-order on a set of interpretations) is received. Jeffrey’s rule is commonly used for revising probabilistic epistemic states when new information is probabilistically uncertain. In this paper, we propose a general epistemic revision framework where new evidence takes the form of a partial epistemic state. Our framework extends Jeffrey’s rule with uncertain inputs and covers well-known existing frameworks such as ordinal conditional functions (OCF) and possibility theory. We then define a set of postulates that such revision operators should satisfy and establish representation theorems to characterize those postulates. We show that these postulates reveal common characteristics of various existing revision strategies and are satisfied by OCF conditionalization, Jeffrey’s rule of conditioning and possibility conditionalization. Furthermore, when reduced to the classical belief revision setting, our postulates induce Darwiche and Pearl’s postulates C1 and C2.
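Jeffrey's rule of conditioning, which the abstract above generalises, can be stated concretely: given a partition {E_i} of the worlds with revised probabilities q_i, the posterior of each world w in E_i is q_i · P(w) / P(E_i), so relative odds within each cell are preserved (the "rigidity" property). A minimal sketch over a finite set of worlds:

```python
def jeffrey_update(prior, partition, new_partition_probs):
    """Jeffrey's rule of conditioning over a finite world set.

    prior: dict world -> probability
    partition: list of sets of worlds (mutually exclusive, exhaustive)
    new_partition_probs: revised probabilities q_i, one per partition cell

    Each world's posterior is q_i * P(w) / P(E_i) for its cell E_i, so the
    relative odds *within* each cell are unchanged.
    """
    posterior = {}
    for cell, q in zip(partition, new_partition_probs):
        p_cell = sum(prior[w] for w in cell)
        for w in cell:
            posterior[w] = q * prior[w] / p_cell
    return posterior
```

Ordinary Bayesian conditioning is the special case where one cell receives q_i = 1; the framework in the paper replaces the fully specified q_i with a partial epistemic state.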


Belief revision is the process that incorporates, in a consistent way, a new piece of information, called the input, into a belief base. When both belief bases and inputs are propositional formulas, a set of natural and rational properties, known as the AGM postulates, has been proposed to define genuine revision operations. This paper addresses the following important issue: how to revise partially pre-ordered information (representing initial beliefs) with new partially pre-ordered information (representing the input) while preserving the AGM postulates? We first provide a particular representation of partial pre-orders (called units) using the concept of closed sets of units. We then restate the AGM postulates in this framework by defining counterparts of the notions of logical entailment and logical consistency. In the second part of the paper, we provide examples of revision operations that respect our set of postulates. We also prove that our revision methods extend the well-known lexicographic revision and natural revision for both cases where the input is either a single propositional formula or a total pre-order.
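For the special case mentioned in the last sentence, where the input is a single propositional formula and the epistemic state is a total pre-order, lexicographic revision has a standard formulation that can be sketched directly (ranks are a common encoding of total pre-orders; the paper's partial pre-order machinery is not reproduced here):

```python
def lexicographic_revision(rank, models_of_input):
    """Lexicographic revision of a total pre-order, given as a rank function
    world -> int (lower = more plausible), by a propositional input: every
    world satisfying the input becomes strictly more plausible than every
    world that does not, while the relative order inside each group is kept."""
    max_rank = max(rank.values())
    new_rank = {}
    for w in rank:
        if w in models_of_input:
            new_rank[w] = rank[w]                  # stays in the front block
        else:
            new_rank[w] = rank[w] + max_rank + 1   # pushed behind all models
    return new_rank
```

The shift by `max_rank + 1` guarantees that even the least plausible model of the input outranks the most plausible counter-model, which is exactly the lexicographic ordering.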


Discrimination of different species in various target scopes within a single sensing platform can provide many advantages such as simplicity, rapidity, and cost effectiveness. Here we design a three-input colorimetric logic gate based on the aggregation and anti-aggregation of gold nanoparticles (Au NPs) for the sensing of melamine, cysteine, and Hg2+. The concept takes advantage of the highly specific coordination and ligand replacement reactions between melamine, cysteine, Hg2+, and Au NPs. Different outputs are obtained with the combinational inputs in the logic gates, which can serve as a reference to discriminate different analytes within a single sensing platform. Furthermore, besides the intrinsic sensitivity and selectivity of Au NPs to melamine-like compounds, the “INH” gates of melamine/cysteine and melamine/Hg2+ in this logic system can be employed for sensitive and selective detection of cysteine and Hg2+, respectively.
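The INH (INHIBIT) gate mentioned above is the Boolean function that fires only when one input is present and the inhibiting input is absent. A toy truth-table reading of the scheme, modelling only the logic and not the nanoparticle chemistry:

```python
def inh_gate(a, b):
    """Two-input INHIBIT gate: output 1 only when `a` is present
    and the inhibiting input `b` is absent (a AND NOT b)."""
    return int(bool(a) and not b)

def colorimetric_output(melamine, cysteine, hg):
    """Toy truth-table reading of the sensing scheme described above
    (illustrative assumption, not the measured chemistry): melamine drives
    Au NP aggregation, while cysteine or Hg2+ inhibit it, so each
    melamine/X pair behaves as an INH gate on the colour change."""
    return inh_gate(melamine, cysteine or hg)
```

Reading the output against the two INH truth tables is what lets a single colour readout discriminate which analyte is present.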


We present a mathematically rigorous Quality-of-Service (QoS) metric which relates the achievable QoS of a real-time analytics service to the server energy cost of offering the service. Using a new iso-QoS evaluation methodology, we scale server resources to meet QoS targets and directly rank the servers in terms of their energy-efficiency and, by extension, cost of ownership. Our metric and method are platform-independent and enable fair comparison of datacenter compute servers with significant architectural diversity, including micro-servers. We deploy our metric and methodology to compare three servers running financial option pricing workloads on real-life market data. We find that server ranking is sensitive to data inputs and the desired QoS level, and that although scale-out micro-servers can be up to two times more energy-efficient than conventional heavyweight servers for the same target QoS, they are still six times less energy-efficient than high-performance computational accelerators.
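The iso-QoS comparison can be illustrated with a toy model: scale each server out until it meets the same QoS target, then rank the servers by the energy that takes. The linear replica/power model below is a hypothetical stand-in for the paper's measured costs, not its actual metric:

```python
import math

def rank_by_iso_qos_energy(servers, qos_target):
    """Rank servers by the energy they need to meet a common QoS target
    (an iso-QoS comparison in the spirit of the text; the linear cost
    model here is a hypothetical stand-in).

    servers: dict name -> (qos_per_replica, watts_per_replica)
    Returns server names sorted from most to least energy-efficient.
    """
    costs = {}
    for name, (qos_per_replica, watts) in servers.items():
        replicas = math.ceil(qos_target / qos_per_replica)  # scale out to target
        costs[name] = replicas * watts
    return sorted(costs, key=costs.get)
```

Because the replica count is recomputed per target, the ranking can flip as the QoS target changes, which mirrors the paper's observation that server ranking is sensitive to the desired QoS level.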


Titanium alloy exhibits an excellent combination of bio-compatibility, corrosion resistance, strength and toughness. The microstructure of an alloy influences its properties, and the microstructure depends mainly on the alloying elements, the method of production, and mechanical and thermal treatments. The relationships between these variables and the final properties of the alloy are complex and non-linear in nature, which is the biggest hurdle in developing proper correlations between them by conventional methods. We therefore developed artificial neural network (ANN) models to address these complex phenomena in titanium alloys.

In the present work, ANN models were used for the analysis and prediction of the correlations between the process parameters, the alloying elements, microstructural features, the beta transus temperature and the mechanical properties of titanium alloys. Sensitivity analysis of the trained neural network models was performed, which resulted in a better understanding of the relationships between inputs and outputs. The model predictions and the analysis agree well with the experimental results. The simulation results show that the average output-prediction error of the models is less than 5% of the prediction range in more than 95% of the cases, which is quite acceptable for all metallurgical purposes.
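Sensitivity analysis of a trained model, as described above, is often done by one-at-a-time input perturbation. A minimal sketch, where `model` is any trained predictor standing in for the paper's ANN:

```python
import numpy as np

def sensitivity(model, X, delta=0.05):
    """One-at-a-time perturbation sensitivity: nudge each input column by
    `delta` of its observed range and record the mean absolute change in
    the model output. (A common generic probe; the paper's exact
    sensitivity method is not specified here.)"""
    base = model(X)
    sens = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += delta * (X[:, j].max() - X[:, j].min())
        sens.append(float(np.mean(np.abs(model(Xp) - base))))
    return np.array(sens)
```

Ranking the resulting values shows which alloying elements or process parameters the trained network responds to most strongly, which is the kind of input-output insight the abstract refers to.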


Carbon and nitrogen stable isotope analysis (SIA) has identified the terrestrial subsidy of freshwater food webs but relies on different 13C fractionation in aquatic and terrestrial primary producers. However, dissolved inorganic carbon (DIC) partly comprises 13C-depleted C from respiration of terrestrial C and ‘old’ C derived from weathering of catchment geology. SIA thus fails to differentiate between the contributions of old and recently fixed terrestrial C. DIC in alkaline lakes is partially derived from weathering of 14C-free carbonaceous bedrock. This yields an artificial age offset, leading samples to appear significantly older than their actual age. As such, 14C can be used as a biomarker to identify the proportion of autochthonous C in the food web. With terrestrial C inputs likely to increase, the origin and utilisation of ‘old’ or ‘recent’ allochthonous C in the food web can also be determined. Stable isotopes and 14C were measured for biota, particulate organic matter (POM), DIC and dissolved organic carbon (DOC) from Lough Erne, Northern Ireland, a humic but alkaline lake. High winter δ15N values in calanoid zooplankton (δ15N = 24‰) relative to phytoplankton and POM (δ15N = 6‰ and 12‰ respectively) may reflect several microbial trophic levels between terrestrial C and calanoids. Furthermore, winter calanoid 14C ages are consistent with DOC from inflowing rivers (87 and 75 years BP respectively) but not phytoplankton (355 years BP). Summer calanoid δ13C, δ15N and 14C (312 years BP) indicate greater reliance on phytoplankton. There is also temporal and spatial variation in DIC, DOC and POM C isotopes.
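The "artificial age offset" above follows from the convention for radiocarbon ages: a conventional age is t = -8033 · ln(F14C), where 8033 years is the Libby mean life and F14C the fraction modern. Diluting the DIC pool with 14C-free bedrock carbon lowers F14C and so inflates the apparent age:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years, fixed by convention for conventional 14C ages

def radiocarbon_age(f14c):
    """Conventional radiocarbon age (years BP) from fraction modern F14C."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

def hard_water_offset(f14c_sample, f14c_atmosphere=1.0):
    """Apparent age offset produced when 14C-free bedrock carbon dilutes
    the DIC pool, making samples look older than their true age."""
    return radiocarbon_age(f14c_sample) - radiocarbon_age(f14c_atmosphere)
```

For example, a DIC pool in which 5% of the carbon is 14C-free (F14C = 0.95) appears roughly four centuries too old, which is the scale of offset that has to be corrected before 14C can partition autochthonous from allochthonous C.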


Globally, lakes bury and remineralise significant quantities of terrestrial C, and the associated flux of terrestrial C strongly influences their functioning. Changing deposition chemistry, land use and climate-induced impacts on hydrology will affect soil biogeochemistry and terrestrial C export [1], and hence lake ecology, with potential feedbacks for regional and global C cycling. C and nitrogen stable isotope analysis (SIA) has identified the terrestrial subsidy of freshwater food webs. The approach relies on different 13C fractionation in aquatic and terrestrial primary producers, but also on the fact that the inorganic C demands of aquatic primary producers are partly met by 13C-depleted C from respiration of terrestrial C, and ‘old’ C derived from weathering of catchment geology. SIA thus fails to differentiate between the contributions of old and recently fixed terrestrial C. Natural abundance 14C can be used as an additional biomarker to untangle riverine food webs [2] where aquatic and terrestrial δ13C overlap, but may also be valuable for examining the age and origin of C in the lake. Primary production in lakes is based on dissolved inorganic C (DIC). DIC in alkaline lakes is partially derived from weathering of carbonaceous bedrock, a proportion of which is 14C-free. The low 14C activity yields an artificial age offset, leading samples to appear hundreds to thousands of years older than their actual age. As such, 14C can be used to identify the proportion of autochthonous C in the food web. With terrestrial C inputs likely to increase, the origin and utilisation of ‘fossil’ or ‘recent’ allochthonous C in the food web can also be determined. Stable isotopes and 14C were measured for biota, particulate organic matter (POM), DIC and dissolved organic carbon (DOC) from Lough Erne, Northern Ireland, a humic alkaline lake. Temporal and spatial variation was evident in DIC, DOC and POM C isotopes, with implications for fluctuation in terrestrial export processes.
Ramped pyrolysis of lake surface sediment indicates the burial of two C components. The 14C activity (507 ± 30 BP) of sediment combusted at 400°C was consistent with algal values and younger than bulk sediment values (1097 ± 30 BP). The sample was subsequently combusted at 850°C, yielding 14C values (1471 ± 30 BP) older than the bulk sediment age, suggesting that fossil terrestrial carbon is also buried in the sediment. Stable isotopes in the food web indicate that terrestrial organic C is also utilised by lake organisms. High winter δ15N values in calanoid zooplankton (δ15N = 24‰) relative to phytoplankton and POM (δ15N = 6‰ and 12‰ respectively) may reflect several microbial trophic levels between terrestrial C and calanoids. Furthermore, winter calanoid 14C ages are consistent with DOC from an inflowing river (75 ± 24 BP), not phytoplankton (367 ± 70 BP). Summer calanoid δ13C, δ15N and 14C (345 ± 80 BP) indicate greater reliance on phytoplankton.

[1] Monteith, D.T. et al. (2007) Dissolved organic carbon trends resulting from changes in atmospheric deposition chemistry. Nature, 450: 537-540.

[2] Caraco, N. et al. (2010) Millennial-aged organic carbon subsidies to a modern river food web. Ecology, 91: 2385-2393.


A forward and backward least angle regression (LAR) algorithm is proposed to construct the nonlinear autoregressive model with exogenous inputs (NARX) that is widely used to describe a large class of nonlinear dynamic systems. The main objective of this paper is to improve the model sparsity and generalization performance of the original forward LAR algorithm. This is achieved by introducing a replacement scheme using an additional backward LAR stage. The backward stage replaces insignificant model terms selected by forward LAR with more significant ones, leading to an improved model in terms of compactness and performance. A numerical example constructing four types of NARX models, namely polynomials, radial basis function (RBF) networks, neuro-fuzzy networks and wavelet networks, is presented to illustrate the effectiveness of the proposed technique in comparison with some popular methods.
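The forward-then-backward-replacement idea can be sketched with plain least squares standing in for the LAR machinery; this illustrates only the selection scheme described above, not the paper's algorithm:

```python
import numpy as np

def sse(X, y, idx):
    """Least-squares residual sum for the model built from columns `idx`."""
    beta, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
    return np.sum((y - X[:, idx] @ beta) ** 2)

def forward_backward_select(X, y, n_terms):
    """Greedy forward selection followed by a backward replacement pass:
    each selected term is tentatively swapped with an unselected one, and
    the swap is kept whenever it lowers the residual."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_terms):                       # forward stage
        j = min(remaining, key=lambda k: sse(X, y, selected + [k]))
        selected.append(j)
        remaining.remove(j)
    improved = True
    while improved:                                # backward replacement stage
        improved = False
        for i in range(len(selected)):
            for r in list(remaining):
                trial = selected[:i] + [r] + selected[i + 1:]
                if sse(X, y, trial) + 1e-12 < sse(X, y, selected):
                    remaining[remaining.index(r)] = selected[i]
                    selected = trial
                    improved = True
                    break
            if improved:
                break
    return selected
```

The backward pass is what rescues terms the greedy forward stage chose sub-optimally, which is the source of the compactness gain the abstract claims.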


A novel model-based principal component analysis (PCA) method is proposed in this paper for wide-area power system monitoring, aiming to tackle one of the critical drawbacks of conventional PCA, i.e. its incapability to handle non-Gaussian distributed variables. It is a significant extension of the original PCA method, which has already been shown to outperform traditional methods like rate-of-change-of-frequency (ROCOF). The ROCOF method is quick at processing local information, but its threshold is difficult to determine and nuisance tripping may easily occur. The proposed model-based PCA method uses a radial basis function neural network (RBFNN) model to handle the nonlinearity in the data set and so address the non-Gaussian issue, before the PCA method is used for islanding detection. To build an effective RBFNN model, this paper first uses a fast input selection method to remove insignificant neural inputs. Next, a heuristic optimization technique, Teaching-Learning-Based Optimization (TLBO), is adopted to tune the nonlinear parameters in the RBF neurons to build the optimized model. The novel RBFNN-based PCA monitoring scheme is then employed for wide-area monitoring using the residuals between the model outputs and the real PMU measurements. Experimental results confirm the efficiency and effectiveness of the proposed method in monitoring a suite of process variables with different distribution characteristics, showing that the proposed RBFNN PCA method is a reliable scheme and an effective extension of the linear PCA method.
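The residual-based monitoring step at the end of the pipeline can be sketched with linear PCA alone (the RBFNN front-end that absorbs the nonlinearity is the paper's contribution and is not reproduced here): fit the model on normal operating data, then flag samples whose squared prediction error (SPE) is large.

```python
import numpy as np

def fit_pca(X, n_comp):
    """Fit a linear PCA model: the training mean and the leading
    principal directions (rows of P)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_comp]

def spe(X, mu, P):
    """Squared prediction error of each sample against the PCA model;
    large values flag behaviour the model cannot explain."""
    Xc = X - mu
    resid = Xc - (Xc @ P.T) @ P   # part of the sample outside the model subspace
    return np.sum(resid ** 2, axis=1)
```

In the paper's scheme the same idea is applied to the residuals between the RBFNN model outputs and the PMU measurements, so the statistic monitors what the nonlinear model fails to explain rather than the raw, non-Gaussian measurements.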