932 results for Predictive controllers


Relevance: 20.00%

Abstract:

The link between off-target anticholinergic effects of medications and acute cognitive impairment in older adults requires urgent investigation. We aimed to determine whether a relevant in vitro model may aid the identification of anticholinergic responses to drugs and the prediction of anticholinergic risk during polypharmacy. In this preliminary study we employed a co-culture of human-derived neurons and astrocytes (NT2.N/A) derived from the NT2 cell line. NT2.N/A cells possess much of the functionality of mature neurons and astrocytes, key cholinergic phenotypic markers and muscarinic acetylcholine receptors (mAChRs). The cholinergic response of NT2 astrocytes to the mAChR agonist oxotremorine was examined using the fluorescent dye fluo-4 to quantify increases in intracellular calcium [Ca2+]i. Inhibition of this response by drugs classified as severe (dicycloverine, amitriptyline), moderate (cyclobenzaprine) and possible (cimetidine) on the Anticholinergic Cognitive Burden (ACB) scale was examined after exposure to individual drugs and to pairs of compounds. Individually, dicycloverine had the most significant inhibitory effect on the astrocytic cholinergic response to oxotremorine, followed by amitriptyline, then cyclobenzaprine and cimetidine, in agreement with the ACB scale. In combination, dicycloverine with cyclobenzaprine had the most significant effect, followed by dicycloverine with amitriptyline. The order of potency of the drugs in combination frequently disagreed with the ACB scores predicted by summing the individual drug scores, suggesting that current scales may underestimate the effect of polypharmacy. Overall, this NT2.N/A model may be appropriate for further investigation of adverse anticholinergic effects of multiple medications, in order to inform clinical choices of suitable drug use in the elderly.
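The additive scoring that the abstract calls into question can be made concrete in a few lines. The 1–3 numeric mapping below (possible = 1, moderate = 2, severe = 3) is an assumption based on the scale's conventional scoring, not a value taken from the study:

```python
# Hypothetical sketch of the ACB scale's additive assumption. The drug
# classifications come from the abstract; the 1-3 numeric mapping is assumed.
ACB_SCORES = {
    "dicycloverine": 3,    # severe
    "amitriptyline": 3,    # severe
    "cyclobenzaprine": 2,  # moderate
    "cimetidine": 1,       # possible
}

def predicted_acb(regimen):
    """Predicted anticholinergic burden of a regimen by simple summation."""
    return sum(ACB_SCORES[drug] for drug in regimen)

print(predicted_acb(["dicycloverine", "cyclobenzaprine"]))  # 5
print(predicted_acb(["dicycloverine", "amitriptyline"]))    # 6
```

Note that summation ranks dicycloverine with amitriptyline (6) above dicycloverine with cyclobenzaprine (5), the reverse of the in vitro order reported above, which illustrates the kind of disagreement the study describes.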

Relevance: 20.00%

Abstract:

The inverse controller is traditionally assumed to be a deterministic function. This paper presents a pedagogical methodology for estimating the stochastic model of the inverse controller. The proposed method is based on Bayes' theorem. Using Bayes' rule to obtain the stochastic model of the inverse controller allows knowledge of uncertainty from both the inverse and the forward model to be used in estimating the optimal control signal. The methodology is presented for general nonlinear systems and demonstrated on nonlinear single-input single-output (SISO) and multiple-input multiple-output (MIMO) examples. © 2006 IEEE.
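A minimal linear-Gaussian sketch of the Bayes-rule fusion the abstract describes: the inverse model supplies a Gaussian prior over the control signal, the forward model supplies a Gaussian likelihood for the desired output, and Bayes' rule combines the two uncertainties. The forward-model gain `a` and all numbers are hypothetical, not from the paper:

```python
import numpy as np

# Sketch, not the paper's derivation: fuse an inverse-model prior
# p(u) = N(mu_inv, s_inv^2) with a linear-Gaussian forward model
# y = a*u + e, e ~ N(0, s_fwd^2), to get p(u | y_target).

def posterior_control(mu_inv, s_inv, a, s_fwd, y_target):
    """Posterior mean and std of the control signal for a desired output."""
    prec = 1.0 / s_inv**2 + a**2 / s_fwd**2                      # posterior precision
    mean = (mu_inv / s_inv**2 + a * y_target / s_fwd**2) / prec  # precision-weighted mean
    return mean, np.sqrt(1.0 / prec)

u_opt, u_std = posterior_control(mu_inv=1.0, s_inv=0.5, a=2.0, s_fwd=0.2, y_target=3.0)
print(u_opt, u_std)
```

The posterior mean is a precision-weighted compromise: a confident forward model (small `s_fwd`) pulls the control toward the value that reproduces `y_target`, while a confident inverse model pulls it toward its own estimate.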

Relevance: 20.00%

Abstract:

The acceleration of solid dosage form product development can be facilitated by the inclusion of excipients that exhibit poly-/multi-functionality, reducing the time invested in optimising multiple excipients. Because active pharmaceutical ingredients (APIs) and tablet excipients present diverse densification behaviours upon compaction, the involvement of these different powders makes the compaction process very complicated. The aim of this study was to assess the macrometric characteristics and distribution of surface charges of two powders, indomethacin (IND) and arginine (ARG), and to evaluate their impact on the densification properties of the two powders. Response surface modelling (RSM) was employed to predict the effect of two independent variables, compression pressure (F) and ARG percentage (R), in binary mixtures on the properties of the resultant tablets. The study examined three responses: porosity (P), tensile strength (S) and disintegration time (T). Micromeritic studies showed that IND had a higher charge density (net charge-to-mass ratio) than ARG; nonetheless, ARG demonstrated good compaction properties with high plasticity (Y = 28.01 MPa). Accordingly, using ARG as a filler in IND tablets was associated with better mechanical properties: tablet tensile strength (σ) increased from 0.2 ± 0.05 N/mm2 to 2.85 ± 0.36 N/mm2 upon adding ARG at a molar ratio of 8:1 to IND. Moreover, the tablets' disintegration time was shortened to a few seconds in some of the formulations. RSM revealed that tablet porosity was affected by both compression pressure and ARG ratio for IND/ARG physical mixtures (PMs), whereas the tensile strength (σ) and disintegration time (T) of the PMs were influenced by the compression pressure, the ARG ratio and their interaction term (FR); a strong correlation was observed between the experimental results and the predicted data for tablet porosity.
This work provides clear evidence of the multi-functionality of ARG as filler, binder and disintegrant for directly compressed tablets.
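The response-surface step can be sketched as an ordinary least-squares fit of a second-order model in the two factors, including the interaction term FR that the study found influential. The data below are synthetic placeholders, and the quadratic form is a standard RSM choice rather than the study's exact model:

```python
import numpy as np

# Hypothetical RSM sketch: fit S = b0 + b1*F + b2*R + b3*F*R + b4*F^2 + b5*R^2
# by least squares. F = compression pressure, R = ARG percentage; all data
# below are synthetic, not the study's measurements.
rng = np.random.default_rng(0)
F = rng.uniform(50, 250, 30)                   # compression pressure (synthetic)
R = rng.uniform(0, 80, 30)                     # ARG percentage (synthetic)
S = 0.2 + 0.004 * F + 0.01 * R + 0.0001 * F * R + rng.normal(0, 0.05, 30)

X = np.column_stack([np.ones_like(F), F, R, F * R, F**2, R**2])
coef, *_ = np.linalg.lstsq(X, S, rcond=None)   # b0..b5

def predict(f, r):
    """Predicted response at factor settings (f, r)."""
    return np.array([1.0, f, r, f * r, f**2, r**2]) @ coef

print(predict(150, 40))
```

With the fitted coefficients in hand, the same design matrix supports the kind of experimental-versus-predicted comparison the abstract reports for porosity.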

Relevance: 20.00%

Abstract:

Big data comes in various ways, types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever-increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough idea of a possible taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set depends on which category it falls into within the bigness taxonomy. Large p, small n data sets, for instance, require a different set of tools from the large n, small p variety. Among other tools, we discuss Preprocessing, Standardization, Imputation, Projection, Regularization, Penalization, Compression, Reduction, Selection, Kernelization, Hybridization, Parallelization, Aggregation, Randomization, Replication and Sequentialization. It is important to emphasize right away that the so-called no-free-lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress that simplicity, in the sense of Ockham's razor and its non-plurality principle of parsimony, tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
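As a toy illustration of the taxonomy's two main axes (not the authors' actual categories), a data set can be binned by its sample size n and dimensionality p. The thresholds are arbitrary placeholders:

```python
# Rough, illustrative taxonomy by sample size n and dimensionality p.
# The thresholds are arbitrary placeholders, not values from the paper.
def bigness_category(n, p, large_n=100_000, large_p=1_000):
    if p >= large_p and n < large_n:
        return "large p, small n"    # e.g. microarrays: regularize, select, project
    if n >= large_n and p < large_p:
        return "large n, small p"    # e.g. transaction logs: subsample, parallelize
    if n >= large_n and p >= large_p:
        return "large n, large p"    # compress, reduce, distribute
    return "classical"               # standard methods usually suffice

print(bigness_category(100, 20_000))      # large p, small n
print(bigness_category(10_000_000, 50))   # large n, small p
```

The category then suggests which of the listed tools (regularization, parallelization, compression, and so on) are worth reaching for first.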

Relevance: 20.00%

Abstract:

This research evaluates pattern recognition techniques on a subclass of big data where the dimensionality of the input space (p) is much larger than the number of observations (n). Specifically, we evaluate massive gene expression microarray cancer data where the ratio κ = n/p is less than one. We explore the statistical and computational challenges inherent in these high dimensional low sample size (HDLSS) problems and present statistical machine learning methods used to tackle and circumvent these difficulties. Regularization and kernel algorithms were explored in this research using seven datasets where κ < 1. These techniques require special attention to tuning, so several extensions of cross-validation were investigated to support better predictive performance. While no single algorithm was universally the best predictor, the regularization technique produced lower test errors in five of the seven datasets studied.
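A minimal sketch of the regularization-plus-cross-validation recipe for κ < 1, with synthetic data standing in for the microarray sets. Ridge regression is used here as one representative regularized method, not necessarily the study's exact algorithm:

```python
import numpy as np

# Sketch of regularized fitting for p >> n (kappa = n/p < 1): ridge regression
# with the penalty chosen by K-fold cross-validation. Data are synthetic.

def ridge_fit(X, y, lam):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def cv_error(X, y, lam, k=5):
    """Mean held-out squared error over k folds."""
    n = len(y)
    errs = []
    for test in np.array_split(np.arange(n), k):
        train = np.setdiff1d(np.arange(n), test)
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((X[test] @ w - y[test]) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
n, p = 60, 500                                        # kappa = n/p = 0.12 < 1
X = rng.normal(size=(n, p))
y = X[:, :5] @ np.ones(5) + rng.normal(0, 0.5, n)     # only 5 informative features

best_lam = min([0.1, 1.0, 10.0, 100.0], key=lambda lam: cv_error(X, y, lam))
print(best_lam)
```

The penalty λ is exactly the tuning parameter the abstract says demands special attention; with n this small, the choice of cross-validation scheme materially affects which λ wins.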

Relevance: 20.00%

Abstract:

Computational and communication complexities call for distributed, robust, and adaptive control. This paper proposes a promising bottom-up design of distributed control in which simple controllers are responsible for individual nodes. The overall behavior of the network can be achieved by interconnecting such controlled loops, for example in cascade control, and by enabling the individual nodes to share information about data with their neighbors, without aiming at an unattainable global solution. The problem is addressed by employing a fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively, and provides a systematic, rich way of sharing information. This paper elaborates the overall solution, applies it to the linear-Gaussian case, and provides simulation results.
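The abstract does not spell out the fully probabilistic design (FPD), but the flavour of the idea can be sketched for a single node with discrete states and actions: the node picks a randomized controller that reweights an ideal action distribution by how far each action's predicted next-state distribution diverges from the ideal one. This is a simplified one-step reading, with invented distributions, not the paper's multi-step derivation:

```python
import numpy as np

# One-step FPD-style sketch (assumed simplification): c(u|x) is proportional to
# c_ideal(u|x) * exp(-KL(model next-state || ideal next-state)). All
# distributions below are invented for illustration.

def kl(p, q):
    """KL divergence between two discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Model transition s(x'|x,u), shape (states, actions, next states):
s = np.array([[[0.9, 0.1], [0.4, 0.6]],
              [[0.3, 0.7], [0.8, 0.2]]])
s_ideal = np.array([0.95, 0.05])   # ideal behaviour: stay near state 0
c_ideal = np.array([0.5, 0.5])     # no a priori action preference

def fpd_controller(x):
    """Randomized action distribution for the node in state x."""
    w = np.array([c_ideal[u] * np.exp(-kl(s[x, u], s_ideal)) for u in range(2)])
    return w / w.sum()

print(fpd_controller(0))
```

In state 0 the controller puts most of its probability on action 0, whose predicted next-state distribution is closest to the ideal; the randomized (rather than deterministic) output is what leaves room for the information sharing the paper emphasizes.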

Relevance: 20.00%

Abstract:

A novel simulation model for pyrolysis processes of lignocellulosic biomass in Aspen Plus(R) was presented at the BC&E 2013. Based on kinetic reaction mechanisms, the simulation calculates product compositions and yields depending on reactor conditions (temperature, residence time, flue gas flow rate) and feedstock composition (biochemical composition, atomic composition, ash and alkali metal content). The simulation model was found to show good correlation with existing publications. In order to further verify the model, our own pyrolysis experiments in a 1 kg/h continuously fed fluidized bed fast pyrolysis reactor are performed. Two types of biomass, one wood and one straw-like feedstock, are processed in order to evaluate the influence of their different characteristics on the yields of the pyrolysis products and their composition. Furthermore, the temperature response of yields and product compositions is evaluated by varying the reactor temperature between 450 and 550 degrees C for one of the feedstocks. The yields of the pyrolysis products (gas, oil, char) are determined and their detailed composition is analysed. The experimental runs are reproduced with the corresponding reactor conditions in the Aspen Plus model and the results are compared with the experimental findings.

Relevance: 20.00%

Abstract:

The traditional use of global and centralised control methods fails for large, complex, noisy and highly connected systems, which typify many real-world industrial and commercial systems. This paper provides an efficient bottom-up design of distributed control in which many simple components communicate and cooperate to achieve a joint system goal. Each component acts individually so as to maximise personal utility whilst obtaining probabilistic information on the global system merely through local message-passing. This leads to a scalable and collective control strategy for complex dynamical systems, without the problems of global centralised control. Robustness is addressed by employing a fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively, and opens a systematic, rich way to information sharing. This paper sets out the proposed direction and inspects the design on a linearised version of a coupled map lattice with spatiotemporal chaos. A version close to linear quadratic design gives an initial insight into the possible behaviours of such networks.

Relevance: 20.00%

Abstract:

Purpose – The purpose of this paper is to examine challenges and potential of big data in heterogeneous business networks and relate these to an implemented logistics solution.

Design/methodology/approach – The paper establishes an overview of challenges and opportunities of current significance in the area of big data, specifically in the context of transparency and processes in heterogeneous enterprise networks. Within this context, the paper presents how existing components and purpose-driven research were combined for a solution implemented in a nationwide network for less-than-truckload consignments.

Findings – Aside from providing an extended overview of today's big data situation, the findings show that technical means and methods available today can comprise a feasible process-transparency solution in a large heterogeneous network where legacy practices, reporting lags and incomplete data exist, yet processes are sensitive to inadequate policy changes.

Practical implications – The means introduced in the paper were found to be of utility value in improving process efficiency, transparency and planning in logistics networks. The particular system design choices in the presented solution allow an incremental introduction or evolution of resource-handling practices, incorporating the existing fragmentary, unstructured or tacit knowledge of experienced personnel into the theoretically founded overall concept.

Originality/value – The paper extends previous high-level views on the potential of big data, and presents new applied research and development results in a logistics application.

Relevance: 20.00%

Abstract:

There is good evidence that higher global temperatures will promote a rise in greenhouse gas levels, implying a positive feedback that will amplify the effect of anthropogenic emissions on global temperatures. Here we present a review of results that deal with the possible feedbacks between ecosystems and the climate system. The various types of feedback are classified, and several circulation models are compared with each other regarding their treatment of the interactive carbon cycle.
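The amplification argument can be made concrete with the textbook feedback-gain relation: a direct warming dT0 becomes dT = dT0 / (1 - f) when a fraction f of the response feeds back. The numbers below are illustrative, not values from the review:

```python
# Standard feedback-gain relation used to illustrate the positive-feedback
# argument; dT0 and f below are illustrative placeholders.
def amplified_warming(dT0, f):
    """Total warming when a feedback fraction f amplifies a direct warming dT0."""
    assert f < 1, "f >= 1 would mean a runaway feedback"
    return dT0 / (1.0 - f)

print(amplified_warming(1.2, 0.0))   # no feedback: 1.2
print(amplified_warming(1.2, 0.4))   # positive feedback: 2.0
```

A positive f (ecosystems releasing more greenhouse gases as temperature rises) makes the denominator smaller, which is precisely why the carbon-cycle feedbacks reviewed here matter for the size of the anthropogenic signal.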

Relevance: 20.00%

Abstract:

Following the introduction of the Basel 2 capital agreement, banks and lenders in Hungary began to build up their internal rating systems, whose maintenance and development are a continuing task. The author explores whether it is possible to increase the predictive capacity of business-failure forecasting models by traditional mathematical and statistical means by incorporating into the models the measure of change in the financial indicators over time. Empirical findings suggest that the temporal development of the financial indicators of firms in Hungary carries important information about future ability to pay, since the predictive capacity of bankruptcy forecasting models is greatly increased by using such indicators. The author also examines whether the classification performance of the models can be improved by correcting extremely high or low observation values before modelling.
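A hedged sketch of the core idea: augment level-type financial ratios with their year-over-year changes and fit a logistic bankruptcy model, so the dynamic indicator carries its own weight. The data, the ratio name, and the plain gradient-descent estimation are all stand-ins, not the study's data or method:

```python
import numpy as np

# Synthetic illustration: a level feature (liquidity) plus its year-over-year
# change (delta) as predictors of a default flag, fit by logistic regression.
rng = np.random.default_rng(2)
n = 400
liquidity = rng.normal(1.5, 0.5, n)        # current-year liquidity ratio (synthetic)
delta = rng.normal(0.0, 0.3, n)            # year-over-year change: the dynamic indicator
z = 1.0 - 1.5 * liquidity - 2.0 * delta    # synthetic "true" default log-odds
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-z))).astype(float)

X = np.column_stack([np.ones(n), liquidity, delta])
w = np.zeros(3)
for _ in range(5000):                      # plain gradient descent on the log-loss
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / n

print(w)  # intercept, level coefficient, change coefficient
```

If the change column improves classification on held-out firms, that is the kind of evidence the study reports for incorporating temporal development into the models.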

Relevance: 20.00%

Abstract:

Homework has been a controversial issue in education for the past century. Research has been scarce and has yielded results at both ends of the spectrum. This study examined the relationship between homework performance (percent of homework completed and percent of homework correct), student characteristics (SAT-9 score, gender, ethnicity, and socio-economic status), perceptions, and challenges and academic achievement determined by the students' average score on weekly tests and their score on the FCAT NRT mathematics assessment.

The subjects for this study consisted of 143 students enrolled in Grade 3 at a suburban elementary school in Miami, Florida. Pearson's correlations were used to examine the associations of the predictor variables with average test scores and FCAT NRT scores. Additionally, simultaneous regression analyses were carried out to examine the influence of the predictor variables on each of the criterion variables, and hierarchical regression analyses were performed on the criterion variables from the predictor variables.

Homework performance was significantly correlated with average test score. Controlling for the other variables, homework performance was highly related to average test score and FCAT NRT score.

This study lends support to the view that homework completion is highly related to student academic achievement at the lower elementary level. It is suggested that, at the elementary level, more consideration be given to the amount of homework completed by students, and that this information be used in formulating intervention strategies for students who may not be achieving at appropriate levels.
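The two analysis steps named above, Pearson correlation and simultaneous (multiple) regression, can be sketched on synthetic stand-in data; only the sample size of 143 is taken from the study:

```python
import numpy as np

# Sketch of the study's analyses on synthetic data: (1) Pearson's r between
# homework completion and average test score, (2) a simultaneous regression
# of test score on both homework predictors. Data are invented stand-ins.
rng = np.random.default_rng(3)
n = 143                                   # matches the study's sample size
completed = rng.uniform(40, 100, n)       # % homework completed (synthetic)
correct = rng.uniform(30, 100, n)         # % homework correct (synthetic)
test = 20 + 0.5 * completed + 0.3 * correct + rng.normal(0, 8, n)

r = np.corrcoef(completed, test)[0, 1]    # Pearson's r
X = np.column_stack([np.ones(n), completed, correct])
betas, *_ = np.linalg.lstsq(X, test, rcond=None)  # simultaneous regression
print(round(r, 2), betas)
```

The simultaneous fit is what lets each predictor's coefficient be read as its contribution while the other variables are held in the model, mirroring the "controlling for the other variables" statement above.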

Relevance: 20.00%

Abstract:

Groundwater systems of different densities are often mathematically modeled to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it will cost-effectively aid in reducing the uncertainty of a model's prediction. The collection of salinity as well as temperature data could aid in reducing predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied in two different hydrological analyses to explore the capacity of a variable-density model to guide data collection.

The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model.

The second hydrological analysis utilized a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology. The model was compared to multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can be used to give insight into submarine groundwater discharge and to guide data collection.

Relevance: 20.00%

Abstract:

This study examined the relationship between homework performance (percent of homework completed and percent of homework correct), student characteristics (Stanford Achievement Test score, gender, ethnicity, and socio-economic status), perceptions, and challenges and academic achievement determined by the students’ average score on weekly tests and their score on the Florida Comprehensive Assessment Test (FCAT) Norm Reference Test (NRT) mathematics assessment.

Relevance: 20.00%

Abstract:

Are managerial accounting skills important to all managers? Which of the common managerial accounting skills are the most important to the non-accounting manager? The authors report on their descriptive research, gathered from controllers in the hospitality industry, which provides guidelines for managers in these areas.