990 results for "Sequential function chart"
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. The method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulations through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure yields remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
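The kernel-density step can be conveyed with a minimal sketch. This is an illustration, not the authors' algorithm: synthetic colocated log-conductivity data stand in for the borehole measurements, and `scipy.stats.gaussian_kde` plays the role of the non-parametric multivariate kernel density from which hydraulic conductivity is drawn conditional on the colocated electrical conductivity.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic colocated borehole data (illustrative only): log electrical
# conductivity and a correlated log hydraulic conductivity.
log_sigma = rng.normal(-2.0, 0.5, 200)
log_K = 1.5 * log_sigma - 4.0 + rng.normal(0.0, 0.3, 200)

# Non-parametric multivariate kernel density of the joint distribution.
kde = gaussian_kde(np.vstack([log_sigma, log_K]))

def sample_K_given_sigma(ls, grid=np.linspace(-12.0, -2.0, 400)):
    """Draw log K from the conditional density p(log K | log sigma = ls)."""
    dens = kde(np.vstack([np.full_like(grid, ls), grid]))
    dens /= dens.sum()
    return rng.choice(grid, p=dens)

draw = sample_K_given_sigma(-2.0)
```

In the sequential simulation itself such conditional draws would additionally be conditioned on previously simulated neighbours; the sketch only shows the kernel-density link between the two properties.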
Abstract:
INTRODUCTION: A significant proportion of prematurely born children encounter behavioral difficulties, such as attention deficit or hyperactivity, which may be due to executive function disorders. AIMS: To examine whether the standard neurodevelopmental assessment offered to premature children in Switzerland detects executive function disorders. METHODS: The study population consisted of 49 children born before 29 weeks of gestation who were examined between 5 and 6 years of age with a standard assessment supplemented by items assessing executive functioning. Children with severe neurodevelopmental impairment (mental retardation, cerebral palsy, autism) were excluded. The standard assessment consisted of the Kaufman Assessment Battery for Children (K-ABC), which comprises three subscales: sequential processes (analysis of sequential information), simultaneous processes (global analysis of visual information), and composite mental processes (CMP, the result of the other two scales), as well as a behavioral evaluation using the standardized Strengths and Difficulties Questionnaire (SDQ). Executive functioning was assessed with tasks evaluating visual attention, divided attention, and digit memory, as well as with a specialized questionnaire, the Behavior Rating Inventory of Executive Function (BRIEF), which evaluates several aspects of executive function (regulation, attention, flexibility, working memory, etc.). RESULTS: Children were divided according to their results on the three K-ABC scales (<85 or ≥85), and the different neuropsychological tasks assessing executive function were compared between the groups. The CMP did not differentiate children with executive difficulties, whereas a score <85 on the sequential processes was significantly associated with worse visual and divided attention. There was a strong correlation between the SDQ and the BRIEF questionnaires. On both questionnaires, children receiving psychotherapy had significantly higher scores.
Children with behavioral problems on the SDQ also scored significantly higher on the BRIEF. CONCLUSION: A detailed analysis of the standard neurodevelopmental assessment allows the identification of executive function disorders in premature children. Children who scored below 85 on the sequential processes of the K-ABC had significantly more attentional difficulties on the neuropsychological tasks and therefore should be recognized and carefully followed. Emotional regulation correlated strongly with behavioral difficulties, which were suitably assessed with the SDQ, recognized by the families, and treated.
Effect of carotid and aortic baroreceptors on cardiopulmonary reflex: the role of autonomic function
Abstract:
We determined the sympathetic and parasympathetic control of heart rate (HR) and the sensitivity of the cardiopulmonary receptors after selective carotid and aortic denervation. We also investigated the participation of the autonomic nervous system in the Bezold-Jarisch reflex after selective removal of aortic and carotid baroreceptors. Male Wistar rats (220-270 g) were divided into three groups: control (CG, N = 8), aortic denervation (AG, N = 5) and carotid denervation (CAG, N = 9). AG animals presented increased arterial pressure (12%) and HR (11%) compared with CG, while CAG animals presented a reduction in arterial pressure (16%) and unchanged HR compared with CG. The sequential blockade of autonomic effects by atropine and propranolol indicated a reduction in vagal function in CAG (50 and 62% reductions in vagal effect and tonus, respectively), while AG showed an increase of more than 100% in sympathetic control of HR. The Bezold-Jarisch reflex was evaluated using serotonin, which induced increased bradycardia and hypotension in AG and CAG, suggesting that the sensitivity of the cardiopulmonary reflex is augmented after selective denervation. Atropine administration abolished the bradycardic responses induced by serotonin in all groups; however, the hypotensive response was still increased in AG. Although the responses after atropine were lower than those before the drug, indicating a reduction in vagal outflow after selective denervation, our data suggest that both denervation procedures are associated with an increase in sympathetic modulation of the vessels, indicating that the sensitivity of the cardiopulmonary receptors was modulated by baroreceptor fibers.
Abstract:
This thesis considers a set of methods that allow statistical learning algorithms to better handle the sequential nature of financial portfolio management problems. We begin by considering the general problem of composing learning algorithms that must handle sequential tasks, in particular the efficient updating of training sets within a sequential-validation framework. We enumerate the desiderata that composition primitives should satisfy and highlight the difficulty of meeting them rigorously and efficiently. We then present a set of algorithms that achieve these objectives and a case study of a complex financial decision-making system built on these techniques. Next, we describe a general method for transforming a non-Markovian sequential decision problem into a supervised learning problem using a search algorithm based on the K best paths. We treat a portfolio-management application in which a learning algorithm is trained to directly optimize a Sharpe ratio (or another non-additive criterion incorporating risk aversion). We illustrate the approach with an extensive experimental study, proposing a neural-network architecture specialized for portfolio management and comparing it with several alternatives. Finally, we introduce a functional representation of time series that allows forecasts to be made over a variable horizon while using an information set revealed progressively. The approach is based on Gaussian processes, which provide a full covariance matrix between all points for which a forecast is requested.
This information is exploited by an algorithm that actively trades price spreads between commodity futures contracts. Out of sample, the proposed approach produces a significant risk-adjusted return, after transaction costs, on a portfolio of 30 assets.
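The cross-horizon covariance that the trading algorithm exploits can be illustrated with an off-the-shelf Gaussian process. This is a hedged sketch using scikit-learn on synthetic data, not the thesis implementation: `return_cov=True` returns the full covariance matrix between every requested forecast point, rather than only the marginal variances.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Illustrative series: observed points of a smooth trend plus noise.
t_obs = np.arange(0, 20, dtype=float).reshape(-1, 1)
y_obs = np.sin(0.3 * t_obs).ravel() + 0.1 * rng.normal(size=20)

gp = GaussianProcessRegressor(kernel=RBF(5.0) + WhiteKernel(0.01),
                              normalize_y=True)
gp.fit(t_obs, y_obs)

# Forecast over a variable horizon; return_cov yields the full
# covariance matrix between all requested forecast points.
t_new = np.arange(20, 26, dtype=float).reshape(-1, 1)
mean, cov = gp.predict(t_new, return_cov=True)
```

The off-diagonal entries of `cov` quantify how forecast errors at different horizons co-move, which is the information a spread-trading rule can exploit.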
Abstract:
We study the problem of deriving a complete welfare ordering from a choice function. Under the sequential solution, the best alternative is the alternative chosen from the universal set; the second best is the one chosen when the best alternative is removed; and so on. We show that this is the only completion of Bernheim and Rangel's (2009) welfare relation that satisfies two natural axioms: neutrality, which ensures that the names of the alternatives are welfare-irrelevant; and persistence, which stipulates that every choice function that lies between two welfare-identical choice functions must exhibit the same welfare ordering.
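The sequential solution described above is straightforward to state as an algorithm; the following sketch (names are illustrative) iterates the choice function on shrinking sets:

```python
def sequential_solution(choice, universe):
    """Derive a complete welfare ordering from a choice function.

    `choice` maps a non-empty frozenset of alternatives to the chosen
    alternative. The best alternative is the one chosen from the full
    set; the second best is chosen once the best is removed; and so on.
    """
    remaining = set(universe)
    ordering = []
    while remaining:
        best = choice(frozenset(remaining))
        ordering.append(best)
        remaining.remove(best)
    return ordering

# A rational choice function induced by the utility a > b > c.
utility = {"a": 3, "b": 2, "c": 1}
order = sequential_solution(lambda s: max(s, key=utility.get), {"a", "b", "c"})
# order == ["a", "b", "c"]
```

For a rational choice function the procedure simply recovers the underlying preference; its interest lies in the welfare ordering it induces for non-rationalizable choice data.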
Abstract:
In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his or her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: 1) monotonically increasing functions and 2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with Bayesian paradigms obtained from optimal design (MacKay, 1992).
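The flavour of the adaptive sampling strategy for monotonically increasing functions can be conveyed with a short sketch. This is a heuristic illustration of sequential optimal recovery, not the paper's exact scheme: for a monotone function the worst-case uncertainty on an interval is governed by its width times the function's rise over it, so the learner repeatedly bisects the interval with the largest such score instead of sampling on a uniform grid.

```python
def active_sample_monotone(f, a, b, n):
    """Actively choose n sample points for a monotonically increasing f."""
    xs = [a, b]
    ys = [f(a), f(b)]
    for _ in range(n - 2):
        # Score each interval by width * rise: the area of the rectangle
        # bounding all monotone functions consistent with the samples.
        scores = [(xs[i + 1] - xs[i]) * (ys[i + 1] - ys[i])
                  for i in range(len(xs) - 1)]
        i = scores.index(max(scores))
        x_new = 0.5 * (xs[i] + xs[i + 1])
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return xs, ys

# Samples concentrate where the function changes fastest.
xs, ys = active_sample_monotone(lambda x: x ** 3, 0.0, 1.0, 10)
```

On x³ the scheme places most of its samples near 1, where the function rises steeply, whereas a passive uniform grid would waste samples on the flat region near 0.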
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular, ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without adding computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of the state space. The advantages of this formulation are explained, and the EnKBF is implemented in an atmospheric model for the first time. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble separates into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble over different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed, and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter improves on the widely used Robert-Asselin filter, successfully suppressing spurious computational waves while avoiding distortion of the mean value of the filtered quantity. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is also found that the accuracy of medium-term forecasts is increased by using the RAW filter.
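The RAW filter itself is compact enough to sketch. Assuming the standard leapfrog formulation, the filter displaces the current time level by a fraction alpha of the Robert-Asselin increment and the next level by the remaining fraction alpha - 1 (alpha = 1 recovers the classical filter); the parameter values below are illustrative, not those used with SPEEDY.

```python
def leapfrog_raw(F, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog time stepping with the Robert-Asselin-Williams filter.

    nu is the filter strength and alpha the Williams parameter
    (alpha = 1 recovers the classical Robert-Asselin filter).
    """
    x_prev = x0
    x_curr = x0 + dt * F(x0)          # first step: forward Euler
    out = [x_prev, x_curr]
    for _ in range(nsteps - 1):
        x_next = x_prev + 2.0 * dt * F(x_curr)
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_prev = x_curr + alpha * d          # filtered current level
        x_curr = x_next + (alpha - 1.0) * d  # next level gets the rest
        out.append(x_curr)
    return out

# Harmonic oscillator dx/dt = i*x: the amplitude should stay near 1
# while the filter damps the spurious computational mode.
traj = leapfrog_raw(lambda x: 1j * x, 1.0 + 0j, 0.01, 100)
```

Splitting the increment between the two levels is what preserves the mean of the filtered quantity, which is the property the significance tests above verify at the climatological level.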
Abstract:
Evidence has accumulated in recent years suggesting that nitrate from the diet, particularly from vegetables, is capable of producing bioactive NO in the vasculature, following bioconversion to nitrite by oral bacteria. The aim of the present review was to consider the current body of evidence for potential beneficial effects of dietary nitrate on blood pressure and endothelial function, with emphasis on evidence from acute and chronic human intervention studies. The studies to date suggest that dietary nitrate acutely lowers blood pressure in healthy humans. An inverse relationship was seen between the dose of nitrate consumed and the corresponding reduction in systolic blood pressure, with doses as low as 3 mmol reducing systolic blood pressure by 3 mmHg. Moreover, the current studies provide some promising evidence on the beneficial effects of dietary nitrate on endothelial function. In vitro studies suggest a number of potential mechanisms by which dietary nitrate and its sequential reduction to NO may reduce blood pressure and improve endothelial function, such as acting as a substrate for endothelial NO synthase, increasing vasodilation, and inhibiting mitochondrial reactive oxygen species production and platelet aggregation. In conclusion, the evidence for beneficial effects of dietary nitrate on blood pressure and endothelial function is promising. Further long-term randomised controlled human intervention studies are needed, particularly in individuals with hypertension and at risk of CVD.
Abstract:
A general consistency in the sequential order of petroleum hydrocarbon reduction in previous biodegradation studies has led to the proposal of several molecularly based biodegradation scales. Few studies have investigated the biodegradation susceptibility of petroleum hydrocarbon products in soil media, however, and metabolic preferences can change with habitat type. A laboratory-based study comprising gas chromatography-mass spectrometry (GC-MS) analysis of extracts of oil-treated soil samples incubated for up to 161 days was conducted to investigate the biodegradation of crude oil exposed to sandy soils of Barrow Island, home to both a Class "A" nature reserve and Australia's largest onshore oil field. Biodegradation trends of the hydrocarbon-treated soils were largely consistent with previous reports, but some unusual behaviour was recognised both between and within hydrocarbon classes. For example, the n-alkanes persisted at trace levels from day 86 to 161, following the removal of the typically more stable dimethylnaphthalenes and methylphenanthrenes. The relative susceptibility to biodegradation of different di-, tri- and tetramethylnaphthalene isomers also showed several features distinct from previous reports. The unique biodegradation behaviour of Barrow Island soil likely reflects differences in microbial functioning with physicochemical variation in the environment. Correlation of molecular parameters, reduction rates of selected alkylnaphthalene isomers and CO2 respiration values with a delayed (61 d) oil-treated soil identified a slowing of biodegradation with microcosm incubation; a reduced function or population of the incubated soil flora might also influence the biodegradation patterns observed.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
An economic-statistical model is developed for variable-parameter (VP) X̄ charts in which all design parameters vary adaptively, that is, each of the design parameters (sample size, sampling interval and control-limit width) varies as a function of the most recent process information. The cost function of controlling the process quality through a VP X̄ chart is derived. During the optimization of the cost function, constraints are imposed on the expected times to signal when the process is in and out of control; in this way, the required statistical properties can be assured. Through a numerical example, the proposed economic-statistical design approach for VP X̄ charts is compared with the economic design for VP X̄ charts and with the economic-statistical and economic designs for fixed-parameter (FP) X̄ charts in terms of the operating cost and the expected times to signal. From this example, it is possible to assess the benefits provided by the proposed model. The effect of varying some input parameters on the optimal cost and on the optimal values of the design parameters was also analysed.
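The adaptive switching rule behind a VP X̄ chart can be sketched in a few lines. All numerical values below are hypothetical placeholders, not the optimized design parameters from the paper: the position of the current standardized sample mean selects either a relaxed or a tightened design for the next sample.

```python
def vp_xbar_next_design(z, w=1.0):
    """Pick the design of the next sample for a VP X-bar chart.

    z is the standardized mean of the current sample. If it falls in the
    central region (|z| <= w), the relaxed design is used next; if it
    falls in the warning region, the tightened design is used. All
    numbers are hypothetical, not the paper's optimized values.
    """
    relaxed = {"n": 3, "h": 2.0, "k": 3.2}    # size, interval, limit width
    tightened = {"n": 8, "h": 0.5, "k": 2.6}
    return relaxed if abs(z) <= w else tightened

design = vp_xbar_next_design(0.4)   # central region -> relaxed design
```

The economic-statistical design problem described in the abstract amounts to choosing the two parameter triples and the warning limit `w` so as to minimize the cost function subject to the constraints on the expected times to signal.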
Abstract:
In this article, we consider the synthetic control chart with two-stage sampling (SyTS chart) for controlling the process mean and variance. During the first stage, one item of the sample is inspected; if its value X₁ is close to the target value of the process mean, then sampling is interrupted. Otherwise, sampling goes on to the second stage, where the remaining items are inspected and the statistic T = Σ(xᵢ − μ₀ + ξσ₀)² is computed taking into account all items of the sample. The design parameter ξ is a function of X₁. When the statistic T is larger than a specified value, the sample is classified as nonconforming. According to the synthetic procedure, the signal is based on the conforming run length (CRL): the number of samples taken from the process since the previous nonconforming sample until the occurrence of the next nonconforming sample. If the CRL is sufficiently small, a signal is generated. A comparative study shows that the SyTS chart and the joint X̄ and S charts with double sampling are very similar in performance. However, from a practical viewpoint, the SyTS chart is more convenient to administer than the joint charts.
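The two-stage sampling and the CRL-based signalling can be sketched as follows; the thresholds `w`, `ucl` and `crl_limit` are hypothetical placeholders, not values from the paper, and the design parameter is held fixed for simplicity.

```python
def syts_signal(samples, mu0, sigma0, xi=1.0, w=1.2, ucl=8.0, crl_limit=4):
    """Illustrative sketch of the SyTS chart decision rule.

    For each sample, the first item x1 is inspected; if it is close to
    the target (|x1 - mu0| <= w*sigma0), sampling stops and the sample
    is conforming. Otherwise the remaining items are inspected and
    T = sum((x_i - mu0 + xi*sigma0)**2) is computed over the whole
    sample; T > ucl marks the sample nonconforming. A signal is raised
    when the conforming run length (CRL) between two nonconforming
    samples is at most crl_limit.
    """
    crl = 0
    for sample in samples:
        crl += 1
        if abs(sample[0] - mu0) <= w * sigma0:
            continue                      # first stage: conforming
        t = sum((x - mu0 + xi * sigma0) ** 2 for x in sample)
        if t > ucl:
            if crl <= crl_limit:
                return True               # two nonconforming samples close together
            crl = 0                       # restart the CRL count
    return False
```

An isolated nonconforming sample does not trigger a signal; only two nonconforming samples in quick succession do, which is the essence of the synthetic procedure.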
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
We develop an economic model for X̄ control charts in which all design parameters vary adaptively, that is, in real time, based on current sample information. In the proposed model, each design parameter can assume two values as a function of the most recent process information. The cost function is derived and provides a device for the optimal selection of the design parameters. A numerical example illustrates the savings that the proposed model can provide.