946 results for statistical designs


Relevance: 20.00%

Abstract:

An interesting fact about language cognition is that stimulation involving incongruence in the merge operation between verb and complement has often been related to a negative event-related potential (ERP) of increased amplitude and a latency of ca. 400 ms - the N400. Using an automatic ERP latency and amplitude estimator to facilitate the recognition of waves with a low signal-to-noise ratio, the objective of the present study was to characterize the N400 statistically in 24 volunteers. Stimulation consisted of 80 experimental sentences (40 congruous and 40 incongruous) in Brazilian Portuguese, involving two distinct local verb-argument combinations (nominal object and pronominal object series). For each volunteer, the EEG was acquired simultaneously at 20 derivations, topographically localized according to the 10-20 International System. A computerized routine for automatic N400-peak marking (based on the ascending zero-crossing of the first derivative of the waveform) was applied to the estimated individual ERP waveform for congruous and incongruous sentences in both series at all topographic derivations. Peak-to-peak N400 amplitude was significantly increased (P < 0.05; one-sided Wilcoxon signed-rank test) by incongruence in derivations F3, T3, C3, Cz, T5, P3, Pz, and P4 for the nominal object series and in P3, Pz, and P4 for the pronominal object series. The results also indicated high inter-individual variability in ERP waveforms, suggesting that the usual procedure of grand averaging may not be a generally adequate approach. Hence, statistical signal-processing techniques that allow waveform analysis at low signal-to-noise ratios should be applied in neurolinguistic ERP studies.
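The peak-marking rule described above (an ascending zero-crossing of the waveform's first derivative marks the trough of a negative-going wave) is straightforward to express in code. Below is a minimal Python sketch, not the authors' routine: the function name, the 300-500 ms search window, and the sampled-waveform inputs are illustrative assumptions.

```python
import numpy as np

def mark_n400_peak(erp, fs, window=(0.300, 0.500)):
    """Mark an N400-like peak: find ascending zero-crossings of the first
    derivative (local minima of the negative-going wave) inside a latency
    window, and keep the most negative one."""
    t0, t1 = (int(w * fs) for w in window)   # window bounds in samples
    segment = erp[t0:t1]
    d = np.diff(segment)                     # discrete first derivative
    # ascending zero-crossing: derivative passes from negative to positive
    crossings = np.where((d[:-1] < 0) & (d[1:] >= 0))[0]
    if crossings.size == 0:
        return None                          # no candidate peak in window
    peak = crossings[np.argmin(segment[crossings])]
    return (t0 + peak) / fs, segment[peak]   # latency (s), amplitude
```

Restricting the search to a latency window and keeping the most negative candidate is one simple way to disambiguate multiple zero-crossings in a noisy estimate.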

Relevance: 20.00%

Abstract:

The influence of some process variables on the productivity of the fractions (liquid yield times fraction percent) obtained from supercritical fluid extraction (SCFE) of a Brazilian mineral coal, using isopropanol and ethanol as primary solvents, is analyzed using statistical techniques. A full factorial 2³ experimental design was adopted to investigate the effects of the process variables (temperature, pressure, and cosolvent concentration) on the extraction products. The extracts were analyzed by the Preparative Liquid Chromatography-8 fractions method (PLC-8), a reliable, non-destructive solvent fractionation method developed especially for coal-derived liquids. Empirical statistical models were fitted to reproduce the experimental data; the correlations obtained were always greater than 0.98. Four specific process criteria were used to allow process optimization. The results show that it is not possible to maximize both extract productivity and purity (through minimization of the heavy fraction content) simultaneously by manipulating these process variables.
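For readers unfamiliar with the notation, a 2³ full factorial design enumerates every combination of two coded levels (-1 = low, +1 = high) of the three process variables, giving 8 runs. A minimal sketch in Python (the variable names are illustrative):

```python
from itertools import product

factors = ("temperature", "pressure", "cosolvent_concentration")

# 2^3 full factorial: all combinations of the coded levels -1/+1 -> 8 runs
design = [dict(zip(factors, levels))
          for levels in product((-1, +1), repeat=len(factors))]

for run, setting in enumerate(design, start=1):
    print(run, setting)
```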

Relevance: 20.00%

Abstract:

The aim of this work was to make tofu from soybean cultivar BRS 267 under different processing conditions in order to evaluate the influence of each treatment on product quality. A fractional factorial 2^(5-1) design was used, in which the independent variables (thermal treatment, coagulant concentration, coagulation time, curd cutting, and draining time) were tested at two levels. The response variables studied were hardness, yield, total solids, and protein content of the tofu, and polynomial models were generated for each response. To obtain tofu with desirable characteristics (hardness ~4 N, yield of 306 g tofu/100 g soybeans, 12 g protein/100 g tofu, and 22 g solids/100 g tofu), the following processing conditions were selected: heating to boiling plus 10 minutes in a water bath, 2% CaSO4·2H2O (w/w), 10 minutes of coagulation, curd cutting, and 30 minutes of draining.
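A 2^(5-1) fractional factorial halves the 32 runs of the full 2⁵ design by generating the fifth factor's levels from the other four. The abstract does not state which generator was used; the sketch below assumes the standard resolution-V choice E = ABCD (defining relation I = ABCDE):

```python
from itertools import product

# Illustrative factor order: thermal treatment, coagulant concentration,
# coagulation time, curd cutting, draining time (coded -1/+1).
runs = []
for a, b, c, d in product((-1, +1), repeat=4):   # full 2^4 in A..D
    e = a * b * c * d                            # generator E = ABCD
    runs.append((a, b, c, d, e))                 # 16 runs instead of 32

for run in runs:
    print(run)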

Relevance: 20.00%

Abstract:

The contents of total phenolic compounds (TPC), total flavonoids (TF), and ascorbic acid (AA) of 18 frozen fruit pulps and their scavenging capacities against the peroxyl radical (ROO•), hydrogen peroxide (H2O2), and the hydroxyl radical (•OH) were determined. Principal Component Analysis (PCA) showed that TPC and AA correlated positively with the scavenging capacity against ROO•, and TF correlated positively with the scavenging capacities against •OH and ROO•. However, the scavenging capacity against H2O2 showed low correlation with TF, TPC, and AA. Hierarchical Cluster Analysis (HCA) allowed the classification of the fruit pulps into three groups: one group was formed by the açai pulp, with a high TF content (134.02 mg CE/100 g pulp) and the highest scavenging capacity against ROO•, •OH, and H2O2; the second group was formed by the acerola pulp, with high TPC (658.40 mg GAE/100 g pulp) and AA (506.27 mg/100 g pulp) contents; and the third group was formed by the pineapple, cacao, caja, cashew-apple, coconut, cupuaçu, guava, orange, lemon, mango, passion fruit, watermelon, pitanga, tamarind, tangerine, and umbu pulps, which could not be separated considering only the contents of bioactive compounds and the scavenging properties.
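As a rough illustration of the chemometric workflow described (autoscaling, PCA to inspect the correlation structure, hierarchical clustering cut into three groups), here is a minimal Python sketch; the random matrix merely stands in for the measured pulp data, and the choice of Ward linkage is an assumption, since the abstract does not name the linkage method:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Rows: 18 fruit pulps; columns: TPC, TF, AA and the three scavenging
# capacities (illustrative random values, not the measured data).
rng = np.random.default_rng(0)
X = rng.random((18, 6))

Xs = StandardScaler().fit_transform(X)       # autoscale each variable

pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)               # sample scores on PC1/PC2
print("explained variance ratio:", pca.explained_variance_ratio_)

Z = linkage(Xs, method="ward")               # agglomerative clustering
groups = fcluster(Z, t=3, criterion="maxclust")   # cut tree into 3 groups
print("group labels:", groups)
```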

Relevance: 20.00%

Abstract:

The strongest wish of the customer concerning chemical pulp features is consistent, uniform quality. Variation may be controlled and reduced using statistical methods. However, studies addressing the application and benefits of statistical methods in the forest products sector are scarce. Thus, this customer wish is the root motivation behind this dissertation. The research problem addressed is that companies in the chemical forest products sector require new knowledge to improve their utilization of statistical methods. To gain this new knowledge, the research problem is studied from five complementary viewpoints – challenges and success factors, organizational learning, problem solving, economic benefit, and statistical methods as management tools. The five research questions generated on the basis of these viewpoints are answered in four research papers, which are case studies based on empirical data collection. This research as a whole complements the literature on the use of statistical methods in the forest products industry. Practical examples of the application of statistical process control, case-based reasoning, the cross-industry standard process for data mining (CRISP-DM), and performance measurement methods in the context of chemical forest products manufacturing are brought to the attention of the scientific community, and the benefit of applying these methods is estimated or demonstrated. The purpose of this dissertation is to find pragmatic ideas that help companies in the chemical forest products sector improve their utilization of statistical methods. The main practical implications can be summarized in four points:
1. It is beneficial to reduce variation in chemical forest product manufacturing processes.
2. Statistical tools can be used to reduce this variation (see the control-chart sketch below).
3. Problem solving in chemical forest product manufacturing processes can be intensified through the use of statistical methods.
4. There are certain success factors and challenges that need to be addressed when implementing statistical methods.
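As a concrete illustration of point 2 above, a Shewhart X-bar chart is one of the standard statistical process control tools for monitoring variation. The sketch below is generic rather than taken from the dissertation's case studies; the simulated pulp-brightness data and the subgroup size are illustrative assumptions:

```python
import numpy as np

def xbar_limits(samples):
    """Shewhart X-bar chart limits from rational subgroups, using the
    range method (center line +/- A2 * mean range)."""
    samples = np.asarray(samples)                 # shape: (subgroups, n)
    n = samples.shape[1]
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}[n]   # standard constants
    center = samples.mean()                       # grand mean
    rbar = (samples.max(axis=1) - samples.min(axis=1)).mean()
    return center - A2 * rbar, center, center + A2 * rbar

# e.g. 25 subgroups of 5 pulp-brightness measurements (simulated)
rng = np.random.default_rng(1)
subgroups = rng.normal(loc=90.0, scale=1.5, size=(25, 5))
lcl, cl, ucl = xbar_limits(subgroups)
print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}")
```

Subgroup means falling outside the limits would then flag special-cause variation worth investigating.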

Relevance: 20.00%

Abstract:

Dr. Gibson was truly involved in nearly every aspect of the formation of Brock University. Pictured here are Dr. Gibson's preliminary design sketches and ideas for the Brock Coat of Arms.

Relevance: 20.00%

Abstract:

New density functionals for the exchange and correlation energies (per electron), based on the electron gas model, are employed to calculate interaction potentials of the noble gas systems X2 and XY, where X (and Y) are He, Ne, Ar, and Kr, and of the hydrogen atom-rare gas systems H-X. The exchange energy density functional is that recommended by Handler, and the correlation energy density functional is a rational function involving two parameters which were optimized to reproduce the correlation energy of the He atom. Application of the two-parameter function to the other rare gas atoms shows that it is "universal", i.e., accurate for the systems considered. The potentials obtained in this work compare well with recent experimental results and are a significant improvement over those from competing statistical models.
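For orientation, electron-gas-based functionals of this kind take the local form below; the specific rational expression for the correlation energy density is an illustrative assumption, since the abstract does not give the thesis's two-parameter function:

```latex
% Local (electron-gas) form of the exchange-correlation energy; the
% rational expression for \epsilon_c is illustrative only.
\[
  E_{xc}[\rho] \;=\; \int \rho(\mathbf{r})
  \bigl\{\epsilon_x[\rho(\mathbf{r})] + \epsilon_c[\rho(\mathbf{r})]\bigr\}\,
  d\mathbf{r},
  \qquad
  \epsilon_c(r_s) \;=\; -\frac{a}{b + r_s},
\]
% with r_s the electron-gas density parameter and the two parameters
% a, b fitted to the He-atom correlation energy.
```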

Relevance: 20.00%

Abstract:

Four problems of physical interest have been solved in this thesis using the path integral formalism. Using the trigonometric expansion method of Burton and de Borde (1955), we found the kernel for two interacting one-dimensional oscillators. The result is the same as one would obtain using a normal coordinate transformation. We next introduced the method of Papadopolous (1969), a systematic perturbation-type method specifically geared to finding the partition function Z, or equivalently the Helmholtz free energy F, of a system of interacting oscillators, and applied it to the next three problems considered. First, by summing the perturbation expansion, we found F for a system of N interacting Einstein oscillators. The result obtained is the same as the usual result obtained by Shukla and Muller (1972). Next, we found F to O(λ²), where λ is the usual Van Hove ordering parameter. The results obtained are the same as those of Shukla and Cowley (1971), who used a diagrammatic procedure and did the necessary sums in Fourier space; we performed the work in temperature space. Finally, slightly modifying the method of Papadopolous, we found the finite-temperature expressions for the Debye-Waller factor in Bravais lattices, to O(λ²) and O(|K|⁴), where K is the scattering vector. The high-temperature limits of the expressions obtained here are in complete agreement with the classical results of Maradudin and Flinn (1963).
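For context, the zeroth-order quantity about which such perturbation expansions for F are organized is the standard free energy of N independent Einstein oscillators of frequency ω:

```latex
% Free energy of N independent (non-interacting) one-dimensional
% Einstein oscillators -- the unperturbed baseline for the expansion:
\[
  F_0 \;=\; N k_B T \,
  \ln\!\left[\, 2\sinh\!\left(\frac{\hbar\omega}{2 k_B T}\right) \right]
\]
```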

Relevance: 20.00%

Abstract:

As the complexity of evolutionary design problems grows, so too must the quality of solutions scale to that complexity. In this research, we develop a genetic programming system with individuals encoded as tree-based generative representations to address scalability. The system is capable of multi-objective evaluation using a ranked-sum scoring strategy. We examine Hornby's features and measures of modularity, reuse, and hierarchy in evolutionary design problems. Experiments were carried out using the system to generate three-dimensional forms, and the feature characteristics of modularity, reuse, and hierarchy were analysed. This work expands on Hornby's by examining a new and more difficult problem domain. The results from these experiments show that individuals encoded with all three features performed best overall, and the measures of complexity conform to Hornby's results. Moving forward with only this best-performing encoding, the system was applied to the generation of three-dimensional external building architecture. One objective considered was passive solar performance, in which the system was challenged to generate forms that optimize exposure to the Sun. The results from these and other experiments satisfied the requirements, and the system was shown to scale well to the architectural problems studied.
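One common reading of a ranked-sum scoring strategy is to rank the population separately on each objective and sum the ranks; this reading, and the toy minimization objectives below, are assumptions rather than the thesis's exact formulation:

```python
import numpy as np

def ranked_sum_scores(objectives):
    """Rank the population on each objective independently (lower value =
    better rank) and sum the per-objective ranks; the lowest total wins."""
    objectives = np.asarray(objectives)   # shape: (pop_size, n_objectives)
    ranks = np.argsort(np.argsort(objectives, axis=0), axis=0)
    return ranks.sum(axis=1)

# e.g. columns: [1 - solar exposure, structural complexity], both minimized
population = np.array([[0.2, 3.0],
                       [0.5, 1.0],
                       [0.1, 2.0]])
print(ranked_sum_scores(population))   # -> [3 2 1]; the third individual wins
```

A rank-based score like this avoids having to weight objectives measured on incompatible scales.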

Relevance: 20.00%

Abstract:

Accelerated life testing (ALT) is widely used to obtain reliability information about a product within a limited time frame. The Cox proportional hazards (PH) model is often utilized for reliability prediction. My master's thesis research focuses on designing accelerated life testing experiments for reliability estimation. We consider multiple step-stress ALT plans with censoring. The optimal stress levels and the times of changing the stress levels are investigated. We discuss the optimal designs under three optimality criteria: D-, A-, and Q-optimality. We note that the classical designs are optimal only if the assumed model is correct. Because predictions from ALT experimental data, obtained at stress levels higher than the normal condition, involve extrapolation, the assumed model cannot be tested. Therefore, to guard against possible imprecision in the assumed PH model, a method for constructing robust designs is also explored.
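For reference, the Cox PH model referred to above specifies the hazard at time t as a baseline hazard scaled multiplicatively by the stress covariates; under a step-stress plan the covariate vector z(t) is piecewise constant, changing value at the stress-change times:

```latex
% Cox proportional hazards model with (possibly time-varying) stress
% covariates z(t) and baseline hazard h_0(t):
\[
  h\bigl(t \mid \mathbf{z}(t)\bigr) \;=\; h_0(t)\,
  \exp\!\bigl(\boldsymbol{\beta}^{\top}\mathbf{z}(t)\bigr)
\]
```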

Relevance: 20.00%

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first uses a standard Wald-type statistic. The second assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets for the endogenous variables are derived by a projection technique. The latter has two advantages: first, the validity of the confidence set is not affected by model nonlinearities; second, simultaneous confidence intervals can easily be built for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show that they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases in transfers from Moroccan expatriates.
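Numerically, the projection technique amounts to minimizing and maximizing the endogenous-variable mapping over the parameter confidence set. The toy numbers and the stand-in function g below are illustrative assumptions; in the paper's application, evaluating g means solving the CGE model:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

theta_hat = np.array([1.0, 0.5])             # illustrative point estimates
V = np.array([[0.04, 0.01],                  # illustrative covariance matrix
              [0.01, 0.02]])
c = chi2.ppf(0.95, df=len(theta_hat))        # 95% Wald-ellipsoid cutoff
Vinv = np.linalg.inv(V)

def g(theta):
    # stand-in for "solve the model and return one endogenous variable"
    return theta[0] * np.exp(theta[1])

# constraint: theta stays inside the confidence ellipsoid
ellipsoid = {"type": "ineq",
             "fun": lambda th: c - (th - theta_hat) @ Vinv @ (th - theta_hat)}

g_min = minimize(g, theta_hat, constraints=[ellipsoid]).fun
g_max = -minimize(lambda th: -g(th), theta_hat, constraints=[ellipsoid]).fun
print(f"projection confidence interval for g(theta): [{g_min:.3f}, {g_max:.3f}]")
```

Because the same parameter confidence set is reused for every endogenous variable, intervals built this way are simultaneously valid, which is the second advantage noted above.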

Relevance: 20.00%

Abstract:

It is well known that standard asymptotic theory is not valid, or is extremely unreliable, in models with identification problems or weak instruments [Dufour (1997, Econometrica), Staiger and Stock (1997, Econometrica), Wang and Zivot (1998, Econometrica), Stock and Wright (2000, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. One possible way out is to use a variant of the Anderson-Rubin (1949, Ann. Math. Stat.) procedure. The latter, however, allows one to build exact tests and confidence sets only for the full vector of the coefficients of the endogenous explanatory variables in a structural equation, and does not in general yield inference on individual coefficients. This problem may in principle be overcome by using projection techniques [Dufour (1997, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. AR-type procedures are emphasized because they are robust to both weak instruments and instrument exclusion. However, projections have so far been implementable only through costly numerical methods. In this paper, we provide a complete analytic solution to the problem of building projection-based confidence sets from Anderson-Rubin-type confidence sets. The solution involves the geometric properties of “quadrics” and can be viewed as an extension of the usual confidence intervals and ellipsoids. Only least squares techniques are required to build the confidence intervals. We also study by simulation how “conservative” projection-based confidence sets are. Finally, we illustrate the proposed methods by applying them to three different examples: the relationship between trade and growth in a cross-section of countries, returns to education, and a study of production functions in the U.S. economy.
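For reference, in the simplest setting (no included exogenous regressors) the Anderson-Rubin statistic for H0: β = β0 in the structural equation y = Yβ + u, with an n × k instrument matrix Z, is

```latex
% Anderson-Rubin statistic, with P_Z = Z(Z'Z)^{-1}Z' and M_Z = I - P_Z:
\[
  AR(\beta_0) \;=\;
  \frac{(y - Y\beta_0)^{\top} P_Z \,(y - Y\beta_0)\,/\,k}
       {(y - Y\beta_0)^{\top} M_Z \,(y - Y\beta_0)\,/\,(n - k)}
\]
```

Under classical assumptions (fixed instruments, Gaussian errors), AR(β0) follows an exact F(k, n − k) distribution under the null whatever the strength of the instruments, which is the source of the robustness noted above.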

Relevance: 20.00%

Abstract:

We discuss statistical inference problems associated with identification and testability in econometrics, emphasizing the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions (a condition not satisfied by standard Wald-type methods based on standard errors), and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.