954 results for Convergence Analysis
Abstract:
Data fluctuation across multiple measurements in Laser-Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on a Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, for example by averaging the spectral signals or standardizing the spectra over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative analysis regression model. The RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation based on the statistical distribution of the measured spectral data. Through the improved segmented weighting function, spectral data falling within the normal distribution are retained in the regression model while outliers are down-weighted or removed. Copper concentration analysis experiments were carried out on 16 certified standard brass samples. The average relative standard deviation obtained with the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and modeling robustness than quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM), and WLS-SVM. The improved weighting function was also shown to offer better overall performance in model robustness and convergence speed than four established weighting functions.
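The segmented weighting idea can be illustrated with a short sketch. This is a generic Hampel-style piecewise scheme of the kind used in WLS-SVM robust re-weighting, not the paper's exact function; the thresholds c1 and c2, the MAD-based scale estimate, and the floor eps are illustrative assumptions.

```python
import numpy as np

def robust_weights(residuals, c1=2.5, c2=3.0, eps=1e-4):
    """Piecewise (segmented) weights for robust re-weighting: residuals
    consistent with a normal distribution keep full weight, a transition
    band is linearly down-weighted, and outliers are effectively removed."""
    residuals = np.asarray(residuals, dtype=float)
    # Robust scale estimate: 1.483 * MAD is consistent with the standard
    # deviation when the residuals are normally distributed.
    s = 1.483 * np.median(np.abs(residuals - np.median(residuals)))
    z = np.abs(residuals) / max(s, 1e-12)  # standardized residuals
    w = np.ones_like(z)                    # inliers: full weight
    band = (z > c1) & (z <= c2)            # transition band: linear decay
    w[band] = (c2 - z[band]) / (c2 - c1)
    w[z > c2] = eps                        # outliers: near-zero weight
    return w
```

In a WLS-SVM these weights would multiply the squared-error terms of the LS-SVM objective, and the model would be re-solved iteratively until the weights stabilize.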
Abstract:
2000 Mathematics Subject Classification: 35Q02, 35Q05, 35Q10, 35B40.
Abstract:
The purpose of this dissertation was to investigate cross-cultural differences in the use of the Internet. Hofstede's model of national culture was employed as the theoretical foundation for the analysis of cross-cultural differences, and Davis's technology acceptance model as the theoretical foundation for the analysis of Internet use. Secondary data from an online survey of Internet users in 22 countries, conducted in April 1997 by the Georgia Tech Research Corporation, measured the dependent variables of Internet use and the independent variables of attitudes toward technology. Hofstede's stream of research measured the independent variables of the five dimensions of national culture. Contrary to expectations, regression analyses at the country level of analysis did not detect cultural differences. As expected, regression analyses at the individual level of analysis did detect cultural differences. The results indicated that perceived usefulness was related to the frequency of Internet shopping in the Germanic and Anglo clusters, where masculinity was high. Perceived ease of use was related to the frequency of Internet shopping in the Latin cluster, where uncertainty avoidance was high. Neither perceived usefulness nor perceived ease of use was related to the frequency of Internet shopping in the Nordic cluster, where masculinity and uncertainty avoidance were low. As expected, analysis of variance at the cluster level of analysis indicated that censorship was a greater concern in Germany and the Anglo countries, where masculinity was high. Government regulation of the Internet was less preferred in Germany, where power distance was low. Contrary to expectations, concern for transaction security was lower in the Latin cluster, where uncertainty avoidance was high. Concern for privacy issues was lower in the U.S., where individualism was high. In conclusion, the results suggested that Internet users represent a multicultural community, not a standardized virtual community. Based on the findings, specific guidance was provided on how international managers and marketers could develop culturally sensitive strategies for training and promoting Internet services.
Abstract:
Engineering analysis of geometric models has been the main, if not the only, credible tool used by engineers and scientists to resolve physical boundary problems. New high-speed computers have facilitated the accuracy and validation of the expected results. In practice, an engineering analysis is composed of two parts: the design of the model and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to resolve a large number of physical boundary problems independent of the model geometry. The time expended in the computational process is related to the imposed boundary conditions and to how well conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and major commercial solver packages are incapable of handling such inputs. Other packages apply methods such as patching or zippering to resolve these problems, but the final resolved geometry may differ from the original, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that is able to analyze an imperfect geometric model with the imposed boundary conditions using a meshfree method and a distance field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those for the same models with perfect geometries. Plots of the results will be presented to support further analysis and conclusions about the algorithm's convergence.
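A minimal sketch of the distance-field idea follows, assuming the imperfect boundary is available only as a point sample (so gaps and open lines pose no difficulty); the brute-force nearest-point query and the Kantorovich-style solution structure are illustrative choices, not the dissertation's specific algorithm.

```python
import numpy as np

def approx_distance_field(query_pts, boundary_pts):
    """Unsigned distance from each query point to a boundary given only
    as a point sample -- no watertight, repaired geometry is required.
    Brute-force nearest-neighbor search; a k-d tree would replace this
    for large models."""
    diffs = query_pts[:, None, :] - boundary_pts[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1)).min(axis=1)

def dirichlet_solution_structure(g, d, w):
    """Solution structure u = g + d * w: u equals the prescribed boundary
    value g wherever the distance d vanishes, while the meshfree degrees
    of freedom w remain free in the interior."""
    return g + d * w
```

Because the distance field stays well defined even when the boundary has gaps, boundary conditions can be imposed without first patching or zippering the geometry.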
Abstract:
This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident of 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC-sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques because it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
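For context, the conventional conjugate Poisson-Gamma update that PEWMA is compared against fits in a few lines; the vague Gamma(1, 1) prior in the example is an illustrative assumption.

```python
def poisson_gamma_update(alpha, beta, failures, exposure_time):
    """Conjugate Bayesian update of a failure rate: with a Gamma(alpha, beta)
    prior on the rate and `failures` events observed over `exposure_time`
    under a Poisson model, the posterior is Gamma(alpha + failures,
    beta + exposure_time). Unlike PEWMA, every historical observation is
    weighted equally, which is why this update degrades when data are
    collected over long spans during which the true rate drifts."""
    return alpha + failures, beta + exposure_time

# Example: vague Gamma(1, 1) prior, 3 failures over 2 years of operation.
a, b = poisson_gamma_update(1.0, 1.0, failures=3, exposure_time=2.0)
posterior_mean_rate = a / b  # posterior mean failure rate, per year
```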
Abstract:
College radio is subject to constant transformation. The rise of information and communications technologies has allowed its development and modernization, as well as the proliferation of new professional profiles. This research aims to analyze the connection between Spanish university radio stations and the relevant degree programs. To do this, we use a comparative methodology based on an examination of the descriptors and core subjects of degrees such as Audiovisual Communication and Journalism, while also considering other degrees that are acquiring more responsibilities in a radio station, such as the degree in Information and Documentation. In conclusion, structural weaknesses in the university environment are observed, but future possibilities are detected as well: a college radio station is a place of convergence between the training of students from different degrees and reality itself, a versatile medium whose distinct value lies in being college students' first professional contact with the labor market.
Abstract:
In recent papers, Wied and his coauthors have introduced change-point procedures to detect and estimate structural breaks in the correlation between time series. To prove the asymptotic distribution of the test statistic and stopping time, as well as the change-point estimation rate, they use an extended functional Delta method and assume nearly constant expectations and variances of the time series. In this thesis, we allow asymptotically infinitely many structural breaks in the means and variances of the time series. For this setting, we present test statistics and stopping times which are used to determine whether the correlation between two time series is constant and stays constant, respectively. Additionally, we consider estimates for change-points in the correlations. The employed nonparametric statistics depend on the means and variances. These (nuisance) parameters are replaced by estimates in the course of this thesis. We avoid assuming a fixed form for these estimates; rather, we use "black-box" estimates, i.e. we derive results under assumptions that these estimates must fulfill. These results are supplemented with examples. The thesis is organized in seven sections. In Section 1, we motivate the problem and present the mathematical model. In Section 2, we consider a posteriori and sequential testing procedures, and investigate convergence rates for change-point estimation, always assuming that the means and variances of the time series are known. In the following sections, the assumptions of known means and variances are relaxed. In Section 3, we present the assumptions on the mean and variance estimates that we use for the mean in Section 4, for the variance in Section 5, and for both parameters in Section 6. Finally, in Section 7, a simulation study illustrates the finite-sample behavior of some of the testing procedures and estimates.
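The flavor of such procedures can be conveyed by a simplified fluctuation statistic that compares running correlation estimates with the full-sample estimate; the minimum window length is an illustrative choice, and the long-run variance scaling used in the actual tests is omitted for brevity.

```python
import numpy as np

def correlation_cusum(x, y, min_window=10):
    """Simplified a posteriori change-point statistic for the correlation:
    weight the deviation of the running correlation over the first k
    observations from the full-sample correlation by k / sqrt(n). A large
    maximum suggests a break; the maximizing k estimates its location.
    (The proper test additionally divides by a long-run variance
    estimator, omitted here.)"""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    rho_full = np.corrcoef(x, y)[0, 1]
    stats = np.zeros(n)
    for k in range(min_window, n + 1):  # skip degenerate short windows
        rho_k = np.corrcoef(x[:k], y[:k])[0, 1]
        stats[k - 1] = (k / np.sqrt(n)) * abs(rho_k - rho_full)
    k_hat = int(np.argmax(stats)) + 1   # change-point location estimate
    return stats.max(), k_hat
```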
Abstract:
The dissertation is devoted to the study of problems in the calculus of variations, free boundary problems, and gradient flows with respect to the Wasserstein metric. More concretely, we consider the problem of characterizing the regularity of minimizers of a certain interaction energy. Minimizers of the interaction energy have a somewhat surprising relationship with solutions to obstacle problems. Here we prove and exploit this relationship to obtain novel regularity results. Another problem we tackle is describing the asymptotic behavior of the Cahn-Hilliard equation with degenerate mobility. By framing the Cahn-Hilliard equation with degenerate mobility as a gradient flow in the Wasserstein metric, in one space dimension, we prove its convergence to a degenerate parabolic equation under the framework recently developed by Sandier-Serfaty.
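For reference, the equation in question takes the standard form below (in one space dimension, as in the convergence result); the double-well potential W and the mobility m(u) = 1 - u^2, which degenerates at u = ±1, are the common choices and are assumed here for illustration.

```latex
\partial_t u \;=\; \partial_x\!\big( m(u)\,\partial_x \mu \big),
\qquad \mu \;=\; -\,\partial_{xx} u + W'(u),
\qquad m(u) \;=\; 1 - u^2 .
```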
Abstract:
In this work, we further extend the recently developed adaptive data analysis method, the Sparse Time-Frequency Representation (STFR) method. This method is based on the assumption that many physical signals inherently contain AM-FM representations. We propose a sparse optimization method to extract the AM-FM representations of such signals. We prove the convergence of the method for periodic signals under certain assumptions and provide practical algorithms specifically for the non-periodic STFR, which extends the method to problems that former STFR methods could not handle, including stability to noise and non-periodic data analysis. This is a significant improvement, since many adaptive and non-adaptive signal processing methods are not fully capable of handling non-periodic signals. Moreover, we propose a new STFR algorithm to study intrawave signals with strong frequency modulation and analyze the convergence of this new algorithm for periodic signals. Such signals have previously remained a bottleneck for all signal processing methods. Furthermore, we propose a modified version of STFR that facilitates the extraction of intrawaves with overlapping frequency content. We show that the STFR methods can be applied to the realm of dynamical systems and cardiovascular signals. In particular, we present a simplified and modified version of the STFR algorithm that is potentially useful for the diagnosis of some cardiovascular diseases. We further explain some preliminary work on the nature of Intrinsic Mode Functions (IMFs) and how they can have different representations in different phase coordinates. This analysis shows that the uncertainty principle is fundamental to all oscillating signals.
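The AM-FM assumption underlying STFR can be written out explicitly; the notation below follows the common formulation of sparse time-frequency decompositions and is assumed here rather than quoted from this work.

```latex
f(t) \;=\; \sum_{k=1}^{M} a_k(t)\,\cos\theta_k(t),
\qquad a_k(t) > 0, \quad \theta_k'(t) > 0,
```

where the decomposition is sought with the smallest number of components M, and each envelope a_k and instantaneous frequency \theta_k' is required to vary slowly relative to the oscillation \cos\theta_k.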
Abstract:
The aim of this paper was to obtain evidence of the validity of the LSB-50 (de Rivera & Abuín, 2012), a screening measure of psychopathology, in Argentinean adolescents. The sample consisted of 1002 individuals (49.7% male; 50.3% female) between 12 and 18 years old (M = 14.98; SD = 1.99). A cross-validation study and factorial invariance studies were performed in samples divided by sex and age to test whether a seven-factor structure corresponding to seven clinical scales (Hypersensitivity, Obsessive-Compulsive, Anxiety, Hostility, Somatization, Depression, and Sleep Disturbance) was adequate for the LSB-50. The seven-factor structure proved to be suitable for all the subsamples. Next, the fit of the seven-factor structure was studied simultaneously in the aforementioned subsamples through hierarchical models that imposed different equivalence constraints. Results indicated the invariance of the seven clinical dimensions of the LSB-50. Ordinal alphas showed good internal consistency for all the scales. Finally, the correlations with a diagnostic measure of psychopathology (PAI-A) indicated moderate convergence. It is concluded that the analyses performed provide robust evidence of construct validity for the LSB-50.
Abstract:
My dissertation emphasizes the use of narrative structuralism and narrative theories about storytelling in order to build a discourse between the fields of New Media and Rhetoric and Composition. Propp's morphological analysis and the breaking down of stories into component pieces aid in the discussion of storytelling as it appears in, and is mediated by, digital and computer technologies. New Media and Rhetoric and Composition are linked by shared concerns for textual production and consumption. Using the notion of "kairotic reading" (KR), I show the interconnectedness and interdisciplinarity required to develop pedagogy that teaches students to become reflective practitioners who are aware of their rhetorical surroundings and can make sound judgments concerning their own message generation and consumption in the workplace. KR is a transferable skill that is beneficial to students and teachers alike. The dissertation research utilizes theories of New Media and New Media-influenced practitioners, including Jenkins's theory of convergence, Bourdieu's notion of taste, Gee's term "semiotic domains," and Manovich's "modification." These theoretical pieces are combined to show how KR can be extended by convergent narrative practices. To build connections with New Media, the consideration and inclusion of Kress and van Leeuwen's multimodality, Selber's "reflective practitioners," and Selfe's definition of multimodal composing allow for a richer conversation with scholars in New Media around the implications of metacognitive development and practitioner reflexivity. My research also includes analysis of two popular media franchises, Deborah Harkness's A Discovery of Witches and Fox's Bones television series, to show similarities and differences among convergence-linked and multimodal narratives. Lastly, I provide example assignments that can be taken, further developed, and utilized in classrooms engaging in multimodal composing practices. This dissertation pushes consideration of New Media into the work already being performed by those in Rhetoric and Composition.
Abstract:
Raman spectroscopy of formamide-intercalated kaolinites treated using controlled-rate thermal analysis technology (CRTA) is reported; the CRTA treatment allows the separation of adsorbed formamide from intercalated formamide in formamide-intercalated kaolinites. The Raman spectra of the CRTA-treated formamide-intercalated kaolinites are significantly different from those of the untreated intercalated kaolinites, which display a combination of both intercalated and adsorbed formamide. An intense band is observed at 3629 cm⁻¹, attributed to the inner surface hydroxyls hydrogen-bonded to the formamide. Broad bands are observed at 3600 and 3639 cm⁻¹, assigned to the inner surface hydroxyls hydrogen-bonded to the adsorbed water molecules. The hydroxyl-stretching band of the inner hydroxyl is observed at 3621 cm⁻¹ in the Raman spectra of the CRTA-treated formamide-intercalated kaolinites. The results of thermal analysis show that the amount of formamide intercalated between the kaolinite layers is independent of the presence of water. Significant differences are observed in the C=O stretching region between the adsorbed and intercalated formamide.