18 results for Nonparametric discriminant analysis
Abstract:
This research was undertaken to explore dimensions of the risk construct, identify factors related to risk-taking in education, and study risk propensity among employees at a community college. Risk-taking propensity (RTP) was measured by the 12-item BCDQ, which consisted of personal and professional risk-related situations balanced for the money, reputation, and satisfaction dimensions of the risk construct. Scores ranged from 1.00 (most cautious) to 6.00 (most risky). Surveys including the BCDQ and seven demographic questions relating to age, gender, professional status, length of service, academic discipline, highest degree, and campus location were sent to faculty, administrators, and academic department heads. A total of 325 surveys were returned, a 66.7% response rate. Subjects were relatively homogeneous for age, length of service, and highest degree. Subjects were also homogeneous for risk-taking propensity: no substantive differences in RTP scores were noted within or among demographic groups, with the possible exception of academic discipline. The mean RTP score was 3.77 for all subjects, 3.76 for faculty, 3.83 for administrators, and 3.64 for department heads. The relationship between propensity to take personal risks and propensity to take professional risks was tested by computing Pearson r correlation coefficients. The correlations for the total sample, faculty, and administrator groups were statistically significant but of limited practical significance. Subjects were placed into risk categories by dividing the response scale into thirds. A 3 x 3 factorial ANOVA revealed no interaction effects between professional status and risk category with regard to RTP score. A discriminant analysis showed that a seven-factor model was not effective in predicting risk category. The homogeneity of the study sample and the effect of a risk-encouraging environment were discussed in the context of the community college. Since very little data on risk-taking in education are available, risk propensity data from this study could serve as a basis for comparison in future research. Results could be used by institutions to plan professional development activities designed to increase risk-taking and encourage active acceptance of change.
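A minimal sketch of the correlation and discriminant steps described in this abstract, assuming the survey responses sit in a table with one row per respondent; the file name, the subscale columns (personal_rtp, professional_rtp, rtp_total), and the demographic column names are hypothetical and not taken from the study.

    # Sketch under assumed column names; not the study's original analysis code.
    import pandas as pd
    from scipy.stats import pearsonr
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    df = pd.read_csv("bcdq_survey.csv")  # hypothetical survey export

    # Pearson r between personal-risk and professional-risk BCDQ subscales.
    r, p = pearsonr(df["personal_rtp"], df["professional_rtp"])
    print(f"Pearson r = {r:.2f}, p = {p:.4f}")

    # Risk categories formed by dividing the 1.00-6.00 response scale into thirds.
    df["risk_category"] = pd.cut(
        df["rtp_total"], bins=[1.0, 2.67, 4.33, 6.0],
        labels=["cautious", "moderate", "risky"], include_lowest=True)

    # Discriminant analysis: can the seven demographic factors predict risk category?
    demographics = ["age", "gender", "status", "service_years",
                    "discipline", "degree", "campus"]  # assumed numeric codings
    lda = LinearDiscriminantAnalysis().fit(df[demographics], df["risk_category"])
    print("classification accuracy:", lda.score(df[demographics], df["risk_category"]))

The 3 x 3 factorial ANOVA mentioned in the abstract could be run on the same table with statsmodels (ols plus anova_lm), but the sketch keeps to the two analyses spelled out above.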
Abstract:
The purpose of this research study was to demonstrate the practical linguistic study and evaluation of dissertations by using two examples of the latest technology, the microcomputer and the optical scanner. This involved developing efficient methods for data entry and creating computer algorithms appropriate for personal linguistic studies. The goal was to develop a prototype investigation that demonstrated practical solutions for maximizing the linguistic potential of the dissertation data base. Text was entered with a Dest PC Scan 1000 Optical Scanner, which was used to copy the complete stack of educational dissertations from the Florida Atlantic University Library into an IBM XT microcomputer. The optical scanner demonstrated its practical value by copying 15,900 pages of dissertation text directly into the microcomputer. A total of 199 dissertations, or 72% of the entire stack of 277 education dissertations, were successfully copied into the microcomputer's word processor, where each dissertation was analyzed for a variety of syntax frequencies. The results of the study demonstrated the practical use of the optical scanner for data entry, the microcomputer for data and statistical analysis, and the availability of the college library as a natural setting for text studies. A supplemental benefit was the establishment of a computerized dissertation corpus that could be used for future research and study. The final step was to build a linguistic model of the differences in dissertation writing styles by creating 7 factors from 55 dependent variables through principal components factor analysis. The 7 factors (textual components) were then named and described on a hypothetical construct defined as a continuum from a conversational, interactional style to a formal, academic writing style. The 7 factors were then entered into a discriminant analysis to create discriminant functions for each of the 7 independent variables. The results indicated that a conversational, interactional writing style was associated with more recent dissertations (1972-1987), greater author age, females, and the department of Curriculum and Instruction. A formal, academic writing style was associated with older dissertations (1972-1987), younger authors, males, and the department of Administration and Supervision. It was concluded that there were no significant differences in writing style due to subject matter (community college studies) compared to other subject matter. It was also concluded that there were no significant differences in writing style due to the location of dissertation origin (Florida Atlantic University, University of Central Florida, Florida International University).
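A compact sketch of the factor-and-discriminant pipeline outlined in this abstract, assuming the 55 syntax-frequency counts for the 199 dissertations are available as a table; the file and column names are hypothetical, and scikit-learn's PCA and LDA stand in for whatever software the original study used.

    # Sketch only: hypothetical file/column names; PCA + LDA stand in for the
    # original principal components factor analysis and discriminant analysis.
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    corpus = pd.read_csv("dissertation_syntax_counts.csv")  # 199 rows: 55 frequency columns + metadata
    X = StandardScaler().fit_transform(corpus.drop(columns=["department"]))

    # Reduce the 55 dependent variables to 7 factors (textual components).
    pca = PCA(n_components=7)
    factors = pca.fit_transform(X)
    print("variance retained by the 7 factors:", pca.explained_variance_ratio_.sum().round(3))

    # Discriminant functions on the factor scores, e.g. by department of origin.
    lda = LinearDiscriminantAnalysis().fit(factors, corpus["department"])
    print("classification accuracy:", lda.score(factors, corpus["department"]))

The retained-variance figure indicates how much of the 55-variable variation the 7 textual components preserve before they are passed to the discriminant step.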
Abstract:
Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution. There are two complementary approaches to the study of yield curve evolution here. The first is principal components analysis; the second is wavelet analysis. In both approaches both the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized; the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution. In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market. Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis of this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary. Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is a more robust set of statistics that balances the more fragile principal components analysis.
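A small numerical sketch of the two approaches described in this abstract, using simulated shift vectors and a plain Haar decomposition in place of the dissertation's second-generation wavelet construction; everything here is illustrative and not the author's code.

    # Sketch only: simulated data; a plain Haar split stands in for the
    # second-generation wavelets described in the dissertation.
    import numpy as np

    rng = np.random.default_rng(0)
    # Simulated daily shifts of a yield curve sampled at 16 maturities,
    # with volatility tapering off at the long end.
    shifts = rng.normal(size=(500, 16)) * np.linspace(1.0, 0.2, 16)

    # Principal components: diagonalize the covariance matrix of the shifts.
    eigvals = np.linalg.eigvalsh(np.cov(shifts, rowvar=False))[::-1]  # largest first
    explained = eigvals.cumsum() / eigvals.sum()
    print("variance captured by the top 6 components:", round(float(explained[5]), 3))

    # Wavelet view: resolve one shift into an overall average plus a hierarchy
    # of localized detail movements (most localized level first).
    def haar_step(x):
        pairs = x.reshape(-1, 2)
        return (pairs[:, 0] + pairs[:, 1]) / 2, (pairs[:, 0] - pairs[:, 1]) / 2

    coarse, details = shifts[0], []
    while coarse.size > 1:
        coarse, detail = haar_step(coarse)
        details.append(detail)
    print("average change of the shift:", round(float(coarse[0]), 4))
    print("detail coefficients per level:", [d.size for d in details])

In this toy setup the repeated averaging preserves the mean of the original shift, so the final coarse value equals the average change across maturities, mirroring one of the invariants (average change) mentioned above; the duration-change invariant of the dissertation's construction is not reproduced here.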