48 results for Digital analysis


Relevance: 30.00%

Abstract:

Land use and transportation interaction has been a research topic for several decades, and there have been efforts to identify the impacts of transportation on land use from several different perspectives. One focus has been the role of transportation improvements in encouraging new land development or the relocation of activities due to improved accessibility; the impacts studied have included property values and increased development. Another focus has been the changes in travel behavior due to better mobility and accessibility. Most studies to date have been conducted at the metropolitan level and are thus unable to account for interactions spatially and temporally at smaller geographic scales.

In this study, a framework for studying the temporal interactions between transportation and land use was proposed and applied to three selected corridor areas in Miami-Dade County, Florida. The framework consists of two parts: developing temporal data, and applying time series analysis to that data to identify dynamic interactions. Temporal GIS databases were constructed and used to compile building permit data and transportation improvement projects. Two types of time series approaches were utilized: univariate models and multivariate models. Time series analysis is designed to describe the dynamic behavior of a series by developing models and to forecast the future of the system based on historical trends. Model estimation results from the selected corridors were then compared.

It was found that the time series models predicted residential development better than commercial development. Results from the three study corridors also varied in the magnitude of impacts, length of lags, significance of the variables, and model structure. The long-run effect, or cumulative impact, of transportation improvements on land development was also measured with time series techniques. The study offered evidence that congestion negatively impacted development and that transportation investments encouraged land development.
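The multivariate time-series approach described above can be sketched with a small vector autoregression (VAR). The sketch below uses statsmodels on synthetic monthly series; the variable names, lag choice, and data are illustrative assumptions, not the dissertation's actual corridor data.

    # Sketch of a multivariate time-series (VAR) analysis of the kind described
    # above; the permit and investment series here are synthetic placeholders.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    n = 120  # e.g., 120 months of corridor-level observations
    investment = rng.poisson(3, n).astype(float)  # transportation projects per month
    permits = 10 + 0.8 * np.roll(investment, 6) + rng.normal(0, 1, n)  # lagged response

    data = pd.DataFrame({"investment": investment, "permits": permits})
    fit = VAR(data).fit(maxlags=12, ic="aic")  # choose lag length by AIC
    print(fit.summary())

    # Long-run (cumulated) effect of an investment shock on permits
    irf = fit.irf(24)
    print(irf.cum_effects[-1])  # cumulative impulse responses after 24 steps

The cumulative impulse responses play the role of the long-run effect the abstract measures.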

Relevance: 30.00%

Abstract:

The purpose of the study was to examine the relationship between teacher beliefs and actual classroom practice in early literacy instruction. Conjoint analysis was used to measure teachers' beliefs about four early literacy factors: phonological awareness, print awareness, graphophonic awareness, and structural awareness. A collective case study format was then used to measure the correspondence of teachers' beliefs with their actual classroom practice.

Ninety Project READS participants were given twelve cards, in an orthogonal experimental design, describing students who either met or did not meet criteria on the four early literacy factors. Conjoint measurements of whether each student was an efficient reader were taken. These measurements provided relative importance scores for each respondent. Based on the relative importance scores, four teachers were chosen to participate in a collective case study.

The conjoint results enabled the clustering of teachers into four distinct groups, each aligned with one of the four early literacy factors. K-means cluster analysis of the relative importance measurements showed commonalities among the ninety respondents' beliefs. The collective case study results were mixed. Implications for researchers and practitioners include the use of conjoint analysis in measuring teacher beliefs about the four early literacy factors. Further, understanding teacher preferences regarding these beliefs may assist in curriculum design and thereby increase educational effectiveness. Finally, comparing teachers' beliefs about the four early literacy factors with their actual instructional practices may facilitate teacher self-reflection, thus encouraging positive teacher change.
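As a rough illustration of the clustering step, the sketch below runs k-means on synthetic relative importance scores for the four factors; the scores are invented, with k = 4 chosen to match the four groups reported above.

    # Sketch of the clustering step: k-means on respondents' relative importance
    # scores for the four early literacy factors (synthetic scores shown here).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # 90 respondents x 4 factors: phonological, print, graphophonic, structural;
    # rows sum to 100 (relative importance, in percent)
    scores = rng.dirichlet(alpha=[2, 2, 2, 2], size=90) * 100

    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scores)
    for k in range(4):
        members = scores[kmeans.labels_ == k]
        print(f"cluster {k}: n={len(members)}, mean importance={members.mean(axis=0).round(1)}")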

Relevance: 30.00%

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function with time held fixed is called the yield curve; the aggregate of these cross-sections is the evolution of the yield curve. This dissertation studies aspects of that evolution.

Two complementary approaches to the study of yield curve evolution are used here: principal components analysis and wavelet analysis. In both approaches, the time and maturity variables are discretized. In principal components analysis, the vectors of yield curve shifts are viewed as observations from a multivariate normal distribution. The resulting covariance matrix is diagonalized, and the eigenvalues and eigenvectors (the principal components) are used to draw inferences about yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies reflect the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, and the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution analysis is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is more robust statistics that balance the more fragile principal components analysis.
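The principal components step can be illustrated as follows: diagonalize the covariance matrix of discretized yield-curve shifts and measure the variance captured by the leading components. The shifts below are synthetic level-plus-slope movements, not actual Treasury data.

    # Sketch of the principal components step: diagonalize the covariance of
    # daily yield-curve shifts and report variance explained (synthetic shifts).
    import numpy as np

    rng = np.random.default_rng(2)
    maturities, days = 10, 1000  # discretized maturities, observation days
    t = np.linspace(0, 1, maturities)
    # correlated shifts: level + slope + noise, loosely mimicking yield changes
    shifts = (rng.normal(0, 1, (days, 1)) * np.ones(maturities)
              + rng.normal(0, 0.5, (days, 1)) * (t - 0.5)
              + rng.normal(0, 0.1, (days, maturities)))

    cov = np.cov(shifts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # sort descending
    explained = eigvals / eigvals.sum()
    print(f"variance explained by top 6 components: {explained[:6].sum():.4f}")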

Relevance: 30.00%

Abstract:

Small Arms and Light Weapons (SALW) proliferation was taken up by non-governmental organizations (NGOs) as the next important issue in international relations after the success of the International Campaign to Ban Landmines (ICBL). This dissertation focuses on the reasons why the issue of SALW resulted in an Action Program rather than an international convention, an outcome considered unsuccessful by the advocates of regulating the illicit trade in SALW. The study uses a social movement theoretical approach, drawing on framing, political opportunity, and network analysis to explain why these advocates did not succeed in their goals. The UN is taken as the arena in which NGOs, states, and international governmental organizations (IGOs) discussed the illicit trade in SALW.

The findings indicate that the political opportunity for the issue of SALW was not ideal, and the network of NGOs, states, and IGOs was not strong. The NGOs advocating regulation of SALW were divided over how to approach the issue and belonged to different coalitions with differing objectives. Despite initial widespread interest among states, only a few remained fully committed to the issue until the end. The regional IGOs approached the issue based on their regional priorities and were less interested in an international covenant. The advocates of regulating the illicit trade in SALW attempted to frame SALW as a humanitarian issue rather than a security issue, but they were unable to use frame alignment to convince states to treat it as such. In conclusion, all three elements (framing, political opportunity, and the network) played a role in the advocates' lack of success.

Relevance: 30.00%

Abstract:

This dissertation addressed two broad problems in international macroeconomics and conflict analysis. The first chapter examined the behavior of the exchange rate and its interaction with industry-level tradable goods prices for three countries: the USA, the UK, and Japan. This question has important monetary policy implications. I computed the extent to which changes in the exchange rate affected the prices of consumer, producer, and export goods, and I studied the timing of these price changes. My results, based on thirty-four industrial prices for the USA, UK, and Japan, supported the view that changes in exchange rates significantly affect the prices of industrial and consumer goods. They also provided insight into the underlying economic process that leads to changes in relative prices.

In the second chapter, I explored the predictability of future inflation by incorporating shocks to exchange rates, clearly specifying the transmission mechanisms that link exchange rates to industry-level consumer and producer prices. Employing a variety of linear and state-of-the-art nonlinear models, I also predicted the growth rates of future prices. Comparing the levels of inflation obtained from these approaches showed the superiority of the structural model incorporating the exchange rate pass-through effect.

The third chapter investigated the economic motives for conflict, manifested by rebellion and civil war, in seventeen Latin American countries. Based on the analytical framework of Garfinkel, Skaperdas and Syropoulos (2004), I employed ordinal regressions and Markov switching for a panel of seventeen countries to identify the trade and openness factors responsible for conflict occurrence and intensity. The results suggested that increased trade openness reduced high-intensity domestic conflicts, but that overdependence on agricultural exports, along with a lack of income-earning opportunities, led to more conflicts. Finally, using the Cox proportional hazards model, I studied conflict duration and found that over-reliance on agricultural exports explained a major part of the length of conflicts, in addition to various socio-political factors.
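A minimal sketch of the duration analysis in the third chapter, assuming the lifelines package and invented covariates: a Cox proportional hazards fit relating covariates such as agricultural export dependence to conflict length.

    # Sketch of a Cox proportional hazards fit for conflict duration;
    # covariate names and data are hypothetical stand-ins.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(3)
    n = 200
    ag_export_share = rng.uniform(0, 1, n)  # reliance on agricultural exports
    openness = rng.uniform(0, 1, n)         # trade openness
    # longer conflicts when agricultural dependence is high (synthetic)
    duration = rng.exponential(scale=2 + 5 * ag_export_share)
    observed = rng.uniform(0, 1, n) < 0.9   # some durations are censored

    df = pd.DataFrame({"duration": duration, "observed": observed,
                       "ag_export_share": ag_export_share, "openness": openness})
    cph = CoxPHFitter()
    cph.fit(df, duration_col="duration", event_col="observed")
    cph.print_summary()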

Relevance: 30.00%

Abstract:

The rate of fatal crashes in Florida has remained significantly higher than the national average for the last several years. The 2003 statistics from the National Highway Traffic Safety Administration (NHTSA), the latest available, show a fatality rate in Florida of 1.71 per 100 million vehicle-miles traveled, compared to the national average of 1.48. The objective of this research is to better understand the driver, environmental, and roadway factors that affect injury severity in Florida.

In this research, the ordered logit model was used to develop six injury severity models: single-vehicle and two-vehicle crashes on urban freeways, single-vehicle and two-vehicle crashes on urban principal arterials, and two-vehicle crashes at urban signalized and at unsignalized intersections. The data included all crashes that occurred on the state highway system from 2001 to 2003 in the Southeast Florida region, which comprises Miami-Dade, Broward, and Palm Beach Counties.

The results indicate that the age group and gender of the driver at fault were significant factors in injury severity risk across all models. The greatest risk of severe injury was observed for the age groups 55 to 65 and 66 and older. A positive association between injury severity and the race of the driver at fault was also found: an at-fault driver of Hispanic origin was associated with a higher risk of severe injury in both freeway models and in the two-vehicle crash model on arterial roads, and a higher risk of severe injury was found for two-vehicle crashes on freeways when the at-fault driver was African-American. In addition, arterial class was positively associated with a higher risk of severe crashes; six-lane divided arterials exhibited the highest injury severity risk of all arterial classes, while the lowest risk was found for one-way roads. Alcohol involvement by the driver at fault was also a significant risk factor for severe injury in the single-vehicle freeway model.
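A hedged sketch of one of the six models: an ordered logit fit with statsmodels' OrderedModel on synthetic crash records. The predictors, coefficients, and three-level severity coding are illustrative assumptions, not the study's actual variables.

    # Sketch of an ordered logit injury-severity model on synthetic data.
    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(4)
    n = 1000
    age_65_plus = rng.integers(0, 2, n)   # hypothetical indicator variables
    alcohol = rng.integers(0, 2, n)
    latent = 0.8 * age_65_plus + 1.2 * alcohol + rng.logistic(size=n)
    # severity: 0 = no injury, 1 = injury, 2 = fatal (a coarse 3-level scale)
    severity = np.digitize(latent, bins=[1.0, 2.5])

    X = pd.DataFrame({"age_65_plus": age_65_plus, "alcohol": alcohol})
    model = OrderedModel(pd.Series(severity), X, distr="logit")
    res = model.fit(method="bfgs", disp=False)
    print(res.summary())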

Relevance: 30.00%

Abstract:

This dissertation develops a new figure of merit to measure the similarity (or dissimilarity) of Gaussian distributions through a novel concept that relates the Fisher distance to the percentage of data overlap. The derivations are extended to provide a generalized mathematical platform for determining an optimal separating boundary between Gaussian distributions in multiple dimensions. Real-world data used for implementation and feasibility studies were provided by Beckman-Coulter. Although the data are flow cytometric in nature, the mathematics is general enough in its derivation to cover other types of data, as long as their statistical behavior approximates a Gaussian distribution.

Because this new figure of merit rests heavily on the statistical nature of the data, a new filtering technique is introduced to accommodate the accumulation process involved with histogram data. When data are accumulated into a frequency histogram, they are inherently smoothed in a linear fashion, since an averaging effect takes place as the histogram is generated. The new filtering scheme addresses data accumulated in the uneven resolution of the channels of the frequency histogram.

Qualitative interpretation of flow cytometric data is currently a time-consuming and imprecise method for evaluating histogram data. The method proposed here offers a broader spectrum of capabilities in histogram analysis, since the figure of merit derived in this dissertation integrates within its mathematics both a measure of similarity and the percentage of overlap between the distributions under analysis.
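The core idea, that similarity can be expressed as a percentage of data overlap, can be sketched in one dimension by integrating the pointwise minimum of two Gaussian densities. The parameters below are made up, and the dissertation's multidimensional derivation is not reproduced.

    # Sketch of the overlap idea: the percentage of data overlap between two
    # 1-D Gaussians, computed from the density functions.
    import numpy as np
    from scipy.stats import norm
    from scipy.integrate import quad

    mu1, s1 = 0.0, 1.0
    mu2, s2 = 2.0, 1.5

    # Overlap coefficient: integral of the pointwise minimum of the two densities
    overlap, _ = quad(lambda x: np.minimum(norm.pdf(x, mu1, s1),
                                           norm.pdf(x, mu2, s2)),
                      -np.inf, np.inf)
    print(f"percentage of data overlap: {100 * overlap:.1f}%")

    # For equal variances the optimal separating boundary is the midpoint;
    # more generally it lies where the two densities intersect.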

Relevance: 30.00%

Abstract:

Historical accuracy is only one component of a scholarly college textbook used to teach the history of jazz music. Textbooks in this field should include accurate ethnic representation of the most important musical figures, as jazz is considered the only original American art form. As colleges and universities celebrate diversity, it is important that jazz history be accurate and complete.

The purpose of this study was to examine the content of the jazz history textbooks most commonly used at American colleges and universities. This qualitative study utilized grounded and textual analysis to explore ethnic representation in these texts. The methods were modeled after the work of Kane and Selden, each of whom conducted a content analysis focused on a limited field of study. This study focused on key jazz artists and composers whose work was created in the periods of early jazz (1915-1930), swing (1930-1945), and modern jazz (1945-1960).

The study considered jazz notables within the texts in terms of ethnic representation, the authors' use of language, contributions to the jazz canon, and place in the standard jazz repertoire. The relevant historical sections of the selected texts were reviewed and coded using predetermined rubrics. Data were aggregated into categories and then analyzed according to the character assigned to the key jazz personalities noted in the text, as well as the comparative standing afforded each personality.

The results demonstrate that particular key African-American jazz artists and composers occupy a significant place in these texts, while other significant individuals representing other ethnic groups are consistently overlooked. This finding suggests that while America and the world celebrate the product of American jazz as musically great and socially significant, many ethnic contributors go unmentioned, leaving a less than complete picture of the evolution of this American art form.

Relevance: 30.00%

Abstract:

This dissertation is a study of customer relationship management theory and practice. Customer Relationship Management (CRM) is a business strategy whereby companies build strong relationships with existing and prospective customers with the goal of increasing organizational profitability. It is also a learning process involving the management of change in processes, people, and technology. CRM implementation and its ramifications are not completely understood, as evidenced by the high number of failed CRM implementations in organizations and the resulting disappointments.

The goal of this dissertation is to study emerging issues and trends in CRM, including the effect of computer software and the accompanying new management processes on organizations, and the dynamics of aligning marketing, sales, services, and all other functions responsible for delivering a satisfying customer experience.

To understand CRM better, a content analysis of more than one hundred articles and documents from academic and industry sources was undertaken, using a new methodological twist on the traditional method. An Internet domain (http://crm.fiu.edu) was created for this research by uploading an initial one hundred plus abstracts of articles and documents to form a knowledge database. Once the database was formed, a search engine was developed to enable searching the abstracts with relevant CRM keywords and thereby reveal the dominant emergent CRM topics. The ultimate aim of the website is to serve as an information hub for CRM research, and as a search engine where interested parties can enter CRM-relevant keywords or phrases to access abstracts, as well as submit abstracts to enrich the knowledge hub.

Research questions were investigated and answered by content-analyzing the interpretation and discussion of dominant CRM topics and then amalgamating the findings. This was supported by comparisons within and across individual, paired, and three-way occurrences of CRM keywords in the article abstracts.

The results show a lack of holistic thinking and discussion of CRM in both academia and industry, thinking that is required to understand how the people, processes, and technology in CRM affect each other and determine successful implementation. Only when industry grasps CRM holistically and understands how these important dimensions interact will organizational learning occur and, over time, result in superior processes leading to strong, profitable customer relationships and a hard-to-imitate competitive advantage.
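A toy sketch of the keyword co-occurrence comparison described above, counting individual, paired, and three-way keyword occurrences across abstracts. The abstracts and keyword list here are invented stand-ins for the knowledge database.

    # Sketch of counting single, paired, and three-way occurrences of CRM
    # keywords across abstracts (toy data shown).
    from itertools import combinations
    from collections import Counter

    abstracts = [
        "crm implementation requires managing people process and technology",
        "technology alone does not guarantee crm success",
        "aligning marketing sales and services around the customer",
    ]
    keywords = ["crm", "people", "process", "technology", "marketing"]

    singles, pairs, triples = Counter(), Counter(), Counter()
    for text in abstracts:
        present = [k for k in keywords if k in text]
        singles.update(present)
        pairs.update(combinations(present, 2))
        triples.update(combinations(present, 3))

    print(singles.most_common(), pairs.most_common(3), triples.most_common(3), sep="\n")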

Relevance: 30.00%

Abstract:

Software architecture is the abstract design of a software system. It plays a key role as a bridge between requirements and implementation and serves as a blueprint for development. The architecture represents a set of early design decisions that are crucial to a system, and mistakes in those decisions are very costly if they remain undetected until the system is implemented and deployed. This is where formal specification and analysis fit in. Formal specification ensures that an architecture design is represented in a rigorous and unambiguous way, and a formally specified model allows different analysis techniques to be used to verify the correctness of those crucial design decisions.

This dissertation presents a framework, called SAM, for the formal specification and analysis of software architectures. On the specification side, formalisms and mechanisms were identified and chosen to specify software architecture based on different analysis needs, and formalisms for specifying properties were also explored, especially non-functional properties. On the analysis side, the dissertation explores both the verification of functional properties and the evaluation of non-functional properties of software architecture. For the verification of functional properties, methodologies are presented for applying existing model checking techniques to a SAM model. For the evaluation of non-functional properties, the dissertation first shows how to incorporate stochastic information into a SAM model, and then explains how to translate the model into the input of existing tools and conduct the analysis with those tools.

To ease the analysis work, a tool is also provided to automatically translate a SAM model for model checking. All the techniques and methods described in the dissertation are illustrated by examples or case studies, which also serve the purpose of advocating the use of formal methods in practice.
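As a loose illustration of what formal analysis buys, the sketch below exhaustively checks a safety property on a toy transition system by breadth-first reachability. It is not the SAM formalism or its model checking toolchain; the states and the property are invented.

    # Sketch of exhaustive state-space exploration over a toy transition
    # system (a hypothetical two-component architecture).
    from collections import deque

    transitions = {
        ("idle", "idle"): [("busy", "idle"), ("idle", "busy")],
        ("busy", "idle"): [("idle", "idle"), ("busy", "busy")],
        ("idle", "busy"): [("idle", "idle")],
        ("busy", "busy"): [("idle", "busy")],
    }
    bad = ("busy", "busy")  # property to check: both components never busy at once

    seen, queue = {("idle", "idle")}, deque([("idle", "idle")])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)

    print("property violated" if bad in seen else "property holds")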

Relevance: 30.00%

Abstract:

This dissertation introduces an integrated algorithm for a new application dedicated to discriminating between electrodes that lead to a seizure onset and those that do not, using interictal subdural EEG data. The significance of this study lies in determining, among all of these channels, all containing interictal spikes, why some electrodes eventually lead to seizure while others do not. A first finding in the development of the algorithm is that these interictal spikes must be asynchronous and located in different regions of the brain before any consequential interpretations of EEG behavioral patterns are possible. A singular merit of the proposed approach is that even when the EEG data are randomly selected (independent of the onset of seizure), we are able to separate the channels that lead to seizure from those that do not. It is also revealed that the region of ictal activity does not necessarily evolve from the tissue located at the channels that present interictal activity, as commonly believed.

The study is also significant in correlating clinical features of the EEG with the patient's source of ictal activity, which comes from a specific subset of the channels that present interictal activity. The contributions of this dissertation emanate from (a) the choice of discriminating parameters used in the implementation, (b) the unique feature space used to optimize the delineation of these two types of electrodes, (c) the development of a back-propagation neural network that automates the decision-making process, and (d) the establishment of mathematical functions that elicit the reasons for this delineation.
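A minimal sketch of the automated decision step, assuming scikit-learn: a small back-propagation network (an MLP) classifying channels from a handful of synthetic interictal-spike features. The feature set and labels are invented for illustration.

    # Sketch of a back-propagation network separating seizure-onset channels
    # from others; the features here are synthetic.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    n = 300  # channels, each summarized by a few interictal-spike features
    X = rng.normal(0, 1, (n, 4))              # e.g., spike rate, amplitude, ...
    w = np.array([1.5, -0.8, 0.6, 0.0])
    y = (X @ w + rng.normal(0, 0.5, n)) > 0   # True = leads to seizure onset

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", round(clf.score(X_te, y_te), 3))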

Relevance: 30.00%

Abstract:

This study examined university students' writing skills as perceived by university students and their English instructors. The goal of the study was to provide English instructors with objective, quantified information about writing perceptions from both the students' and the instructors' viewpoints.

A survey instrument was developed based on one created by Newkirk, Cameron, and Selfe (1977) to identify instructors' perceived knowledge of student writing skills. The present study used a descriptive statistical design. It examined five writing skill areas (attitude, content, grammar and mechanics, literary considerations, and the writing process) through a questionnaire completed by a convenience sample of summer- and fall-admitted freshmen enrolled in Essay Writing and Freshman Composition courses, together with English Department instructors, at a large South Florida public university.

The study consisted of five phases. The first phase was modifying the Newkirk, Cameron, and Selfe (1977) questionnaire; two versions of the revised survey were developed, one for instructors and one for students. The second phase was pilot testing the questionnaire to evaluate administration and scoring. The third phase was administering the questionnaire to 1,280 students and 48 instructors. The fourth phase was analyzing the data. The study found a significant difference between the perceptions of students and instructors in all areas of writing skill examined by the survey; responses to 29 of 30 questions showed that students felt they had better attitudes toward writing and better writing skills than instructors thought.

The final phase was developing recommendations for practice. Based on the findings, and on theory and empirical evidence drawn from the fields of adult education and composition research, learner-centered, self-directed curriculum guidelines are offered.

By objectively quantifying student and instructor perceptions of students' writing skills, this study contributes to a growing body of literature that (a) encourages instructors to acknowledge the perception disparities between instructors and students, (b) gives instructors a better understanding of how to communicate with students, and (c) recommends the development of new curricula, placement tests, and courses that meet the needs of students and enable English instructors to provide meaningful instruction.
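One plausible way to quantify the per-question perception gap is sketched below with Mann-Whitney U tests over synthetic Likert responses; the test choice and the data are assumptions, as the study reports only that significant differences were found on 29 of 30 questions.

    # Sketch of a question-by-question comparison of student and instructor
    # Likert responses (synthetic data).
    import numpy as np
    from scipy.stats import mannwhitneyu

    rng = np.random.default_rng(6)
    n_students, n_instructors, n_questions = 1280, 48, 30
    students = rng.integers(3, 6, (n_students, n_questions))        # more positive
    instructors = rng.integers(1, 5, (n_instructors, n_questions))  # less positive

    significant = 0
    for q in range(n_questions):
        stat, p = mannwhitneyu(students[:, q], instructors[:, q])
        significant += p < 0.05
    print(f"{significant} of {n_questions} questions differ at p < .05")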

Relevance: 30.00%

Abstract:

Microarray technology provides a high-throughput technique for studying gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data, so effective data processing and analysis are critical for making reliable inferences from the data.

The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify sets of genes that best differentiate between samples. A comparative study of different classification tools was performed, and the best combinations of gene selection and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbors (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.

The same biological problem may be studied at different research labs and/or with different lab protocols or samples. In such situations, it is important to combine the results from these efforts. The second part of the dissertation addresses the problem of pooling the results of different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species, and improved sets of cell cycle-regulated genes were identified.

The last part of the dissertation explores the effectiveness of wavelet transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
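The pooling step maps directly onto standard combined-probability tests. The sketch below uses scipy's combine_pvalues for Fisher's and Stouffer's methods, with the weights argument standing in for the Liptak-Stouffer variant; the p-values themselves are invented.

    # Sketch of pooling per-experiment p-values for one gene.
    from scipy.stats import combine_pvalues

    p_values = [0.04, 0.12, 0.003]  # same gene, three independent experiments
    for method in ("fisher", "stouffer"):
        stat, p = combine_pvalues(p_values, method=method)
        print(f"{method}: combined p = {p:.4f}")

    # Liptak-Stouffer weighting (e.g., by sample size) uses the weights argument
    stat, p = combine_pvalues(p_values, method="stouffer", weights=[30, 50, 20])
    print(f"weighted stouffer: combined p = {p:.4f}")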

Relevance: 30.00%

Abstract:

The purpose of this study was to gain a greater understanding of the relationship between athletic success in football and men's basketball and the U.S. News and World Report (USNWR) college rankings. There has been consistent debate among researchers who study institutional quality about whether intercollegiate athletics enhances reputation. This study is similar to others that attempt to measure the relationship between athletic success and possible indirect benefits to the university, such as increased admissions applications and increased alumni donations and giving. It offers a more nuanced model for measuring athletic success, a concept that has been difficult to quantify, and the method used here also measures change over time (in this case, year-to-year over an eleven-year period).

The research questions were (a) is there a correlation between athletic success and the USNWR college ranking, and (b) is there a correlation between year-to-year changes in athletic success and year-to-year changes in the USNWR college rankings? Spearman rho correlation and ANOVA tests were used to answer these questions. The results demonstrated little correlation between athletic success, whether in football or men's basketball, and the USNWR college rankings. Although the relationships were weak, men's basketball success consistently demonstrated a stronger relationship than football success, a finding that differs from the literature, which most often favors football success. The ANOVA results did suggest that athletic participation is a factor in the USNWR college rankings.

As the debate continues about whether intercollegiate athletics enhances reputation, and as colleges and universities continue to spend enormously on athletics, a keener understanding of the possible indirect benefits of athletic programs is needed. The "advertising" provided by spectator sports such as football and men's basketball is often assumed by university leaders to yield substantial indirect benefits for the university; however, the existing research, along with this study, provides little evidence of such benefits.
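A minimal sketch of the correlation test, assuming scipy and invented data: Spearman's rho between an athletic success measure and USNWR rank.

    # Sketch of the Spearman rank correlation between athletic success and
    # USNWR rank; the values below are invented.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(7)
    n_schools = 60
    athletic_success = rng.normal(0, 1, n_schools)             # composite score
    usnwr_rank = rng.permutation(np.arange(1, n_schools + 1))  # 1 = best

    rho, p = spearmanr(athletic_success, usnwr_rank)
    print(f"rho = {rho:.3f}, p = {p:.3f}")  # near zero, echoing the weak result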

Relevance: 30.00%

Abstract:

During the remediation of burial grounds at the US Department of Energy's (DOE's) Hanford Site in Washington State, the dispersion of contaminated soil particles and dust is an issue faced by site workers on a daily basis. The problem is made worse by the semi-arid character of the region where the site is located. To mitigate it, workers at the site use a variety of engineered methods to minimize the dispersion of contaminated soil and dust (e.g., applying water and/or suppression agents that stabilize the soil prior to excavation, segregation, and removal activities).

A primary contributor to the dispersion of contaminated soil and dust is wind erosion. The erosion process occurs when the wind speed exceeds a threshold value that depends on a number of factors, including wind force loading, particle size, surface soil moisture, and the geometry of the soil. Under these circumstances, the mobility of contaminated soil and the generation and dispersion of particulate matter are significantly influenced by these parameters. The dependence of soil and dust movement on threshold shear velocity, fixative dilution and/or application rates, soil moisture content, and soil geometry was studied for Hanford's sandy soil through a series of wind tunnel experiments, laboratory experiments, and theoretical analysis. In addition, the behavior of plutonium (Pu) powder contamination in the soil was studied by introducing a Pu simulant (cerium oxide).

The results showed that soil dispersion and PM10 concentrations decreased with increasing soil moisture, and that soil mobility increased with increasing wind velocity. The use of fixative products greatly decreased soil and PM10 concentrations under varying wind conditions. The geometry of the soil sample affected the velocity profile and the calculation of the surface roughness coefficient when round and flat samples were compared. Finally, threshold shear velocities were calculated for soil with a flat surface, and their dependence on surface soil moisture was demonstrated. A theoretical framework was developed to explain these dependencies.
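The dependence of the erosion threshold on particle size can be sketched with Bagnold's classical relation for dry sand, u*t = A * sqrt(((rho_p - rho_a) / rho_a) * g * d). The coefficient and densities below are textbook values; the moisture and fixative corrections studied above are not modeled here.

    # Sketch of a dry-soil threshold friction velocity estimate (Bagnold).
    import math

    A = 0.1          # empirical coefficient for particles > ~100 microns
    rho_p = 2650.0   # particle density, kg/m^3 (quartz sand)
    rho_a = 1.2      # air density, kg/m^3
    g = 9.81         # gravitational acceleration, m/s^2

    for d_um in (100, 200, 500):  # particle diameter in microns
        d = d_um * 1e-6
        u_star_t = A * math.sqrt(((rho_p - rho_a) / rho_a) * g * d)
        print(f"d = {d_um:4d} um: threshold friction velocity ~ {u_star_t:.2f} m/s")

Surface moisture raises this threshold by adding cohesive forces between grains, which is consistent with the reduced dispersion at higher moisture reported above.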