15 results for Weighted average power tests
in Digital Commons at Florida International University
Abstract:
Orthogonal Frequency-Division Multiplexing (OFDM) has proven to be a promising technology for transmission at higher data rates. Multicarrier Code-Division Multiple Access (MC-CDMA) is a transmission technique that combines the advantages of both OFDM and Code-Division Multiple Access (CDMA), allowing high transmission rates over severely time-dispersive multipath channels without the need for a complex receiver implementation. MC-CDMA also exploits frequency diversity via the different subcarriers, and therefore allows high-code-rate systems to achieve good Bit Error Rate (BER) performance. Furthermore, spreading in the frequency domain makes the time-synchronization requirement much less stringent than in traditional direct-sequence CDMA schemes. MC-CDMA nonetheless presents some problems. One is the high Peak-to-Average Power Ratio (PAPR) of the transmitted signal: high PAPR drives the amplifier into nonlinear distortion, producing inter-carrier self-interference and out-of-band radiation. Suppressing Multiple Access Interference (MAI) is another crucial problem in MC-CDMA systems. Imperfect cross-correlation characteristics of the spreading codes and multipath fading destroy the orthogonality among users and cause MAI, which produces serious BER degradation. Moreover, in the uplink the signals received at a base station are always asynchronous; this also destroys the orthogonality among users and hence generates MAI, degrading system performance. Beyond these two problems, external interference must always be taken seriously in any communication system. In this dissertation, we design a novel MC-CDMA system with low PAPR and mitigated MAI. New semi-blind channel estimation and multi-user data detection based on Parallel Interference Cancellation (PIC) are applied in the system.
Low-Density Parity-Check (LDPC) codes are also introduced into the system to improve performance. Different interference models are analyzed in multicarrier communication systems, and effective interference suppression for MC-CDMA systems is then employed. The experimental results indicate that our system not only significantly reduces the PAPR and MAI but also effectively suppresses outside interference with low complexity. Finally, we present a practical cognitive application of the proposed system on a software-defined radio platform.
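The PAPR metric at the heart of the first problem has a standard definition: the ratio of the signal's peak instantaneous power to its average power. The sketch below computes it for a toy OFDM-style symbol; the 64-subcarrier QPSK setup is illustrative only and is not the dissertation's system.

```python
import numpy as np

def papr_db(x):
    """Peak-to-Average Power Ratio of a (complex) baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Toy multicarrier symbol: IFFT of random QPSK data on 64 subcarriers.
rng = np.random.default_rng(0)
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=64)
x = np.fft.ifft(symbols)

print(round(papr_db(x), 2))
```

A constant-envelope signal has 0 dB PAPR by this definition; summing many independently modulated subcarriers is what pushes the peak well above the average.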
Abstract:
In his study, “Evaluating and Selecting a Property Management System,” Galen Collins, Assistant Professor, School of Hotel and Restaurant Management, Northern Arizona University, states briefly at the outset: “Computerizing a property requires a game plan. Many have selected a Property Management System without much forethought and have been unhappy with the final results. The author discusses the major factors that must be taken into consideration in the selection of a PMS, based on his personal experience.” Although this article was written in 1988 and some of its information may be dated, there are many salient points to consider. “Technological advances have encouraged many hospitality operators to rethink how information should be processed, stored, retrieved, and analyzed,” offers Collins. “Research has led to the implementation of various cost-effective applications addressing almost every phase of operations,” he says in introducing the computer technology germane to many PMS functions. Professor Collins discusses the Request for Proposal, its conditions, and its relevance in negotiating a PMS purchase. The author also wants the system buyer to be aware [not necessarily beware] of vendor recommendations, and not to rely solely on them. Exercising forethought helps avoid purchasing an inadequate PMS. Remember, the vendor is there first and foremost to sell you a system. This does not necessarily put the adjectives unreliable and unethical on the table, but do be advised. Professor Collins presents a graphic outline for the Weighted Average Approach to Scoring Vendor Evaluations. Among the several elements analyzed in this essay, Professor Collins advises that a prospective buyer not overlook the service factor when choosing a PMS. Service is an important element to contemplate.
“In a hotel environment, the special emphasis should be on service. System downtime can be costly and aggravating and will happen periodically,” Collins warns. Professor Collins also examines the PMS system environment, the importance of which should not be underestimated. “The design of the computer system should be based on the physical layout of the property and the projected workloads. The heart of the system, housed in a protected, isolated area, can support work stations strategically located throughout the property,” Professor Collins provides. A Property Profile Description is outlined in Table 1. The author also points out that ease of operation is another significant factor to consider. “A user-friendly software package allows the user to easily move through the program without encountering frustrating obstacles,” says Collins. “Programs that require users to memorize abstract abbreviations, codes, and information to carry out standard routines should be avoided,” he counsels.
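Collins' Weighted Average Approach to Scoring Vendor Evaluations is named here but not reproduced; the sketch below shows how such a scheme typically works. The criteria, weights, and ratings are hypothetical, chosen only to mirror the factors (service, ease of use) the article emphasizes.

```python
def weighted_score(ratings, weights):
    """Weighted average of per-criterion ratings; weights need not sum to 1."""
    total_weight = sum(weights.values())
    return sum(ratings[c] * weights[c] for c in weights) / total_weight

# Hypothetical criteria, weights, and 1-10 ratings -- illustrative only.
weights = {"service": 0.35, "ease_of_use": 0.25, "cost": 0.20, "features": 0.20}
vendor_a = {"service": 8, "ease_of_use": 7, "cost": 6, "features": 9}
vendor_b = {"service": 6, "ease_of_use": 9, "cost": 8, "features": 7}

print(weighted_score(vendor_a, weights))
print(weighted_score(vendor_b, weights))
```

Weighting service most heavily reflects Collins' advice: a vendor that rates slightly lower overall can still win or lose on the criteria the buyer deems critical.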
Abstract:
This dissertation examines the effect of regulations, resource and referral agencies, and subsidies on the price and quality of care in child care centers. The research is based on a carefully developed conceptual framework that incorporates the factors affecting the demand and supply of child care. The first step in developing this framework is sketching out the structural equations, which help us understand the underlying behavior of the individuals and firms making decisions. The exogenous variables are vectors of attributes relating to family characteristics, child characteristics, regulations, subsidies, community characteristics, and prices of inputs. Based on the structural equations, reduced-form equations are estimated to find the effect of each exogenous variable on each endogenous variable; the reduced-form equations help answer public policy questions. The sample for this study is from the 1990 Profile of Child Care Settings (PCCS) data, in which 2,089 center-based programs were interviewed. Child/Staff Ratio (Group Level): Results indicate that among subsidies, only the state subsidy per child in poverty has a significant effect on the child/staff ratio at the group level. The presence of resource and referral agencies also increases the child/staff ratio at the group level. Also, when the maximum center group size regulation for 25-36 months becomes more stringent, the child/staff ratio at the group level decreases. Child/Staff Ratio (Center Level): When the regulations for the maximum child/staff ratio for age groups 13-24 months and 37-60 months become lax, the child/staff ratio for the center increases. As the regulation for maximum group size for infants becomes stringent, the child/staff ratio decreases. An interesting finding is that as the regulations for maximum group size for age groups 13-24 months and 25-36 months become stringent, the child/staff ratio for the center increases.
Another significant finding is that when a center is located in a rural area, the child/staff ratio is significantly lower. Center Weighted Average Hourly Fees: Maximum group size regulations for age groups 25-36 months and 37-60 months have a negative effect on the center hourly fee. Maximum child/staff ratio regulations for age groups 13-24 months and 37-60 months have a negative effect on the center hourly fee, while those for age groups 0-12 months and 25-36 months have a positive effect. Findings also indicate that the center average hourly price is lower when a resource and referral agency is present. Cost-adjusted prekindergarten funds and JOBS child care subsidies have a negative effect on the average hourly fee; the cost-adjusted social services block grant and the state subsidy per child in poverty have a positive effect on the average hourly price. A major finding of this dissertation is the interaction of subsidy and regulatory variables. Another major finding is that the child/staff ratio at the group level is lower when there is an interaction between geographic location and the nature of center sponsorship.
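The abstract does not spell out how the center weighted average hourly fee is constructed; a plausible reading is an enrollment-weighted average of per-age-group fees, sketched below. The age groups match those in the abstract, but the fees and enrollments are hypothetical.

```python
def center_weighted_fee(fees, enrollments):
    """Enrollment-weighted average hourly fee across a center's age groups."""
    total = sum(enrollments.values())
    return sum(fees[g] * enrollments[g] for g in fees) / total

# Hypothetical hourly fees ($) and enrollments per age group -- illustrative.
fees = {"0-12 mo": 5.00, "13-24 mo": 4.50, "25-36 mo": 4.00, "37-60 mo": 3.50}
enrollments = {"0-12 mo": 8, "13-24 mo": 10, "25-36 mo": 14, "37-60 mo": 18}

print(round(center_weighted_fee(fees, enrollments), 2))
```

Weighting by enrollment means the fee for the largest age groups dominates the center-level figure, which is why age-group-specific regulations can move the center average in different directions.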
Abstract:
Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that for many transportation problems, including AADT estimation, spatial context is important. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR-based methods considered the influence of correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations farther away. The study area was Broward County, Florida, which lies on the Atlantic coast between Palm Beach and Miami-Dade counties. In this study, a total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal households) were selected to develop the models. To investigate the predictive power of the various AADT predictors over space, statistics including local r-squared values, local parameter estimates, and local errors were examined and mapped. The local variations in relationships among parameters were investigated, measured, and mapped to assess the usefulness of the GWR methods.
The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and its predictors, which cannot be modeled in ordinary linear regression.
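As a concrete illustration of the local fitting the abstract describes, the sketch below estimates location-specific regression coefficients with Gaussian kernel weights, so observations near the regression point dominate the fit. It is a minimal GWR-style calculation on synthetic data, not the study's actual model, variables, or bandwidth-selection procedure.

```python
import numpy as np

def local_coefficients(coords, X, y, point, bandwidth):
    """GWR-style local regression coefficients at one location.

    Gaussian kernel weights make nearby observations dominate the fit.
    coords: (n, 2) locations; X: (n, p) design (intercept included); y: (n,).
    """
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian kernel weights
    Xw = X * w[:, None]                       # row-weighted design matrix
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)  # weighted normal equations

# Toy data: the true slope drifts from west to east, so local fits differ.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(200, 2))
slope = 1.0 + 0.3 * coords[:, 0]              # spatially varying "true" slope
x = rng.normal(size=200)
y = 2.0 + slope * x + 0.1 * rng.normal(size=200)
X = np.column_stack([np.ones(200), x])

west = local_coefficients(coords, X, y, np.array([1.0, 5.0]), bandwidth=2.0)
east = local_coefficients(coords, X, y, np.array([9.0, 5.0]), bandwidth=2.0)
print(west[1], east[1])   # local slopes differ across space
```

An ordinary least-squares fit to the same data would return a single compromise slope; the two local fits recover the west-to-east drift, which is exactly the spatial non-stationarity the study exploits.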
Abstract:
Homework has been a controversial issue in education for the past century. Research has been scarce and has yielded results at both ends of the spectrum. This study examined the relationship between homework performance (percent of homework completed and percent of homework correct), student characteristics (SAT-9 score, gender, ethnicity, and socio-economic status), perceptions, and challenges, and academic achievement as determined by the students' average score on weekly tests and their score on the FCAT NRT mathematics assessment. The subjects for this study were 143 students enrolled in Grade 3 at a suburban elementary school in Miami, Florida. Pearson correlations were used to examine the associations of the predictor variables with average test scores and FCAT NRT scores. Additionally, simultaneous regression analyses were carried out to examine the influence of the predictor variables on each of the criterion variables, and hierarchical regression analyses were performed on the criterion variables from the predictor variables. Homework performance was significantly correlated with average test score; controlling for the other variables, homework performance was highly related to both average test score and FCAT NRT score. This study lends support to the view that homework completion is highly related to student academic achievement at the lower elementary level. It is suggested that at the elementary level more consideration be given to the amount of homework completed by students, and that this information be used in formulating intervention strategies for students who may not be achieving at the appropriate levels.
Abstract:
This study examined the relationship between homework performance (percent of homework completed and percent of homework correct), student characteristics (Stanford Achievement Test score, gender, ethnicity, and socio-economic status), perceptions, and challenges and academic achievement determined by the students’ average score on weekly tests and their score on the Florida Comprehensive Assessment Test (FCAT) Norm Reference Test (NRT) mathematics assessment.
Abstract:
Gene-based tests of association are frequently applied to common SNPs (MAF>5%) as an alternative to single-marker tests. In this analysis we conduct a variety of simulation studies applied to five popular gene-based tests investigating general trends related to their performance in realistic situations. In particular, we focus on the impact of non-causal SNPs and a variety of LD structures on the behavior of these tests. Ultimately, we find that non-causal SNPs can significantly impact the power of all gene-based tests. On average, we find that the “noise” from 6–12 non-causal SNPs will cancel out the “signal” of one causal SNP across five popular gene-based tests. Furthermore, we find complex and differing behavior of the methods in the presence of LD within and between non-causal and causal SNPs. Ultimately, better approaches for a priori prioritization of potentially causal SNPs (e.g., predicting functionality of non-synonymous SNPs), application of these methods to sequenced or fully imputed datasets, and limited use of window-based methods for assigning inter-genic SNPs to genes will improve power. However, significant power loss from non-causal SNPs may remain unless alternative statistical approaches robust to the inclusion of non-causal SNPs are developed.
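The headline finding, that a handful of non-causal SNPs can cancel the signal of one causal SNP, can be reproduced in miniature with a generic sum-of-chi-squares gene test (not one of the five tests studied), assuming independent per-SNP z-scores and no LD:

```python
import numpy as np
from scipy.stats import chi2

def sum_test_power(n_noncausal, effect=3.0, alpha=0.05, reps=5000, seed=0):
    """Monte Carlo power of a sum-of-chi-squares gene test when one causal
    SNP (mean-shifted z-score) is grouped with n_noncausal null SNPs."""
    rng = np.random.default_rng(seed)
    k = 1 + n_noncausal
    z = rng.normal(size=(reps, k))
    z[:, 0] += effect                    # the single causal SNP
    stat = (z ** 2).sum(axis=1)          # ~ chi2(k) under the null
    crit = chi2.ppf(1 - alpha, df=k)
    return (stat > crit).mean()

print(sum_test_power(0), sum_test_power(10))  # power erodes with noise SNPs
```

Each added non-causal SNP inflates both the statistic's null distribution and its critical value while contributing no signal, which is the dilution mechanism behind the "6-12 non-causal SNPs per causal SNP" figure; the real analysis additionally models LD, which this sketch omits.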
Abstract:
High efficiency of the power converters placed between renewable energy sources and the utility grid is required to maximize the utilization of these sources. Power quality is another aspect that requires large passive elements (inductors, capacitors) to be placed between these sources and the grid. The main objective is to develop a higher-level, high-frequency-based power converter system (HFPCS) that optimizes the use of hybrid renewable power injected into the power grid. The HFPCS provides high efficiency, reduced size of passive components, higher power density, lower harmonic distortion, higher reliability, and lower cost. The dynamic model for each part of this system is developed, simulated, and tested. The steady-state performance of the grid-connected hybrid power system with battery storage is analyzed. Various types of simulations were performed, and a number of algorithms were developed and tested to verify the effectiveness of the power conversion topologies. A modified hysteresis-control strategy for the rectifier and the battery charging/discharging system was developed and implemented. A voltage oriented control (VOC) scheme was developed to control the energy injected into the grid. The developed HFPCS was compared experimentally with other currently available power converters. The developed HFPCS was employed inside a microgrid system infrastructure, connecting it to the power grid to verify its power transfer capabilities and grid connectivity. Grid connectivity tests verified the power transfer capabilities of the developed converter, in addition to its ability to serve the load in a shared manner. In order to investigate the performance of the developed system, an experimental setup for the HF-based hybrid generation system was constructed. We designed a board containing a digital signal processor chip on which the developed control system was embedded. The board was fabricated and experimentally tested.
The system's high precision requirements were verified. Each component of the system was built and tested separately, and then the whole system was connected and tested. The simulation and experimental results confirm the effectiveness of the developed converter system for grid-connected hybrid renewable energy systems as well as for hybrid electric vehicles and other industrial applications.
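The abstract names a modified hysteresis-control strategy without detailing it. Conventional hysteresis-band control, on which such strategies build, reduces to a bang-bang comparator with a dead band: the switch flips only when the tracking error leaves a tolerance band, which bounds the ripple while limiting switching losses. A minimal sketch (all values illustrative, not the dissertation's modified scheme):

```python
def hysteresis_switch(reference, measured, band, state):
    """Bang-bang hysteresis comparator for current control: flip the switch
    only when the tracking error leaves the tolerance band, else hold."""
    error = reference - measured
    if error > band / 2:
        return 1        # current too low -> switch on to raise it
    if error < -band / 2:
        return 0        # current too high -> switch off to let it fall
    return state        # inside the band -> no switching event

# One control step: 5 A reference, 4.6 A measured, 0.5 A band -> error 0.4 A
print(hysteresis_switch(5.0, 4.6, 0.5, state=0))  # -> 1
```

Narrowing the band tightens current tracking at the cost of a higher (and load-dependent) switching frequency; "modified" hysteresis schemes typically adapt the band to keep that frequency bounded.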
Abstract:
The purpose of this study was to explore the impact of the Florida state-mandated Basic Skills Exit Tests (BSET) on the effectiveness of remedial instruction programs in adequately serving the academically underprepared student population. The primary research question was whether the introduction of the BSET has resulted in remedial completers who are better prepared for college-level coursework. The study used an ex post facto research design to examine the impact of the BSET on student readiness for subsequent college-level coursework at Miami-Dade Community College. Two-way analysis of variance was used to compare the performance of remedial and college-ready students before and after the introduction of the BSET requirement. Chi-square analysis was used to explore changes in the proportion of students completing and passing remedial courses. Finally, correlation analysis was used to explore the utility of the BSET in predicting subsequent college-level course performance. Differences based on subject area and race/ethnicity were explored. The introduction of the BSET did not improve the performance of remedial completers in subsequent college-level courses in any of the subject areas. The BSET did have a negative impact on the success rate of students in remedial reading and mathematics courses. There was a significant decrease in minority students' likelihood of passing remedial reading and mathematics courses after the BSET was introduced. The reliability of the BSET is unacceptably low for all subject areas, based on estimates derived from administrations at M-DCC. Nevertheless, there was a significant positive relationship between BSET score and grade point average in subsequent college-level courses. This relationship varied by subject area and ethnicity, with the BSET reading score having no relationship with subsequent course performance for Black non-Hispanic students.
The BSET had no discernible positive effect on remedial student performance in subsequent college-level courses. In other words, the BSET has not enhanced the effectiveness of the remedial programs in preparing students for later coursework at M-DCC. The BSET had a negative impact on the progress and success of students in remedial reading and mathematics.
Abstract:
Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect of the design of practical systems, needed to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of the study, a proper model, including all the details of the components, was required. Therefore, advances in EMC modeling were studied, classifying analytical and numerical models. The selected approach was finite element (FE) modeling, coupled with the distributed network method, to generate the model of the converter's components and obtain the frequency behavioral model of the converter. The method has the ability to reveal the behavior of parasitic elements and higher resonances, which have critical impacts in studying EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovations in equivalent source modeling were made to decrease the simulation time dramatically. Several models were designed in this study, and the voltage-current cube model and the wire model gave the best results. A GA-based PSO method was used for the optimization process. Superposition and suppression of the fields in coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. Identification was implemented using an ANN for seventy different fault cases.
The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of component, as well as the faulty component, by comparing the amplitudes of their stray field harmonics. Identification using the stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
Abstract:
The importance of checking the normality assumption in most statistical procedures, especially parametric tests, cannot be overemphasized, as the validity of the inferences drawn from such procedures usually depends on the validity of this assumption. Numerous methods have been proposed by different authors over the years, some popular and frequently used, others not so much. This study addresses the performance of eighteen of the available tests for different sample sizes and significance levels, and for a number of symmetric and asymmetric distributions, by conducting a Monte Carlo simulation. The results showed that considerable power is not achieved for symmetric distributions when the sample size is less than one hundred; for such distributions, the kurtosis test is most powerful, provided the distribution is leptokurtic or platykurtic. The Shapiro-Wilk test remains the most powerful test for asymmetric distributions. We conclude that different tests are suitable under different characteristics of alternative distributions.
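The Monte Carlo power estimation described here follows a standard recipe: draw many samples from a chosen alternative distribution, run the test on each, and report the fraction of rejections. A minimal sketch for one of the eighteen tests (Shapiro-Wilk, via SciPy) at n = 50; the exponential alternative and replication count are illustrative, not the study's design:

```python
import numpy as np
from scipy.stats import shapiro

def shapiro_power(sampler, n, alpha=0.05, reps=1000, seed=0):
    """Monte Carlo power of the Shapiro-Wilk test: fraction of simulated
    samples drawn by `sampler` that reject normality at level `alpha`."""
    rng = np.random.default_rng(seed)
    rejections = sum(shapiro(sampler(rng, n))[1] < alpha for _ in range(reps))
    return rejections / reps

# Skewed alternative (exponential) vs. the null itself (normal).
power_exp = shapiro_power(lambda rng, n: rng.exponential(size=n), n=50)
size_norm = shapiro_power(lambda rng, n: rng.normal(size=n), n=50)
print(power_exp, size_norm)   # high power vs. roughly alpha under the null
```

Running the same loop under the null (the second call) checks that the test's empirical size stays near the nominal alpha, which is the companion calculation to any power study.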
Abstract:
Detecting change points in epidemic models has been studied by many scholars. Yao (1993) summarized five existing test statistics in the literature; among these, the likelihood ratio statistic showed standout power. However, all of the existing test statistics rest on the assumption that the population variance is known, which is unrealistic in practice. To avoid assuming a known population variance, a new test statistic for detecting epidemic change points is studied in this thesis. The new test statistic is parameter-free and more powerful than the existing test statistics. Different sample sizes and lengths of epidemic duration are used for the power comparison. Monte Carlo simulation is used to find the critical values of the new test statistic and to perform the power comparison. Based on the Monte Carlo simulation results, it can be concluded that the sample size and the length of the epidemic duration have some effect on the power of the tests. It can also be observed that the new test statistic studied in this thesis has higher power than the existing test statistics in all cases.
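The thesis's statistic is not given in the abstract, but the Monte Carlo calibration it describes can be sketched with a generic epidemic-type statistic, the maximum standardized segment sum, assuming standardized data (mean 0, variance 1 under the null). This illustrates the recipe of simulating critical values and then estimating power; it is not the thesis's parameter-free statistic.

```python
import numpy as np

def epidemic_stat(x):
    """Max standardized segment sum over all (start, end) windows: large
    when a contiguous 'epidemic' stretch of the series has an elevated mean."""
    n = len(x)
    s = np.concatenate([[0.0], np.cumsum(x)])
    best = 0.0
    for i in range(n):
        for j in range(i + 1, n + 1):
            best = max(best, (s[j] - s[i]) / np.sqrt(j - i))
    return best

rng = np.random.default_rng(0)
n, reps = 50, 500

# Step 1: critical value = 95th percentile of the statistic under the null.
null = [epidemic_stat(rng.normal(size=n)) for _ in range(reps)]
crit = np.quantile(null, 0.95)

# Step 2: power = rejection rate under an epidemic mean shift (t = 20..34).
x = rng.normal(size=(reps, n))
x[:, 20:35] += 1.0
power = np.mean([epidemic_stat(row) > crit for row in x])
print(round(float(crit), 2), power)
```

The same two-step simulation, repeated over different sample sizes and epidemic durations, produces exactly the kind of power comparison the thesis reports.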