907 results for Link variables method


Relevance:

30.00%

Publisher:

Abstract:

Communication between power converters is vital for high-performance DC micro-grid control. However, for residential DC micro-grid applications, using an external communication link would increase the system cost and reduce the system flexibility and reliability. This paper presents a novel method that enables conventional DC/DC converters to transmit data via the common DC bus. With this technology, cost-effective low-bandwidth communication links between power converters can be established within a DC micro-grid, and advanced distributed control algorithms can be developed. A reliable communication link with a 2 kbps transmission rate has been implemented between Boost converters through the common input DC bus.
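
As a rough illustration of bus-based signaling, the sketch below simulates a small FSK-style ripple superimposed on a shared DC bus and decoded by correlation. The sampling rate, ripple frequencies, and amplitudes are assumptions made for the sketch; the abstract specifies only the 2 kbps rate, not how the converters actually modulate the bus.

```python
import numpy as np

# Hypothetical sketch of bus-based signaling: data is sent as a small FSK-style
# ripple superimposed on the shared DC bus voltage and decoded by correlation.
# All parameters except the 2 kbps rate are illustrative assumptions.
FS = 200_000               # receiver sampling rate, Hz (assumed)
BIT_RATE = 2_000           # 2 kbps, as reported in the abstract
F0, F1 = 20_000, 30_000    # assumed ripple frequencies for bits 0 and 1, Hz
V_BUS, RIPPLE = 48.0, 0.2  # nominal bus voltage and injected ripple amplitude, V

def transmit(bits):
    """Superimpose an FSK ripple for each bit on the nominal bus voltage."""
    n = FS // BIT_RATE                      # samples per bit
    t = np.arange(n) / FS
    return np.concatenate(
        [V_BUS + RIPPLE * np.sin(2 * np.pi * (F1 if b else F0) * t) for b in bits])

def receive(v_bus):
    """Decode bits by correlating each bit interval against the two tones."""
    n = FS // BIT_RATE
    t = np.arange(n) / FS
    ref0, ref1 = np.sin(2 * np.pi * F0 * t), np.sin(2 * np.pi * F1 * t)
    bits = []
    for k in range(len(v_bus) // n):
        seg = v_bus[k * n:(k + 1) * n] - V_BUS     # remove the DC component
        bits.append(int(abs(seg @ ref1) > abs(seg @ ref0)))
    return bits

rng = np.random.default_rng(0)
tx_bits = [1, 0, 1, 1, 0, 0, 1, 0]
noisy_bus = transmit(tx_bits) + rng.normal(0, 0.05, (FS // BIT_RATE) * len(tx_bits))
assert receive(noisy_bus) == tx_bits
```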

Relevance:

30.00%

Publisher:

Abstract:

Many organic compounds that cause irreversible damage to human health and the ecosystem are present in water resources. Among these hazardous substances, phenolic compounds play an important role in current contamination. The use of membrane technology is increasing exponentially in drinking water production and wastewater treatment. The removal of organic compounds by nanofiltration membranes is characterized not only by molecular sieving effects but also by membrane-solute interactions. The influence of the sieving parameters (molecular weight and molecular diameter) and of the physicochemical interactions (dissociation constant and molecular hydrophobicity) on the membrane rejection of the organic solutes was studied. Molecular hydrophobicity is expressed as the logarithm of the octanol-water partition coefficient. This paper proposes a method that can be used for symbolic knowledge extraction from a trained neural network, once it has been trained to the desired performance; the method is based on detecting the most important input variables in problems where multicollinearity exists among them.
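
The abstract does not describe the extraction procedure itself, so the sketch below uses generic permutation importance on a trained network, with synthetic, deliberately collinear inputs, purely as a stand-in for detecting the most important variables.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative stand-in (not the paper's method): rank the inputs of a trained
# network by permutation importance, on synthetic, deliberately collinear data.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
X[:, 3] = X[:, 0] + 0.05 * rng.normal(size=500)       # collinear with x0
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=500)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                     random_state=0).fit(X, y)
baseline = model.score(X, y)                          # R^2 of the trained network

importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])              # break input j's link to y
    importance.append(baseline - model.score(Xp, y))  # drop in R^2 = importance

for j, imp in sorted(enumerate(importance), key=lambda p: -p[1]):
    print(f"x{j}: importance {imp:.3f}")
```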

Relevance:

30.00%

Publisher:

Abstract:

Consider an indirectly measurable variable that is a function of directly measurable variables. In this survey we present the method we introduced for the analytical representation of its maximum absolute and relative inaccuracy as functions, respectively, of the maximum absolute and relative inaccuracies of the directly measurable variables. Our new approach consists of assuming, for fixed variables, the statistical mean values of the absolute values of the coefficients of influence of, respectively, the absolute and relative inaccuracies of the directly measurable variables, in order to determine the analytical form of the maximum absolute and relative inaccuracies of the indirectly measurable variable. Moreover, we give a method for determining the numerical values of the maximum absolute and relative inaccuracies. We define a sample plane of the ideal, perfectly accurate experiment and, using it, give a universal numerical characteristic: a dimensionless scale for determining the quality (accuracy) of the experiment.
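
For context, the sketch below shows the standard worst-case error-propagation formulas on which such analyses are built, using an assumed function and assumed measurement inaccuracies; the paper's own analytical forms, based on mean coefficients of influence, are not reproduced here.

```python
import sympy as sp

# Minimal sketch of worst-case error propagation; f and all numbers are assumed.
x1, x2 = sp.symbols("x1 x2", positive=True)
f = x1 * x2**2                       # indirectly measurable variable y = f(x1, x2)

# Coefficients of influence: partial derivatives of f.
grads = [sp.diff(f, v) for v in (x1, x2)]

values = {x1: 3.0, x2: 2.0}          # measured values (assumed)
deltas = {x1: 0.05, x2: 0.02}        # maximum absolute inaccuracies (assumed)

# Maximum absolute inaccuracy: sum of |df/dxi| * delta_xi.
abs_err = sum(abs(g.subs(values)) * deltas[v] for g, v in zip(grads, (x1, x2)))
# Maximum relative inaccuracy: absolute inaccuracy over |f|.
rel_err = abs_err / abs(f.subs(values))

print(f"y = {float(f.subs(values)):.3f}")
print(f"maximum absolute inaccuracy = {float(abs_err):.4f}")
print(f"maximum relative inaccuracy = {float(rel_err):.4%}")
```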

Relevance:

30.00%

Publisher:

Abstract:

2002 Mathematics Subject Classification: 62J05, 62G35.

Relevance:

30.00%

Publisher:

Abstract:

In non-linear random-effects models, some attention has recently been devoted to the analysis of suitable transformations of the response variables, either separately from (Taylor 1996) or together with (Oberg and Davidian 2000) transformations of the covariates, and, as far as we know, no investigation has been carried out on the choice of link function in such models. In our study we consider the use of a random-effects model when a parameterized family of links (Aranda-Ordaz 1981, Prentice 1996, Pregibon 1980, Stukel 1988 and Czado 1997) is introduced. We point out the advantages and drawbacks associated with this data-driven kind of modeling. Difficulties in the interpretation of regression parameters, and therefore in understanding the influence of covariates, as well as problems related to loss of efficiency of estimates and overfitting, are discussed. A case study on radiotherapy usage in breast cancer treatment is presented.
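
One of the cited parameterized families, the Aranda-Ordaz (1981) asymmetric link, can be sketched as below; in the models discussed, the shape parameter would be estimated jointly with the fixed and random effects. The example values of the linear predictor are arbitrary.

```python
import numpy as np

def aranda_ordaz(eta, lam):
    """Aranda-Ordaz (1981) asymmetric link family: maps the linear predictor
    eta to a probability. lam = 1 recovers the logistic link; lam -> 0 tends
    to the complementary log-log link."""
    if lam <= 0:
        return 1.0 - np.exp(-np.exp(eta))              # cloglog limit
    return 1.0 - (1.0 + lam * np.exp(eta)) ** (-1.0 / lam)

eta = np.linspace(-3, 3, 7)                            # arbitrary linear predictors
print("logistic (lam=1):", np.round(aranda_ordaz(eta, 1.0), 3))
print("near cloglog    :", np.round(aranda_ordaz(eta, 1e-8), 3))
print("cloglog limit   :", np.round(aranda_ordaz(eta, 0.0), 3))
```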

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We then assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3 to link the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings from the multiple methods used in ENIGMA. We subsequently tested it successfully by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
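
A minimal data-structure sketch of the alphanumeric coding idea follows, with hypothetical placeholder effects, methods, and findings rather than ENIGMA content.

```python
# MATRICS-style coding: numbered effects (layer 1), lettered methods (layer 2),
# and findings (layer 3) tagged with one or more effect-method codes.
# The example entries are hypothetical placeholders, not ENIGMA results.
effects = {1: "Change in waiting times", 2: "Patient experience"}          # layer 1
methods = {"A": "Routine data analysis", "B": "Patient questionnaire"}     # layer 2
findings = [                                                               # layer 3
    {"codes": ["1A"], "text": "Hypothetical finding reported by one method"},
    {"codes": ["2A", "2B"], "text": "Analogous finding reported by two methods"},
]

for f in findings:
    for code in f["codes"]:
        effect, method = int(code[:-1]), code[-1]
        print(f"{code}: effect='{effects[effect]}', method='{methods[method]}'"
              f" -> {f['text']}")
```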

Relevance:

30.00%

Publisher:

Abstract:

The use of the multiple indicators, multiple causes model to operationalize formative variables (the formative MIMIC model) is advocated in the methodological literature. Yet, contrary to popular belief, the formative MIMIC model does not provide a valid method of integrating formative variables into empirical studies, and we recommend that it be discarded from formative modeling. Our arguments rest on the following observations. First, much of the formative variable literature appears to conceptualize a causal structure between the formative variable and its indicators which can be tested or estimated. We demonstrate that this assumption is illogical, that a formative variable is simply a researcher-defined composite of sub-dimensions, and that such tests and estimates are unnecessary. Second, despite this, researchers often use the formative MIMIC model as a means to include formative variables in their models and to estimate the magnitude of linkages between formative variables and their indicators. However, the formative MIMIC model cannot provide this information, since it is simply a model in which a common factor is predicted by some exogenous variables; the model does not integrate a formative variable within it. Empirical results from such studies need reassessing, since their interpretation may lead to inaccurate theoretical insights and the development of untested recommendations to managers. Finally, the use of the formative MIMIC model can foster fuzzy conceptualizations of variables, particularly since it can erroneously encourage the view that a single focal variable is measured with both formative and reflective indicators. We explain these interlinked arguments in more detail and provide a set of recommendations for researchers to consider when dealing with formative variables.
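
A small numeric sketch of the paper's central distinction, using synthetic data and assumed weights: a formative variable is a researcher-defined composite of its sub-dimensions, whereas a formative MIMIC specification instead estimates a common factor predicted by those same indicators.

```python
import numpy as np

# Sketch only: synthetic sub-dimension scores and assumed composite weights.
rng = np.random.default_rng(1)
indicators = rng.normal(size=(200, 3))      # three sub-dimension scores (synthetic)
weights = np.array([0.5, 0.3, 0.2])         # researcher-defined weights (assumption)

composite = indicators @ weights            # the formative variable, by definition

# In a formative MIMIC specification, by contrast, a *common factor* would be
# introduced and predicted by these same indicators as exogenous variables;
# estimating that factor model is not the same as forming the composite above.
print(composite[:5].round(3))
```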

Relevance:

30.00%

Publisher:

Abstract:

For wireless power transfer (WPT) systems, communication between the primary side and the pickup side is a challenge because of the large air gap and magnetic interference. A novel method that integrates bidirectional data communication into a high-power WPT system is proposed in this paper. Power and data transfer share the same inductive link between coreless coils. A power/data frequency-division multiplexing technique is applied, in which the power and data are transmitted on different frequency carriers and controlled independently. A circuit model of the multiband system is provided to analyze the transmission gain of the communication channel as well as the power delivery performance. The crosstalk interference between the two carriers is discussed. In addition, the signal-to-noise ratios of the channels are estimated, which gives a guideline for the design of mod/demod circuits. Finally, a 500-W WPT prototype has been built to demonstrate the effectiveness of the proposed WPT system.
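
The sketch below simulates the frequency-division idea on a shared link: a high-power carrier and a low-power ASK data carrier are summed, and the data band is recovered by filtering. All frequencies, amplitudes, and the data rate are assumptions, not the prototype's parameters.

```python
import numpy as np

# Sketch of power/data frequency-division multiplexing on one shared link.
# Carrier frequencies, amplitudes, and the ASK scheme are assumptions.
FS = 20_000_000                       # simulation sample rate, Hz
F_POWER, F_DATA = 85_000, 1_000_000   # assumed power and data carrier frequencies
BIT_RATE = 20_000                     # assumed data rate, bits/s

bits = np.array([1, 0, 1, 1, 0, 1, 0, 0])
n_bit = FS // BIT_RATE
t = np.arange(n_bit * len(bits)) / FS

power_wave = 10.0 * np.sin(2 * np.pi * F_POWER * t)                 # high-power carrier
ask_data = np.repeat(bits, n_bit) * np.sin(2 * np.pi * F_DATA * t)  # on-off keyed data
link_signal = power_wave + 0.1 * ask_data                           # both on the same link

# Receiver: isolate the data band around F_DATA with an FFT brick-wall filter,
# then detect the envelope in each bit interval.
spectrum = np.fft.rfft(link_signal)
freqs = np.fft.rfftfreq(len(link_signal), d=1 / FS)
spectrum[(freqs < F_DATA - 2 * BIT_RATE) | (freqs > F_DATA + 2 * BIT_RATE)] = 0
data_band = np.fft.irfft(spectrum, n=len(link_signal))

envelope = np.abs(data_band).reshape(len(bits), n_bit).mean(axis=1)
decoded = (envelope > envelope.max() / 2).astype(int)
assert list(decoded) == list(bits)
```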

Relevance:

30.00%

Publisher:

Abstract:

Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that for many transportation problems, including AADT estimation, spatial context is important. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR-based methods considered the influence of correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations farther away. The study area was Broward County, Florida. Broward County lies on the Atlantic coast between Palm Beach and Miami-Dade counties. In this study, a total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal households) were selected to develop the models. To investigate the predictive power of the various AADT predictors over space, statistics including local r-square, local parameter estimates, and local errors were examined and mapped. The local variations in relationships among parameters were investigated, measured, and mapped to assess the usefulness of the GWR methods. The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and its predictors, which cannot be modeled in ordinary linear regression.
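
A compact GWR sketch on synthetic data follows, fitting a weighted least-squares model at each location with Gaussian kernel weights; the bandwidth and the spatially drifting coefficient are assumptions made for illustration.

```python
import numpy as np

# GWR sketch on synthetic data: a local weighted least-squares fit per location.
rng = np.random.default_rng(0)
n = 300
coords = rng.uniform(0, 10, size=(n, 2))              # observation locations
x = rng.normal(size=n)                                # one predictor (e.g., lanes)
beta_true = 1.0 + 0.3 * coords[:, 0]                  # coefficient drifts eastward
y = 2.0 + beta_true * x + rng.normal(0, 0.2, size=n)  # synthetic AADT-like response

X = np.column_stack([np.ones(n), x])
bandwidth = 2.0                                       # kernel bandwidth (assumed)

def local_fit(i):
    """Weighted least squares centered on observation i."""
    d = np.linalg.norm(coords - coords[i], axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)            # Gaussian kernel weights
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # [local intercept, local slope]

local_betas = np.array([local_fit(i) for i in range(n)])
print("local slope range:", local_betas[:, 1].min().round(2),
      "to", local_betas[:, 1].max().round(2))
```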

Relevance:

30.00%

Publisher:

Abstract:

This research pursued the conceptualization, implementation, and verification of a system that enhances digital information displayed on an LCD panel for users with visual refractive errors. The target user groups for this system are individuals who have moderate to severe visual aberrations for which conventional means of compensation, such as glasses or contact lenses, do not improve their vision. This research is based on a priori knowledge of the user's visual aberration, as measured by a wavefront analyzer. With this information it is possible to generate images that, when displayed to this user, will counteract his or her visual aberration. The method described in this dissertation advances the development of techniques for providing such compensation by integrating spatial information in the image as a means to eliminate some of the shortcomings inherent in using display devices such as monitors or LCD panels. Additionally, physiological considerations are discussed and integrated into the method for providing said compensation. In order to provide a realistic sense of the performance of the methods described, they were tested by mathematical simulation in software, by using a single-lens high-resolution CCD camera that models an aberrated eye, and finally with human subjects having various forms of visual aberrations. Experiments were conducted on these systems, and the data collected from these experiments were evaluated using statistical analysis. The experimental results revealed that the pre-compensation method resulted in a statistically significant improvement in vision for all of the systems. Although significant, the improvement was not as large as expected for the human-subject tests. Further analysis suggests that, even under the controlled conditions employed for testing with human subjects, the characterization of the eye may be changing. This would require real-time monitoring of relevant variables (e.g., pupil diameter) and continuous adjustment in the pre-compensation process to yield maximum viewing enhancement.
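
A minimal sketch of the pre-compensation idea with a Wiener-style inverse filter: given a point spread function (PSF) standing in for the measured aberration, the displayed image is pre-filtered so that the eye's own blur approximately cancels it. The Gaussian PSF and the regularization constant are assumptions; the dissertation's method additionally integrates spatial information and physiological considerations not modeled here.

```python
import numpy as np

# Pre-compensation sketch: regularized inverse filtering by an assumed PSF.
def gaussian_psf(size, sigma):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

def precompensate(image, H, k=0.01):
    """Wiener-style inverse filtering of the image by the eye's OTF H."""
    G = np.conj(H) / (np.abs(H) ** 2 + k)             # regularized inverse filter
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(out, 0.0, 1.0)                     # displayable intensity range

rng = np.random.default_rng(0)
image = rng.random((128, 128))                        # stand-in for screen content
psf = gaussian_psf(128, sigma=2.0)                    # stand-in for the aberrated eye
H = np.fft.fft2(np.fft.ifftshift(psf))                # optical transfer function

compensated = precompensate(image, H)
perceived = np.real(np.fft.ifft2(np.fft.fft2(compensated) * H))  # eye re-blurs it
print("mean |perceived - original| =", np.abs(perceived - image).mean().round(4))
```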

Relevance:

30.00%

Publisher:

Abstract:

Because past research has shown that faculty are the driving force affecting students' academic library use, librarians have tried for decades to engage classroom faculty in library activities. Nevertheless, a low rate of library use by faculty on behalf of their students persists. This study investigated the organizational culture dimensions affecting faculty demand for the library at a community college. The study employed a sequential quantitative-qualitative research design. A random sample of full-time faculty at a large urban community college responded to a 46-item survey. The survey data showed strong espoused support (84%) for the use of library-based materials but a much lower incidence of putting this construct into practice (46%). Interviews were conducted with 11 full-time faculty from two academic groups, English-Humanities and Engineering-Math-Science. These groups were selected because the survey data showed statistically significant differences between them on several key variables. These variables concerned the professors' perceptions of the importance of library research in their discipline, the amount of time spent on the course textbook during a term, the frequency of conversations about the library in the academic department, and the professors' ratings of the librarians' skill in instruction related to the academic discipline. All interviewees described the student culture as the predominant organizational culture at Major College. Although most interview subjects held to high information literacy standards in their courses, others were less convinced these could realistically be practiced, based on a perception of students' poor academic skills, students' lack of time to complete assignments due to commuter and family responsibilities, and the need to focus on textbook content. Recommended future research would involve investigating methods to bridge the gap between the high espoused value placed on information literacy and the implementation of information-literate coursework.

Relevance:

30.00%

Publisher:

Abstract:

Numerical optimization is a technique in which a computer is used to explore design-parameter combinations to find extremes in performance factors. In multi-objective optimization, several performance factors can be optimized simultaneously. The solution to a multi-objective optimization problem is not a single design but a family of optimized designs referred to as the Pareto frontier. The Pareto frontier is a trade-off curve in the objective function space composed of solutions where performance in one objective function is traded for performance in others. A Multi-Objective Hybridized Optimizer (MOHO) was created for the purpose of solving multi-objective optimization problems by utilizing a set of constituent optimization algorithms. MOHO tracks the development of the Pareto frontier approximation and automatically switches among its constituent evolutionary optimization algorithms to speed the formation of an accurate Pareto frontier approximation. Aerodynamic shape optimization is one of the oldest applications of numerical optimization. MOHO was used to perform shape optimization on a 0.5-inch ballistic penetrator traveling at Mach 2.5. Two objectives were optimized simultaneously: minimize aerodynamic drag and maximize penetrator volume. This problem was solved twice. The first time, the problem was solved using Modified Newton Impact Theory (MNIT) to determine the pressure drag on the penetrator. In the second solution, a Parabolized Navier-Stokes (PNS) solver that includes viscosity was used to evaluate the drag on the penetrator. The studies show the difference in the optimized penetrator shapes when viscosity is absent and when it is present in the optimization. In modern optimization problems, objective function evaluations may require many hours on a computer cluster. One solution is to create a response surface that models the behavior of the objective function. Once enough data about the behavior of the objective function have been collected, a response surface can be used to represent the actual objective function in the optimization process. The Hybrid Self-Organizing Response Surface Method (HYBSORSM) algorithm was developed and used to build response surfaces of objective functions. HYBSORSM was evaluated using a suite of 295 non-linear functions involving from 2 to 100 variables, demonstrating the robustness and accuracy of HYBSORSM.
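
A short sketch of extracting a Pareto frontier from candidate designs under the two objectives named above, minimizing drag and maximizing volume; the candidate points are random placeholders rather than MOHO output.

```python
import numpy as np

# Non-dominated filtering: minimize drag, maximize volume (random placeholder designs).
rng = np.random.default_rng(0)
drag = rng.uniform(1.0, 5.0, size=200)          # objective 1: minimize
volume = rng.uniform(0.2, 1.0, size=200)        # objective 2: maximize

def pareto_mask(drag, volume):
    """True for designs not dominated by any other design."""
    keep = np.ones(len(drag), dtype=bool)
    for i in range(len(drag)):
        dominated = ((drag <= drag[i]) & (volume >= volume[i])
                     & ((drag < drag[i]) | (volume > volume[i])))
        keep[i] = not dominated.any()
    return keep

front = pareto_mask(drag, volume)
order = np.argsort(drag[front])
print("Pareto-optimal designs:", front.sum())
print(np.column_stack([drag[front][order], volume[front][order]]).round(2)[:5])
```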

Relevance:

30.00%

Publisher:

Abstract:

This study examined the predictive merits of selected cognitive and noncognitive variables on the national Registry exam pass rate using 2008 graduates (n = 175) from community college radiography programs in Florida. The independent variables included two GPAs, final grades in five radiography courses, self-efficacy, and social support. The dependent variable was the first-attempt result on the national Registry exam. The design was a retrospective predictive study that relied on academic data collected from participants using the self-report method and on perceptions of students' success on the national Registry exam collected through a questionnaire developed and piloted in the study. All independent variables except self-efficacy and social support correlated with success on the national Registry exam (p < .01) in Pearson product-moment correlation analyses. The strongest predictor of national Registry exam success was the end-of-program GPA, r = .550, p < .001. The GPAs and the scores for self-efficacy and social support were entered into a logistic regression analysis to produce a prediction model. The end-of-program GPA (p = .015) emerged as a significant variable. This model predicted 44% of the students who failed the national Registry exam and 97.3% of those who passed, explaining 45.8% of the variance. A second model included the final grades for the radiography courses, self-efficacy, and social support. Three courses significantly predicted national Registry exam success: Radiographic Exposures, p < .001; Radiologic Physics, p = .014; and Radiation Safety & Protection, p = .044, explaining 56.8% of the variance. This model predicted 64% of the students who failed the national Registry exam and 96% of those who passed. The findings support the use of in-program data as accurate predictors of success on the national Registry exam.
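
For illustration only, a logistic-regression prediction model of the kind used in the study, fitted to synthetic data in which end-of-program GPA dominates; the coefficients and pass rates below are fabricated and are not the study's results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

# Synthetic cohort (n = 175, matching the study's sample size only).
rng = np.random.default_rng(0)
n = 175
gpa = rng.normal(3.2, 0.4, n)                       # end-of-program GPA (synthetic)
self_efficacy = rng.normal(4.0, 0.5, n)             # survey score (synthetic)
logit = -8.0 + 3.0 * gpa + 0.1 * self_efficacy      # GPA dominates, by assumption
passed = rng.random(n) < 1 / (1 + np.exp(-logit))   # first-attempt Registry result

X = np.column_stack([gpa, self_efficacy])
model = LogisticRegression().fit(X, passed)

print("coefficients (gpa, self-efficacy):", model.coef_.round(2))
print(confusion_matrix(passed, model.predict(X)))
```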

Relevance:

30.00%

Publisher:

Abstract:

This dissertation aimed to improve travel time estimation for the purpose of transportation planning by developing a travel time estimation method that incorporates the effects of signal timing plans, which were difficult to consider in planning models. For this purpose, an analytical model has been developed. The model parameters were calibrated based on data from CORSIM microscopic simulation, with signal timing plans optimized using the TRANSYT-7F software. The independent variables in the model are link length, free-flow speed, and traffic volumes from the competing turning movements. The developed model has three advantages compared with traditional link-based or node-based models. First, the model considers the influence of signal timing plans for a variety of traffic volume combinations without requiring signal timing information as input. Second, the model describes the non-uniform spatial distribution of delay along a link, and is thus able to estimate the impact of queues at different locations upstream of an intersection and to attribute delays to the subject link and the upstream link. Third, the model shows promise of improving the accuracy of travel time prediction. The mean absolute percentage error (MAPE) of the model is 13% for a set of field data from the Minnesota Department of Transportation (MDOT); this is close to the MAPE of the uniform delay in the HCM 2000 method (11%). The HCM is the industry-accepted analytical model in the existing literature, but it requires signal timing information as input for calculating delays. The developed model also outperforms the HCM 2000 method for a set of Miami-Dade County data representing congested traffic conditions, with a MAPE of 29%, compared with 31% for the HCM 2000 method. The advantages of the proposed model make it feasible for application to a large network without the burden of signal timing input, while improving the accuracy of travel time estimation. An assignment model with the developed travel time estimation method has been implemented in a South Florida planning model, which improved assignment results.
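
The accuracy measure quoted above, mean absolute percentage error (MAPE), is straightforward to reproduce; the travel times below are placeholders, not the MDOT or Miami-Dade data.

```python
import numpy as np

# MAPE on made-up observed and estimated link travel times (seconds).
observed = np.array([42.0, 55.0, 61.0, 38.0, 70.0])    # field travel times (placeholder)
estimated = np.array([45.0, 50.0, 66.0, 41.0, 64.0])   # model travel times (placeholder)

mape = np.mean(np.abs(estimated - observed) / observed) * 100
print(f"MAPE = {mape:.1f}%")
```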

Relevance:

30.00%

Publisher:

Abstract:

In this dissertation, I present an integrated model of organizational performance. Most prior research has relied extensively on testing individual linkages, often with cross-sectional data. In this dissertation, longitudinal unit-level data from 559 restaurants, collected over a one-year period, were used to test the proposed model. The model was hypothesized to begin with employee satisfaction as a key antecedent that would ultimately lead to improved financial performance. Several variables, including turnover, efficiency, and guest satisfaction, are proposed as mediators of the satisfaction-performance relationship. The current findings replicate and extend past research using individual-level data. The overall model adequately explained the data, but was significantly improved with an additional link from employee satisfaction to efficiency, which was not originally hypothesized. Management turnover was a strong predictor of hourly-level team turnover, and both were significant predictors of efficiency. Full findings for each hypothesis are presented, and practical organizational implications are given. Limitations and recommendations for future research are provided.
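
A reduced sketch of testing one mediated path of the hypothesized kind (employee satisfaction to efficiency to financial performance) with ordinary regressions on synthetic unit-level data; the data and coefficients are fabricated, and the full model includes additional mediators such as turnover and guest satisfaction.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic unit-level data (n = 559, matching only the sample size).
rng = np.random.default_rng(0)
n = 559
satisfaction = rng.normal(3.8, 0.5, n)
efficiency = 0.4 * satisfaction + rng.normal(0, 0.3, n)
performance = 0.6 * efficiency + 0.1 * satisfaction + rng.normal(0, 0.3, n)
df = pd.DataFrame({"satisfaction": satisfaction,
                   "efficiency": efficiency,
                   "performance": performance})

# Path a: satisfaction -> efficiency; path b: efficiency -> performance,
# controlling for satisfaction (simple regression-based mediation test).
path_a = smf.ols("efficiency ~ satisfaction", data=df).fit()
path_b = smf.ols("performance ~ efficiency + satisfaction", data=df).fit()

print("a (satisfaction -> efficiency):", round(path_a.params["satisfaction"], 3))
print("b (efficiency -> performance):", round(path_b.params["efficiency"], 3))
print("indirect effect a*b:", round(path_a.params["satisfaction"]
                                    * path_b.params["efficiency"], 3))
```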