935 results for ElGamal, CZK, Multiple discrete logarithm assumption, Extended linear algebra
Abstract:
AMS subject classification: 90C29.
Abstract:
2000 Mathematics Subject Classification: 35B50, 35L15.
Abstract:
Two-stage data envelopment analysis (DEA) efficiency models identify the efficient frontier of a two-stage production process. In some two-stage processes, the inputs to the first stage are also used by the second stage; these are known as shared inputs. This paper proposes a new relational linear DEA model for measuring the efficiency score of two-stage processes with shared inputs under the constant returns-to-scale assumption. Two case studies, one from the banking industry and one from university operations, illustrate the potential applications of the proposed approach.
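A minimal sketch may make the setting concrete: the standard single-stage, input-oriented CRS (CCR) efficiency score computed by linear programming, which is the building block that the paper's two-stage relational model generalizes. The toy data, function name, and use of scipy are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch: single-stage, input-oriented CRS (CCR) DEA score.
# Data below are made up; the two-stage shared-input model is in the paper.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU `o`. X: (n_dmu, n_in) inputs, Y: (n_dmu, n_out) outputs."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]              # minimize theta
    A_in = np.c_[-X[o], X.T]                 # sum_j lam_j*x_ij - theta*x_io <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]  # -sum_j lam_j*y_rj <= -y_ro
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[1]), -Y[o]]
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (n + 1))
    return res.x[0]

X = np.array([[2., 3.], [4., 1.], [3., 3.]])  # toy inputs
Y = np.array([[1.], [1.], [1.]])              # toy outputs
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```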
Abstract:
The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis (MVS) methodology. The somewhat small size and variability of these data sets, however, limit their scope and the conclusions that can be derived from them. To facilitate further development within MVS, we here present a new and varied data set consisting of 80 scenes, seen from 49 or 64 accurate camera positions. This is accompanied by accurate structured light scans for reference and evaluation. In addition, all images are taken under seven different lighting conditions. As a benchmark, and to validate the use of our data set for obtaining reasonable and statistically significant findings about MVS, we have applied the three state-of-the-art MVS algorithms by Campbell et al., Furukawa et al., and Tola et al. to the data set. To do this we have extended the evaluation protocol from the Middlebury evaluation, necessitated by the more complex geometry of some of our scenes. The data set and accompanying evaluation framework are made freely available online. Based on this evaluation, we are able to observe several characteristics of state-of-the-art MVS, e.g., that there is a tradeoff between the quality of the reconstructed 3D points (accuracy) and how much of an object’s surface is captured (completeness). Also, several issues that we hypothesized would challenge MVS, such as specularities and changing lighting conditions, did not pose serious problems. Our study finds that the two most pressing issues for MVS are lack of texture and meshing (forming 3D points into closed triangulated surfaces).
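As a concrete illustration of the accuracy/completeness trade-off the evaluation measures, here is a hedged sketch, not the benchmark's exact protocol: accuracy summarizes distances from reconstructed points to the reference scan, completeness summarizes distances from the reference to the reconstruction. The random point clouds and the median statistic are illustrative assumptions.

```python
# Hedged sketch of the accuracy/completeness idea behind MVS evaluation.
import numpy as np
from scipy.spatial import cKDTree

def accuracy_completeness(recon, reference):
    acc = cKDTree(reference).query(recon)[0]   # recon -> nearest reference point
    comp = cKDTree(recon).query(reference)[0]  # reference -> nearest recon point
    return np.median(acc), np.median(comp)

rng = np.random.default_rng(0)
reference = rng.uniform(size=(5000, 3))
recon = reference[:3000] + rng.normal(scale=0.01, size=(3000, 3))  # partial, noisy
acc, comp = accuracy_completeness(recon, reference)
print(f"accuracy (median) = {acc:.4f}, completeness (median) = {comp:.4f}")
```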
Abstract:
A high-performance fuel gauging sensor is described that uses five diaphragm-based pressure sensors, which are monitored using a linear array of polymer optical fiber Bragg gratings. The sensors were initially characterized using water, revealing a sensitivity of 98 pm/cm for four of the sensors and 86 pm/cm for the fifth. The discrepancy in the sensitivity of the fifth sensor is attributed to the annealing of the other four sensors. Initial testing in JET A-1 aviation fuel revealed the unsuitability of silicone rubber diaphragms for prolonged use in fuel. A second set of sensors manufactured with a polyurethane-based diaphragm showed no measurable deterioration over a three-month period immersed in fuel. These sensors exhibited a sensitivity of 39 pm/cm, lower than that of the silicone rubber devices due to the stiffer polyurethane material.
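The reported sensitivities imply a simple gauging relation, sketched below under the assumption of a linear calibration: the fuel column height is the measured Bragg wavelength shift divided by the per-sensor slope. The function name and example numbers are illustrative.

```python
# Hedged sketch of the gauging relation implied by the reported sensitivities.
SENSITIVITY_PM_PER_CM = 39.0          # polyurethane diaphragm in JET A-1 (paper)

def level_cm(shift_pm, sensitivity=SENSITIVITY_PM_PER_CM):
    """Fuel column height above the diaphragm, from the Bragg shift in pm."""
    return shift_pm / sensitivity

print(level_cm(780.0))                # a 780 pm shift -> 20 cm of fuel
```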
Abstract:
The objectives of this research are to analyze and develop a modified Principal Component Analysis (PCA) and to develop a two-dimensional PCA with applications in image processing. PCA is a classical multivariate technique whose mathematical treatment is based purely on the eigensystem of positive-definite symmetric matrices. Its main function is to statistically transform a set of correlated variables into a new set of uncorrelated variables over $\mathbb{R}^n$ while retaining most of the variation present in the original variables. The variances of the Principal Components (PCs) obtained from the modified PCA form a correlation matrix of the original variables. The decomposition of this correlation matrix into a diagonal matrix produces an orthonormal basis that can be used to linearly transform the given PCs. It is this linear transformation that reproduces the original variables. The two-dimensional PCA can be devised as two successive applications of one-dimensional PCA. It can be shown that, for an $m\times n$ matrix, the PCs obtained from the two-dimensional PCA are the singular values of that matrix. In this research, several applications for image analysis based on PCA are developed, namely edge detection, feature extraction, and multi-resolution PCA decomposition and reconstruction.
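A minimal sketch of the classical PCA described here, on synthetic data: the eigensystem of the symmetric covariance matrix yields uncorrelated components ordered by retained variance.

```python
# Minimal sketch: classical PCA via the eigensystem of a symmetric
# positive-semidefinite covariance matrix. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))   # correlated variables
Xc = X - X.mean(axis=0)                                   # center
C = Xc.T @ Xc / (len(Xc) - 1)                             # covariance matrix
evals, evecs = np.linalg.eigh(C)                          # symmetric eigensystem
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
pcs = Xc @ evecs                                          # uncorrelated PCs
print(np.round(np.cov(pcs.T), 6))                         # ~diagonal = evals
print("variance retained by 2 PCs:", evals[:2].sum() / evals.sum())
```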
Abstract:
The market model is the most frequently estimated model in financial economics and has proven extremely useful in the estimation of systematic risk. In this era of rapid globalization of financial markets there has been a substantial increase in cross listings of stocks in foreign and regional capital markets. As many as a third to a half of the stocks in some major exchanges are foreign listed. The multiple listing of stocks has major implications for the estimation of systematic risk. The traditional method of estimating the market model by using data from only one market will lead to misleading estimates of beta. This study demonstrates that the estimator for systematic risk, and the methodology itself, changes when stocks are listed in multiple markets. General expressions are developed to obtain the estimator of global beta under a variety of assumptions about the error terms of the market models for different capital markets. The assumptions pertain both to the volatilities of the abnormal returns in each market and to the relationship between the markets. Explicit expressions are derived for the estimation of the global systematic risk beta when the returns are homoscedastic and also under different heteroscedastic conditions, both within and between markets. These results for the estimation of global beta are further extended to the case when the return-generating process follows an autoregressive scheme.
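For concreteness, a hedged sketch of the single-market baseline the study generalizes: the market model R_i = alpha + beta*R_m + e, with beta estimated by OLS. The global-beta estimator developed in the study would pool several such markets; the simulated returns below are illustrative only.

```python
# Hedged sketch: single-market market-model beta by OLS on simulated returns.
import numpy as np

rng = np.random.default_rng(1)
r_market = rng.normal(0.0005, 0.01, size=500)             # market returns
true_beta = 1.3
r_stock = 0.0001 + true_beta * r_market + rng.normal(0, 0.005, size=500)

X = np.c_[np.ones_like(r_market), r_market]               # [1, R_m] design
alpha_hat, beta_hat = np.linalg.lstsq(X, r_stock, rcond=None)[0]
print(f"estimated beta = {beta_hat:.3f}")                 # ~1.3
```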
Abstract:
Recently, wireless network technology has grown at such a pace that scientific research has become a practical reality in a very short time span. Mobile wireless communications have witnessed the adoption of several generations, each complementing and improving on the previous one. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers the ability to guarantee reliable communications at rates from 100 Mbps, for high-mobility links, to as high as 1 Gbps for low-mobility users, together with high efficiency in spectrum usage, to be among the most important characteristics of future 4G mobile systems. On mobile wireless communications networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable, and so they must rely upon satellite coverage. A good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. This technique must adapt to the characteristics of the satellite channel and also be efficient in the use of the allocated bandwidth. Satellite links are fading channels when used by mobile users. Measures designed to cope with these fading environments make use of: (1) spatial diversity (a two-receive-antenna configuration); (2) time diversity (channel interleaving/spreading techniques); and (3) upper-layer FEC. The author proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) for the satellite link, increasing the time diversity. This technique will allow for an increase in the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation addresses the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. For this purpose, a good channel model is necessary.
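A minimal sketch of the OFDM principle proposed for the satellite link, under idealized assumptions (no channel, illustrative parameter values): symbols are mapped onto orthogonal subcarriers with an IFFT, and a cyclic prefix is prepended to absorb delay spread.

```python
# Minimal OFDM sketch: QPSK symbols on orthogonal subcarriers via IFFT,
# plus a cyclic prefix. Parameters are illustrative, channel is ideal.
import numpy as np

N_SUBCARRIERS, CP_LEN = 64, 16
rng = np.random.default_rng(0)

bits = rng.integers(0, 2, size=2 * N_SUBCARRIERS)
qpsk = (1 - 2 * bits[0::2] + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)  # QPSK map

tx = np.fft.ifft(qpsk)                      # orthogonal subcarriers
tx = np.r_[tx[-CP_LEN:], tx]                # prepend cyclic prefix

rx = tx[CP_LEN:]                            # ideal channel: strip the prefix
symbols = np.fft.fft(rx)                    # back to the frequency domain
print(np.allclose(symbols, qpsk))           # True: perfect recovery
```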
Abstract:
This research studies a hybrid flow shop problem that has parallel batch-processing machines in one stage and discrete-processing machines in the other stages, processing jobs of arbitrary sizes. The objective is to minimize the makespan for a set of jobs. The problem is denoted FF|batch1, s_j|C_max. The problem is formulated as a mixed-integer linear program. The commercial solver AMPL/CPLEX is used to solve problem instances to optimality. Experimental results show that AMPL/CPLEX requires considerable time to find the optimal solution even for a small problem, e.g., a 6-job instance requires 2 hours on average. A bottleneck-first-decomposition (BFD) heuristic is proposed in this study to overcome the computational (time) problem encountered while using the commercial solver. The proposed BFD heuristic is inspired by the shifting bottleneck heuristic. It decomposes the entire problem into three sub-problems and schedules the sub-problems one by one. The proposed BFD heuristic consists of four major steps: formulating sub-problems, prioritizing sub-problems, solving sub-problems, and re-scheduling. For solving the sub-problems, two heuristic algorithms are proposed: one for scheduling a hybrid flow shop with discrete processing machines, and the other for scheduling parallel batching machines (single stage). Both consider job arrival and delivery times. A designed experiment is conducted to evaluate the effectiveness of the proposed BFD, which is further evaluated against a set of common heuristics including a randomized greedy heuristic and five dispatching rules. The results show that the proposed BFD heuristic outperforms all these algorithms. To evaluate the quality of the heuristic solution, a procedure is developed to calculate a lower bound on the makespan for the problem under study. The lower bound obtained is tighter than other bounds developed for related problems in the literature. A meta-search approach based on the Genetic Algorithm concept is developed to evaluate the significance of further improving the solution obtained from the proposed BFD heuristic. The experiment indicates that it reduces the makespan by 1.93% on average within negligible time when the problem size is less than 50 jobs.
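To make the batching sub-problem concrete, here is a hedged sketch of scheduling a single batch-processing machine with arbitrary job sizes. It uses a simple first-fit-decreasing packing rule as an illustrative stand-in, not the BFD heuristic's actual steps: jobs are packed into capacity-limited batches, a batch runs as long as its longest job, and the machine makespan is the sum over batches.

```python
# Hedged sketch of the single-stage batching sub-problem: first-fit-decreasing
# packing (illustrative rule); batch time = max job time; makespan = batch sum.
def batch_makespan(jobs, capacity):
    """jobs: list of (size, proc_time); returns (makespan, batches)."""
    batches = []                                  # each: [used_size, max_time, members]
    for size, t in sorted(jobs, key=lambda j: -j[1]):
        for b in batches:                         # first batch with room
            if b[0] + size <= capacity:
                b[0] += size
                b[1] = max(b[1], t)
                b[2].append((size, t))
                break
        else:
            batches.append([size, t, [(size, t)]])
    return sum(b[1] for b in batches), [b[2] for b in batches]

jobs = [(3, 10), (2, 4), (4, 7), (1, 9), (2, 2)]  # (size, processing time)
cmax, batches = batch_makespan(jobs, capacity=5)
print(cmax, batches)
```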
Abstract:
Amidst concerns about achieving high levels of technology to remain competitive in the global market without compromising economic development, national economies are experiencing a high demand for human capital. As higher education is assumed to be the main source of human capital, this analysis focused on a more specific and less explored area of the generally accepted idea that higher education contributes to economic growth. The purpose of this study, therefore, was to find whether higher education also contributes to economic development, and whether that contribution is more substantial in a globalized context. Consequently, a multiple linear regression analysis was conducted to answer, with statistical significance, the research question: Does higher education contribute to economic development in the context of globalization? The information analyzed was obtained from historical data for 91 selected countries over a 10-year period (1990–2000). Some variables, however, were lagged back 5, 10, or 15 years along a 15-year timeframe (1975–1990). The resulting comparative static model was based on the Cobb-Douglas production function and the Solow model to specify economic growth as a function of physical capital, labor, technology, and productivity. Then, formal education, economic development, and globalization were added to the equation. The findings of this study supported the assumption that the independent contribution of the changes in higher education completion and globalization to changes in economic growth is more substantial than the contribution of their interaction. The results also suggested that changes in higher and secondary education completion contribute much more to changes in economic growth in less developed countries than in their more developed counterparts. As a conclusion, based on the results of this study, I proposed the implementation of public policy in less developed countries to promote and expand adequate secondary and higher education systems with the purpose of helping achieve economic development. I also recommended further research efforts on this topic to emphasize the contribution of education to the economy, mainly in less developed countries.
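A hedged sketch of the kind of specification described: a Cobb-Douglas production function made linear in logs and fit by multiple linear regression, with an education regressor added to the Solow-style baseline. The data, coefficients, and single education variable below are simulated assumptions, not the study's data set.

```python
# Hedged sketch: log-linear Cobb-Douglas, Y = A * K^a * L^b * edu^c, fit by
# multiple linear regression on simulated data (91 "countries", as in the study).
import numpy as np

rng = np.random.default_rng(0)
n = 91
K, L, edu = rng.lognormal(size=(3, n))
Y = 2.0 * K**0.3 * L**0.6 * np.exp(0.1 * np.log(edu) + rng.normal(0, 0.05, n))

X = np.c_[np.ones(n), np.log(K), np.log(L), np.log(edu)]
coef = np.linalg.lstsq(X, np.log(Y), rcond=None)[0]
print(np.round(coef, 3))                         # ~[log 2, 0.3, 0.6, 0.1]
```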
Abstract:
The purpose of this study was to determine whether there was a relationship between pressure to perform on state-mandated, high-stakes tests and the rate of student escape behavior, defined as the number of school suspensions and absences. The state-assigned grade of a school was used as a surrogate measure of pressure, with the assumption that pressure increased as the school grade decreased. Student attendance and suspension data were gathered from all 33 of the regular public high schools in Miami-Dade County Public Schools. The research questions were: Is the number of suspensions highest in the third quarter, when most FCAT preparation takes place, for each of the 3 school years 2007-08 through 2009-10? How accurately does the high school's grade predict the number of suspensions and number of absences during each of the 4 school years 2005-06 through 2008-09? The research questions were answered using repeated measures analysis of variance for research question #1 and non-linear multiple regression for research question #2. No significant difference was found among the numbers of suspensions in the grading periods, nor was there a relationship between the number of suspensions and school grade. A statistically significant relationship was found between student attendance and school grade. When plotted, this relationship was found to be quadratic in nature and formed a loose inverted U for each of the four years during which data were collected. This indicated that students in very high and very low performing schools had low levels of absences, while those in the midlevel of the distribution of school performance (C schools) had the greatest rates of absence. Identifying a relationship between the pressures associated with high-stakes testing and student escape behavior suggests that it might be useful for building administrators to reevaluate the test preparation activities and procedures used in their buildings and to include anxiety-reducing strategies. As a relationship was found, this study sets the foundation for future research to identify whether testing-related activities are affecting some students emotionally and causing unintended consequences of testing mandates.
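A small sketch of the quadratic (inverted-U) fit underlying research question #2, assuming a numeric school-grade score and simulated absence counts; a negative coefficient on the squared term produces the inverted U reported.

```python
# Hedged sketch: quadratic regression of absences on a numeric school-grade
# score; simulated data for 33 schools, as in the study's sample size.
import numpy as np

rng = np.random.default_rng(0)
grade = rng.uniform(1, 5, size=33)                        # 33 high schools
absences = -4 * (grade - 3) ** 2 + 40 + rng.normal(0, 2, size=33)

b2, b1, b0 = np.polyfit(grade, absences, deg=2)           # quadratic fit
print(f"absences ~ {b2:.2f}*g^2 + {b1:.2f}*g + {b0:.2f}")  # b2 < 0: inverted U
```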
Abstract:
The multiple linear regression model plays a key role in statistical inference and has extensive applications in business, environmental, physical, and social sciences. Multicollinearity has been a considerable problem in multiple regression analysis. When the regressor variables are multicollinear, it becomes difficult to make precise statistical inferences about the regression coefficients. The statistical methods discussed in this thesis to address this problem are the ridge regression, Liu, two-parameter biased, and LASSO estimators. Firstly, an analytical comparison on the basis of risk was made among the ridge, Liu, and LASSO estimators under the orthonormal regression model. I found that LASSO dominates the least squares, ridge, and Liu estimators over a significant portion of the parameter space in large dimensions. Secondly, a simulation study was conducted to compare the performance of the ridge, Liu, and two-parameter biased estimators by the mean squared error criterion. I found that the two-parameter biased estimator performs better than the corresponding ridge regression estimator. Overall, the Liu estimator performs better than both the ridge and two-parameter biased estimators.
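For concreteness, a minimal sketch of the ridge estimator compared in the thesis, beta_hat = (X'X + kI)^(-1) X'y, against OLS on deliberately collinear simulated data; the Liu, two-parameter biased, and LASSO estimators apply different shrinkage in the same spirit. The data and ridge constant k are illustrative.

```python
# Minimal sketch: ridge vs OLS on near-collinear regressors.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.c_[x1, x1 + rng.normal(scale=0.01, size=100)]      # near-collinear design
y = X @ np.array([1.0, 1.0]) + rng.normal(scale=0.1, size=100)

ols = np.linalg.solve(X.T @ X, X.T @ y)
k = 0.1                                                   # illustrative ridge constant
ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
print("OLS:  ", np.round(ols, 2))                         # unstable coefficients
print("ridge:", np.round(ridge, 2))                       # shrunk, more stable
```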
Abstract:
Many U.S. students do not perform well on mathematics assessments with respect to algebra topics such as linear functions, a building block for other functions. Poor achievement of U.S. middle school students in this topic is a problem. U.S. eighth graders have had average mathematics scores on international comparison tests such as the Third International Mathematics and Science Study, later known as the Trends in International Mathematics and Science Study (TIMSS), in 1995, 1999, and 2003, while Singapore students have had the highest average scores. U.S. eighth grade average mathematics scores improved on TIMSS 2007 and held steady on TIMSS 2011. Results from PISA 2009 and 2012 and from the National Assessment of Educational Progress of 2007, 2009, and 2013 showed a lack of proficiency in algebra. Results of curriculum studies involving nations in TIMSS suggest that elementary and middle grades textbooks in high-scoring countries differed from those in the U.S. with respect to general features. The purpose of this study was to compare treatments of linear functions in Singapore and U.S. middle grades mathematics textbooks. Results revealed features currently in textbooks. Findings should be valuable to constituencies who wish to improve U.S. mathematics achievement. Portions of eight Singapore and nine U.S. middle school student texts pertaining to linear functions were compared with respect to 22 features in three categories: (a) background features, (b) general features of problems, and (c) specific characterizations of problem practices, problem-solving competency types, and transfer of representation. Features were coded using a codebook developed by the researcher. Tallies and percentages were reported. Welch's t-tests and chi-square tests were used, respectively, to determine whether texts differed significantly on the features and whether codes were independent of country. U.S. and Singapore textbooks differed in page appearance and in the number of pages, problems, and images. Texts were similar in problem appearance. Differences in problems related to the assessment of conceptual learning. U.S. texts contained more problems requiring (a) use of definitions, (b) single computation, (c) interpreting, and (d) multiple responses. These differences may stem from cultural differences in attitudes toward education. Future studies should focus on page density, the spiral approach, and multiple-response problems.
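A hedged sketch of the chi-square test of independence used in the analysis: a country-by-code contingency table of tallied problem features, tested for whether the codes are independent of country. The counts and feature labels below are invented for illustration.

```python
# Hedged sketch: chi-square test of independence on an invented
# country-by-feature-code contingency table of problem tallies.
from scipy.stats import chi2_contingency

#                 single-comp  interpreting  multi-response
table = [[120, 80, 60],        # U.S. texts (invented counts)
         [70, 95, 30]]         # Singapore texts (invented counts)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```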
Abstract:
Recently, wireless network technology has grown at such a pace that scientific research has become a practical reality in a very short time span. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. The researcher considers the ability to guarantee reliable communications at high data rates, together with high efficiency in spectrum usage, to be among the most important characteristics of future 4G mobile systems. On mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable. The results show that a good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. The dissertation proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) for the satellite link, increasing the time diversity. This technique will allow for an increase in the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation addresses the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. The issue of Cooperative Satellite Communications is solved through a new algorithm that forwards the received data from the fixed node to the mobile node. This algorithm is very efficient because it avoids unnecessary transmissions and is based on signal-to-noise ratio (SNR) measurements.
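A minimal sketch of the SNR-gated forwarding idea described for the cooperative algorithm, under assumed threshold values: the fixed node relays only when the mobile's direct link is below threshold and the relay link is usable, which is what avoids unnecessary transmissions.

```python
# Hedged sketch: SNR-gated relay decision for cooperative forwarding.
# Threshold and measurements are illustrative assumptions.
def should_forward(direct_snr_db: float, relay_snr_db: float,
                   threshold_db: float = 6.0) -> bool:
    """Relay only if the direct link fails and the relay link is usable."""
    return direct_snr_db < threshold_db <= relay_snr_db

print(should_forward(direct_snr_db=2.0, relay_snr_db=12.0))   # True: forward
print(should_forward(direct_snr_db=9.0, relay_snr_db=12.0))   # False: direct link ok
```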
Abstract:
The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation, and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four-bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is derived using the Lagrangian formulation. A SIMULINK control model of the actuator is developed using the dynamic equation. In addition, the Bond Graph methodology is used for dynamic simulation. The Bond Graph model comprises individual component models of the actuator along with the control. The required torque was simulated using the Bond Graph model. Results indicate that high acceleration (around 20 g) can be achieved with a modest torque input (3 N·m or less). A practical prototype of the actuator was designed using SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10 g at the middle point of the travel when the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate standalone, and later to be used in a 3RPR parallel robot. A DC motor is used to drive the actuator, with a quadrature encoder attached to the motor to control the end effector. The associated control scheme of the actuator is analyzed and integrated with the physical prototype. In standalone experiments, the end effector achieved around 17 g of acceleration (over a stroke from 0.2 m to 0.78 m), and the results are in good agreement with the developed dynamic model. Finally, a Design of Experiments (DOE) based statistical approach is introduced to identify the parametric combination that yields the greatest performance, with data collected using the Bond Graph model. This approach is helpful in designing the actuator without much complexity.
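For intuition, a hedged sketch of rhombus (scissor-type) kinematics: with link length L and half-angle theta, the end effector sits at s = 2·L·cos(theta), so a prescribed drive theta(t) yields the stroke, velocity, and peak acceleration. The link length, drive profile, and resulting numbers are illustrative assumptions, not the actuator's actual geometry or motor dynamics.

```python
# Hedged sketch: rhombus/scissor kinematics with an assumed drive profile;
# position s = 2*L*cos(theta), differentiated numerically for velocity
# and acceleration. All parameters are illustrative.
import numpy as np

L = 0.5                                          # link length, m (assumed)
t = np.linspace(0, 0.5, 2000)
theta = np.pi / 3 * (1 - np.cos(2 * np.pi * t / 0.5)) / 2 + np.pi / 6  # drive angle

s = 2 * L * np.cos(theta)                        # end-effector position
v = np.gradient(s, t)                            # numerical velocity
a = np.gradient(v, t)                            # numerical acceleration
print(f"stroke = {s.max() - s.min():.2f} m, peak |a| = {abs(a).max() / 9.81:.1f} g")
```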