991 results for Linear combination


Relevance: 60.00%

Publisher:

Abstract:

In this paper, we present a novel indexing technique called Multi-scale Similarity Indexing (MSI) to index an image's multiple features in a single one-dimensional structure. For both text and visual feature spaces, the similarity between a point and the center of a local partition in the individual space is used as the indexing key, where similarity values from different features are distinguished by different scales. A single indexing tree can then be built on these keys. Based on the property that relevant images have similar similarity values to the center of the same local partition in any feature space, a certain number of irrelevant images can be pruned quickly using the triangle inequality on the indexing keys. To remove the dimensionality curse affecting high-dimensional structures, we propose a new technique called Local Bit Stream (LBS). LBS transforms an image's text and visual feature representations into simple, uniform and effective bit stream (BS) representations based on the local partition's center. Such BS representations are small in size and fast to compare, since only bit operations are involved. By comparing the common bits of two BSs, most irrelevant images can be filtered immediately. To integrate multiple features effectively, we also investigated the following evidence combination techniques: Certainty Factor, Dempster-Shafer Theory, Compound Probability, and Linear Combination. Our extensive experiments showed that a single one-dimensional index on multiple features greatly improves on multiple indices over multiple features. Our LBS method outperforms sequential scan in high-dimensional spaces by an order of magnitude, and Certainty Factor and Dempster-Shafer Theory perform best in combining the multiple similarities from the corresponding multiple features.
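As a rough illustration of the bit-stream idea, the following sketch encodes feature vectors as bit streams relative to a partition center and filters candidates by counting common bits. The encoding rule and function names here are our simplified stand-ins, not the paper's actual LBS construction:

```python
import numpy as np

def to_bit_stream(vec, centre):
    # Hypothetical encoding (our stand-in for LBS): one bit per dimension,
    # set when that dimension lies closer to the partition centre than the
    # median gap.
    gaps = np.abs(vec - centre)
    return (gaps < np.median(gaps)).astype(np.uint8)

def common_bits(bs_a, bs_b):
    # Count positions where both bit streams are 1 -- a cheap bit operation.
    return int(np.sum(bs_a & bs_b))

rng = np.random.default_rng(0)
centre = rng.random(16)
query = centre + rng.normal(0.0, 0.01, 16)   # image close to the centre
candidate = rng.random(16)                   # unrelated image

bs_q = to_bit_stream(query, centre)
bs_c = to_bit_stream(candidate, centre)
# Candidates sharing few common bits with the query are filtered early,
# before any expensive similarity computation.
print(common_bits(bs_q, bs_q), common_bits(bs_q, bs_c))
```

Since only AND and popcount operations are involved, such comparisons stay cheap even when many candidates must be screened.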

Relevance: 60.00%

Publisher:

Abstract:

Functionally-fitted methods are generalizations of collocation techniques that integrate an equation exactly whenever its solution is a linear combination of a chosen set of basis functions. When the basis functions are chosen as power functions, we recover classical algebraic collocation methods. This paper shows that functionally-fitted methods can be derived under less restrictive conditions than previously stated in the literature, and that other related results can be derived in a much more elegant way. The novelty of our approach is to fully retain the collocation framework without reverting to derivations based on cumbersome Taylor series expansions.
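For orientation, the collocation framework referred to above can be summarised as follows (our notation, a standard textbook statement rather than a quotation from the paper): on a step from $t_n$ with stepsize $h$, one seeks an approximation $u$ in the span of the chosen basis functions $\{u_1,\dots,u_s\}$ satisfying

```latex
u(t) = \sum_{j=1}^{s} a_j\, u_j(t), \qquad u(t_n) = y_n, \qquad
u'(t_n + c_i h) = f\bigl(t_n + c_i h,\; u(t_n + c_i h)\bigr),
\quad i = 1, \dots, s .
```

With the algebraic basis $u_j(t) = t^{\,j-1}$ these are exactly the classical collocation conditions, which is the special case mentioned above.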

Relevance: 60.00%

Publisher:

Abstract:

A method has been constructed for the solution of a wide range of chemical plant simulation models, including differential equations and optimization. Double orthogonal collocation on finite elements is applied to convert the model into an NLP problem that is solved either by the VF13AD package, based on successive quadratic programming, or by the GRG2 package, based on the generalized reduced gradient method. This approach is termed the simultaneous optimization and solution strategy. The objective functional can contain integral terms. The state and control variables can have time delays. Equalities and inequalities containing state and control variables can be included in the model, as well as algebraic equations and inequalities. The maximum number of independent variables is 2; problems containing 3 independent variables can be transformed into problems having 2 independent variables using finite differencing. The maximum number of NLP variables and constraints is 1500. The method is also suitable for solving ordinary and partial differential equations. The state functions are approximated by a linear combination of Lagrange interpolation polynomials. The control function can be approximated either by a linear combination of Lagrange interpolation polynomials or by a piecewise constant function over the finite elements. The number of internal collocation points can vary by finite element. The residual error is evaluated at arbitrarily chosen equidistant grid points, enabling the user to check the accuracy of the solution between the collocation points, at which the solution is exact. The solution functions can be tabulated. There is an option to use control vector parameterization to solve optimization problems containing initial value ordinary differential equations; this approach should be used when there are many differential equations or when the upper integration limit is to be selected optimally.
The portability of the package has been addressed by converting it from VAX FORTRAN 77 to IBM PC FORTRAN 77 and to SUN SPARC 2000 FORTRAN 77. Computer runs have shown that the method can reproduce optimization problems published in the literature. The GRG2 and VF13AD packages, integrated into the optimization package, proved to be robust and reliable. The package contains an executive module, a module performing control vector parameterization, and two nonlinear problem solver modules, GRG2 and VF13AD. There is a stand-alone module that converts the differential-algebraic optimization problem into a nonlinear programming problem.
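The state approximation described above can be sketched in a few lines: a linear combination of Lagrange interpolation polynomials through values at collocation nodes. This is a generic illustration with nodes of our own choosing, not the package's orthogonal collocation points:

```python
import numpy as np

def lagrange_basis(tau, j, t):
    # Value at t of the j-th Lagrange polynomial over the nodes tau:
    # it equals 1 at tau[j] and 0 at every other node.
    terms = [(t - tau[k]) / (tau[j] - tau[k])
             for k in range(len(tau)) if k != j]
    return float(np.prod(terms))

def approximate(tau, values, t):
    # State approximation x(t) ~ sum_j x(tau_j) * l_j(t).
    return sum(v * lagrange_basis(tau, j, t)
               for j, v in enumerate(values))

tau = np.array([0.0, 0.5, 1.0])        # collocation nodes (one element)
values = np.sin(tau)                   # state values at the nodes
print(approximate(tau, values, 0.25))  # interpolated value between nodes
```

Because the basis polynomials are 1 at their own node and 0 at the others, the combination reproduces the nodal values exactly, which is why the residual can only be checked meaningfully between collocation points.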

Relevance: 60.00%

Publisher:

Abstract:

Cochlear implants are prosthetic devices used to provide hearing to people who would otherwise be profoundly deaf. The deliberate addition of noise to the electrode signals could increase the amount of information transmitted, but standard cochlear implants do not replicate the noise characteristic of normal hearing, because noise added in an uncontrolled manner with a limited number of electrodes will almost certainly lead to worse performance. Mechanisms like suprathreshold stochastic resonance can be effective only if partially independent stochastic activity can be achieved in each nerve fibre. We are investigating the use of stochastic beamforming to achieve greater independence. The strategy involves presenting each electrode with a linear combination of independent Gaussian noise sources. Because the cochlea is filled with conductive salt solutions, the noise currents from the electrodes interact, and the effective stimulus for each nerve fibre will therefore be a different weighted sum of the noise sources. To some extent, therefore, the effective stimulus for a nerve fibre will be independent of the effective stimuli of neighbouring fibres. For a particular patient, the electrode positions and the amount of current spread are fixed. The objective is therefore to find the linear combination of noise sources that leads to the greatest independence between nerve discharges. In this theoretical study we show that it is possible to obtain one independent point of excitation (one null) for each electrode, and that stochastic beamforming can greatly decrease the correlation between the noise exciting different regions of the cochlea. © 2007 Copyright SPIE - The International Society for Optical Engineering.
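A toy sketch of the beamforming idea (our simplification, not the paper's cochlear model): each electrode carries a linear combination of independent Gaussian noise sources, and an assumed current-spread matrix then remixes them at the nerve fibres. Choosing the mixing weights as the inverse of the spread matrix restores near-independence at the fibres:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4                                   # electrodes / noise sources
# Hypothetical current-spread matrix: each electrode leaks 40% of its
# current into the neighbouring fibre populations.
S = np.eye(n) + 0.4 * np.eye(n, k=1) + 0.4 * np.eye(n, k=-1)
M = np.linalg.inv(S)                    # beamforming weights (assumes S known)

sources = rng.normal(size=(n, 50_000))  # independent Gaussian noise sources
naive = S @ sources                     # no beamforming: correlated stimuli
shaped = S @ (M @ sources)              # beamformed: ~independent stimuli

def max_offdiag_corr(x):
    # Largest absolute correlation between distinct rows (fibre regions).
    c = np.corrcoef(x)
    return float(np.max(np.abs(c - np.diag(np.diag(c)))))

print(max_offdiag_corr(naive), max_offdiag_corr(shaped))
```

In this idealised square, invertible case the decorrelation is exact; the interesting practical question, as the abstract notes, is how close one can get when the spread is only partially known and the nulls must be placed per electrode.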

Relevance: 60.00%

Publisher:

Abstract:

We extend a meshless method of fundamental solutions, recently proposed by the authors for the one-dimensional two-phase inverse linear Stefan problem, to the nonlinear case. In this latter situation the free surface is also considered unknown, which is more realistic from the practical point of view. Building on the earlier work, the solution is approximated in each phase by a linear combination of fundamental solutions to the heat equation. The implementation and analysis are more complicated in the present situation, since one needs to deal with a nonlinear minimization problem to identify the free surface. Furthermore, the inverse problem is ill-posed, since small errors in the input measured data can cause large deviations in the desired solution. Therefore, regularization needs to be incorporated in the objective function that is minimized in order to obtain a stable solution. Numerical results are presented and discussed. © 2014 IMACS.
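A minimal sketch of the approximation idea, under our own toy set-up rather than the authors' two-phase scheme: the solution is written as a linear combination of heat-equation fundamental solutions with source points placed outside the space-time domain, and the coefficients are found by Tikhonov-regularised least squares, reflecting the ill-posedness noted above:

```python
import numpy as np

def fund_sol(x, t):
    # Fundamental solution of the 1-D heat equation; zero for t <= 0.
    return np.exp(-x**2 / (4.0*t)) / np.sqrt(4.0*np.pi*t) if t > 0 else 0.0

# Source points placed outside the domain [0,1] x (0,1]: beyond both
# spatial ends and slightly back in time (placement is our choice).
sources = [(-0.1 - 0.1*j, -0.05) for j in range(8)] + \
          [( 1.1 + 0.1*j, -0.05) for j in range(8)]

def design_row(x, t):
    return [fund_sol(x - xs, t - ts) for xs, ts in sources]

# Synthetic Dirichlet boundary data taken from an exact solution e^{x+t}.
xb = [0.0]*10 + [1.0]*10
tb = list(np.linspace(0.1, 1.0, 10))*2
A = np.array([design_row(x, t) for x, t in zip(xb, tb)])
b = np.array([np.exp(x + t) for x, t in zip(xb, tb)])

lam = 1e-8                                    # Tikhonov parameter
coef = np.linalg.solve(A.T @ A + lam*np.eye(len(sources)), A.T @ b)
resid = np.linalg.norm(A @ coef - b) / np.linalg.norm(b)
print("relative boundary residual:", resid)
```

Each basis function satisfies the heat equation exactly, so the combination does too; the regularisation parameter trades boundary fit against coefficient blow-up, which is the stability issue the abstract highlights.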

Relevance: 60.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62L10, 62L15.

Relevance: 60.00%

Publisher:

Abstract:

The visual system combines spatial signals from the two eyes to achieve single vision. But if binocular disparity is too large, this perceptual fusion gives way to diplopia. We studied and modelled the processes underlying fusion and the transition to diplopia. The likely basis for fusion is linear summation of inputs onto binocular cortical cells. Previous studies of perceived position, contrast matching and contrast discrimination imply the computation of a dynamically weighted sum, where the weights vary with relative contrast. For gratings, perceived contrast was almost constant across all disparities, and this can be modelled by allowing the ocular weights to increase with disparity (Zhou, Georgeson & Hess, 2014). However, when a single Gaussian-blurred edge was shown to each eye, perceived blur was invariant with disparity (Georgeson & Wallis, ECVP 2012), which is not consistent with linear summation (that predicts perceived blur increasing with disparity). This blur constancy is consistent with a multiplicative form of combination (the contrast-weighted geometric mean), but that is hard to reconcile with the evidence favouring linear combination. We describe a two-stage spatial filtering model with linear binocular combination and suggest that nonlinear output transduction (e.g. 'half-squaring') at each stage may account for the blur constancy.

Relevance: 60.00%

Publisher:

Abstract:

Popular dimension reduction and visualisation algorithms, for instance Metric Multidimensional Scaling, t-distributed Stochastic Neighbour Embedding and the Gaussian Process Latent Variable Model, rely on the assumption that input dissimilarities are typically Euclidean. It is well known that this assumption does not hold for most datasets, and high-dimensional data often sit on a manifold of unknown global geometry. We present a method for improving the manifold charting process, coupled with Elastic MDS, such that we no longer assume that the manifold is Euclidean, or of any particular structure. We draw on the benefits of different dissimilarity measures, allowing the relative responsibilities, under a linear combination, to drive the visualisation process.
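A minimal sketch of the combination step (the particular measures and weights here are illustrative stand-ins of ours, not the paper's choices): several dissimilarity matrices are blended as a convex linear combination whose weights play the role of relative responsibilities:

```python
import numpy as np

def euclidean(X):
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def manhattan(X):
    diff = X[:, None, :] - X[None, :, :]
    return np.abs(diff).sum(-1)

def combined_dissimilarity(X, responsibilities):
    # Convex linear combination of the individual dissimilarity matrices;
    # the responsibilities are normalised so they sum to one.
    w = np.asarray(responsibilities, dtype=float)
    w = w / w.sum()
    measures = [euclidean(X), manhattan(X)]
    return sum(wi * D for wi, D in zip(w, measures))

rng = np.random.default_rng(5)
X = rng.normal(size=(6, 3))
D = combined_dissimilarity(X, [0.7, 0.3])
print(D.shape)
```

The combined matrix stays symmetric with a zero diagonal, so any dissimilarity-based embedding method (such as the Elastic MDS mentioned above) can consume it directly.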

Relevance: 60.00%

Publisher:

Abstract:

Nowadays, the scientific and social significance of research into climatic effects has become outstanding. In order to predict the ecological effects of global climate change, it is necessary to study monitoring databases of the past and explore connections. For the case study mentioned in the title, historical weather data series from the Hungarian Meteorological Service and Szaniszló Priszter's monitoring data on the phenology of geophytes have been used. These data describe on which days the observed geophytes budded, bloomed and withered. In our research we have found that the classification of the observed years according to phenological events and their classification according to the frequency distribution of meteorological parameters show similar patterns, and one variable group is suitable for explaining the pattern shown by the other. Furthermore, an important result is that the dates of all three observed phenophases correlate significantly with the average daily temperature fluctuation in the given period. The second most often significant parameter is the number of frosty days, which also seems to be determinant for all phenophases. The usual approaches, based on the temperature sum and the average temperature, do not seem to be really important in this respect. According to the results of the research, the phenology of geophytes can be modelled well with a linear combination of suitable meteorological parameters.
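A toy illustration of such a model (synthetic numbers of our own, not the study's data): a phenological date is fitted as a linear combination of the two parameters highlighted above, daily temperature fluctuation and the number of frosty days, by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30
daily_fluct = rng.uniform(5, 12, n)      # mean daily temperature range, degC
frosty_days = rng.integers(0, 40, n)     # number of frosty days in the period
# Synthetic "budding day of year": linear in the two parameters plus noise.
day = 80 - 2.0 * daily_fluct + 0.5 * frosty_days + rng.normal(0, 1, n)

# Design matrix [1, fluctuation, frosty days]; least-squares coefficients.
A = np.column_stack([np.ones(n), daily_fluct, frosty_days])
coef, *_ = np.linalg.lstsq(A, day, rcond=None)
print(coef)   # close to the generating coefficients [80, -2.0, 0.5]
```

With real monitoring data the same fit would quantify how much each meteorological parameter contributes to each phenophase date.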

Relevance: 60.00%

Publisher:

Abstract:

The purpose of this study was to investigate the relationship between school principals' self-reported spirituality and their transformational leadership behaviors. The relationship between spirituality and transactional leadership behaviors was also explored. The study used Bass and Avolio's (1984) Full Range Leadership Model as the theoretical framework conceptualizing transformational leadership. Data were collected using online surveys. Overall, six principals and sixty-nine teachers participated in the study. Principal surveys contained three parts: the Multifactor Leadership Questionnaire (MLQ Form-5X Short), the modified Spirituality Well-Being Scale (SWBS) and demographic information. Teacher surveys included two parts: the MLQ-5X and demographic information. The MLQ-5X was used to identify the degree of principals' transformational and transactional leadership behaviors. The modified SWBS (Existential Well-Being) was used to determine principals' degree of spirituality. The transformational leadership styles of inspirational motivation and idealized behavioral influence were significantly correlated with principals' spirituality. In addition, a multiple regression analysis including the five measures of transformational leadership as predictors suggested that spirituality is positively related to an individual's transformational leadership behaviors. A multiple regression analysis utilizing a linear combination of all transformational and transactional leadership measures was predictive of spirituality. Finally, it appears that the inspirational motivation measure of transformational leadership accounts for a significant amount of unique variance, independent of the other seven transformational and transactional leadership measures, in predicting spirituality. Based on the findings from this study, the researcher proposed a modification of Bass and Avolio's (1985) Full Range Leadership Model.
An additional dimension, spirituality, was added to the continuum of leadership styles. The findings from this study imply that principals' self-reported levels of spirituality were related to their being perceived as displaying transformational leadership behaviors. Principals who identified themselves as "spiritual" were more likely to be characterized by the transformational leadership style of inspirational motivation.

Relevance: 60.00%

Publisher:

Abstract:

Poor informational reading and writing skills in the early grades, and the need to provide students more experience with informational text, have been identified by research as areas of concern. Wilkinson and Son (2011) support future research in dialogic approaches to investigate the impact dialogic teaching has on comprehension. This study (N = 39) examined the gains in reading comprehension, science achievement, and metacognitive functioning of individual second-grade students interacting with instructors using dialogue journals alongside their textbook. The 38-week study consisted of two instructional phases and three assessment points. After a period of oral metacognitive strategies, one class formed the treatment group (n = 17), taught by two teachers following the co-teaching method, and two classes formed the comparison group (n = 22). The dialogue journal intervention for the treatment group embraced the transactional theory of instruction through dialogic interaction between teachers and students. Students took notes on the assigned lesson after an oral discussion. Teachers responded to students' entries with scaffolding using reading strategies (prior knowledge, skim, slow down, mental integration, and diagrams), modeled after Schraw's (1998) strategy evaluation matrix, to enhance students' comprehension. The comparison group utilized text-based, teacher-led whole-group discussion. Data were collected using four measures: (a) the Florida Assessments for Instruction in Reading (FAIR) Broad Diagnostic Inventory; (b) Scott Foresman end-of-chapter tests; (c) the Metacomprehension Strategy Index (Schmitt, 1990); and (d) a researcher-made metacognitive scaffolding rubric. Statistical analyses were performed using paired-sample t-tests, regression analysis of covariance, and two-way analysis of covariance.
Findings from the study revealed that treatment participants performed significantly better on the linear combination of reading comprehension, science achievement, and metacognitive function than their comparison-group counterparts, while controlling for pretest scores. Overall, results from the study established that teacher scaffolding using metacognitive strategies can potentially develop students' reading comprehension, science achievement, and metacognitive awareness. This suggests that early childhood students gain from the integration of reading and writing when using authentic materials (science textbooks) in science classrooms. A replication of this study with more students across more schools and different grade levels would improve the generalizability of these results.

Relevance: 60.00%

Publisher:

Abstract:

There is a national need to increase the STEM-related workforce. Factors leading towards STEM careers include the number of advanced high school mathematics and science courses students complete. Florida's enrollment patterns in STEM-related Advanced Placement (AP) courses, however, reveal that only a small percentage of students enroll in these classes. Therefore, screening tools are needed to find more students for these courses who are academically ready yet have not been identified. The purpose of this study was to investigate the extent to which scores from a national standardized test, the Preliminary Scholastic Assessment Test/National Merit Scholarship Qualifying Test (PSAT/NMSQT), in conjunction with and compared to a state-mandated standardized test, the Florida Comprehensive Assessment Test (FCAT), are related to selected AP exam performance in Seminole County Public Schools. An ex post facto correlational study was conducted using 6,189 student records from the 2010-2012 academic years. Multiple regression analyses using simultaneous full-model testing showed differential moderate to strong relationships with scores in eight of the nine AP courses examined (Biology, Environmental Science, Chemistry, Physics B, Physics C Electrical, Physics C Mechanical, Statistics, and Calculus AB and BC). For example, the significant unique contribution to overall variance in AP scores was a linear combination of PSAT Math (M), Critical Reading (CR) and FCAT Reading (R) for Biology and Environmental Science. Moderate relationships for Chemistry included a linear combination of PSAT M, PSAT Writing (W) and FCAT M; a combination of FCAT M and PSAT M was most significantly associated with Calculus AB performance. These findings have implications for both research and practice. FCAT scores, in conjunction with PSAT scores, can potentially be used for specific STEM-related AP courses as part of a systematic approach to AP course identification and placement.
For courses with moderate to strong relationships, validation studies and the development of expectancy tables, which estimate the probability of successful performance on these AP exams, are recommended. The findings also established a need to examine other related research issues including, but not limited to, extensive longitudinal studies and analyses of other available or prospective standardized test scores.

Relevance: 60.00%

Publisher:

Abstract:

One of the global phenomena posing threats to environmental health and safety is artisanal mining. Ambiguities in the manner in which ore-processing facilities operate hinder the mining capacity of these miners in Ghana. These problems are reviewed with regard to current socio-economic, health and safety, and environmental conditions, and the use of rudimentary technologies that limits fair-trade deals for miners. This research sought to use an established data-driven, geographic information system (GIS)-based approach employing spatial analysis for locating a centralized processing facility within the Wassa Amenfi-Prestea Mining Area (WAPMA) in the Western region of Ghana. A spatial analysis technique that utilizes ModelBuilder within the ArcGIS geoprocessing environment, through suitability modeling, systematically and simultaneously analyzes a geographical dataset of selected criteria. The spatial overlay analysis methodology and the multi-criteria decision analysis approach were selected to identify the most preferred locations for siting a processing facility. For an optimal site selection, seven major criteria were considered: proximity to settlements, water resources, artisanal mining sites, roads, railways, tectonic zones, and slopes. Site characterizations and environmental considerations incorporated identified constraints, such as proximity to large-scale mines, forest reserves and state lands. The analysis was limited to criteria that were selected as relevant to the area under investigation. Saaty's analytical hierarchy process was utilized to derive relative importance weights for the criteria, and a weighted linear combination technique was then applied to combine the factors and determine the degree of potential site suitability.
The final map output indicates the potential sites identified for the establishment of a facility centre. The results obtained highlight intuitive areas suitable for consideration.
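The two steps named above can be sketched in a few lines (illustrative pairwise judgements and site scores of our own, not the study's): criterion weights are derived from a reciprocal pairwise comparison matrix via its principal eigenvector (Saaty's AHP), and candidate sites are then ranked by a weighted linear combination:

```python
import numpy as np

# Hypothetical 3-criterion reciprocal pairwise comparison matrix.
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
vals, vecs = np.linalg.eig(P)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                          # AHP weights, normalised to sum to 1

# Candidate sites scored 0-1 on each criterion (made-up numbers).
sites = np.array([[0.9, 0.4, 0.7],
                  [0.5, 0.8, 0.6],
                  [0.3, 0.9, 0.9]])
suitability = sites @ w                  # weighted linear combination
print(w, int(suitability.argmax()))
```

In a GIS workflow the rows of `sites` would be raster cells rather than three candidates, but the combination step is the same weighted sum per cell.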

Relevance: 60.00%

Publisher:

Abstract:

The main results of this paper are twofold. The first is a matrix-theoretical result: we say that a matrix is superregular if all of its minors that are not trivially zero are nonzero, and we show that, given an a×b (a ≥ b) superregular matrix over a field all of whose rows are nonzero, any linear combination of its columns with nonzero coefficients has at least a − b + 1 nonzero entries. Secondly, we make use of this result to construct convolutional codes that attain the maximum possible distance for some fixed parameters of the code, namely the rate and the Forney indices. These results answer some open questions on distances and constructions of convolutional codes posed in the literature.
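The first result can be checked numerically on a small example (our illustration, not the paper's construction): a random real 4×2 matrix is superregular almost surely, and the worst a nonzero-coefficient combination of its columns can do is annihilate b − 1 = 1 entry, leaving at least a − b + 1 = 3 nonzero:

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(3)
a, b = 4, 2
A = rng.uniform(1, 2, size=(a, b))   # generic entries: superregular a.s.

# Sanity check: all 1x1 and 2x2 minors are nonzero.
assert np.all(A != 0)
for rows in combinations(range(a), 2):
    assert abs(np.linalg.det(A[list(rows), :])) > 1e-12

# Try the worst case for each row: pick coefficients that annihilate
# row i exactly, then count the nonzero entries of the combination.
for i in range(a):
    c = np.array([A[i, 1], -A[i, 0]])    # kills row i; both coeffs nonzero
    v = A @ c
    nonzero = int(np.sum(np.abs(v) > 1e-12))
    assert nonzero >= a - b + 1          # the bound from the theorem
print("bound a - b + 1 =", a - b + 1, "holds on this example")
```

The entries that survive are, up to sign, 2×2 minors of the matrix, which is exactly why superregularity forces them to be nonzero.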

Relevance: 60.00%

Publisher:

Abstract:

Causal inference with a continuous treatment is a relatively under-explored problem. In this dissertation, we adopt the potential outcomes framework. Potential outcomes are responses that would be seen for a unit under all possible treatments. In an observational study where the treatment is continuous, the potential outcomes are an uncountably infinite set indexed by treatment dose. We parameterize this unobservable set as a linear combination of a finite number of basis functions whose coefficients vary across units. This leads to new techniques for estimating the population average dose-response function (ADRF). Some techniques require a model for the treatment assignment given covariates, some require a model for predicting the potential outcomes from covariates, and some require both. We develop these techniques using a framework of estimating functions, compare them to existing methods for continuous treatments, and simulate their performance in a population where the ADRF is linear and the models for the treatment and/or outcomes may be misspecified. We also extend the comparisons to a data set of lottery winners in Massachusetts. Next, we describe the methods and functions in the R package causaldrf using data from the National Medical Expenditure Survey (NMES) and Infant Health and Development Program (IHDP) as examples. Additionally, we analyze the National Growth and Health Study (NGHS) data set and deal with the issue of missing data. Lastly, we discuss future research goals and possible extensions.
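A stylised illustration of the parameterization described above (our toy simulation, not the dissertation's estimators or data): each unit's potential-outcome curve is a linear combination of basis functions {1, t} with unit-specific coefficients, and with a randomly assigned continuous dose the average dose-response function E[Y(t)] can be recovered by regressing observed outcomes on the same basis:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000
# Unit-specific coefficients xi_i on the basis {1, t}; their means
# (2.0 and 1.5) determine the population ADRF.
xi = rng.normal([2.0, 1.5], [1.0, 0.5], size=(n, 2))
dose = rng.uniform(0, 4, n)                  # randomised continuous dose
y = xi[:, 0] + xi[:, 1] * dose               # observed outcome Y_i(T_i)

B = np.column_stack([np.ones(n), dose])      # basis evaluated at the dose
beta, *_ = np.linalg.lstsq(B, y, rcond=None)
adrf = lambda t: beta[0] + beta[1] * t       # estimated ADRF
print(beta)   # close to E[xi] = [2.0, 1.5], i.e. ADRF(t) ~ 2 + 1.5 t
```

With an observational (confounded) dose this naive regression would be biased, which is why the techniques above bring in models for the treatment assignment and/or the potential outcomes.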