935 results for Illinois. School Problems Commission


Relevance: 30.00%

Abstract:

This paper presents a dan-based evolutionary approach for solving control problems. Three selected control problems, viz. the linear-quadratic, harvest, and push-cart problems, are solved using the proposed approach, and the results are compared with those of the evolutionary programming (EP) approach. In most cases, the proposed approach succeeds in obtaining (near) optimal solutions for these problems.
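The flavor of such an approach can be sketched generically. The (1+1) evolution strategy below, applied to a discretized push-cart-style problem, is an illustrative assumption, not the paper's actual method: the dynamics, fitness function, and parameters are all made up for the sketch.

```python
import random

# A generic (1+1) evolution strategy on a discretized control problem,
# loosely in the spirit of the push-cart benchmark: a cart with dynamics
# x'' = u is pushed over N time steps, and we maximize the final position
# minus a control-effort penalty. Illustrative values throughout.

N, DT = 20, 0.1   # number of control steps and step size (assumptions)

def fitness(u):
    x = v = 0.0
    for ui in u:                       # simulate double-integrator dynamics
        v += ui * DT
        x += v * DT
    return x - 0.05 * sum(ui * ui for ui in u)   # position minus effort

def evolve(generations=500, sigma=0.3, seed=1):
    rng = random.Random(seed)
    u = [0.0] * N                      # start from the zero control sequence
    best = fitness(u)
    for _ in range(generations):
        child = [ui + rng.gauss(0, sigma) for ui in u]   # Gaussian mutation
        f = fitness(child)
        if f >= best:                  # keep the child if it is no worse
            u, best = child, f
    return u, best

controls, score = evolve()
```

A real study would of course use problem-specific operators and constraint handling; this only shows the mutate-evaluate-select loop shared by such methods.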

Relevance: 30.00%

Abstract:

This paper looks at the complexity of four different incremental problems: (1) interval partitioning of a flow graph, (2) breadth-first search (BFS) of a directed graph, (3) lexicographic depth-first search (DFS) of a directed graph, and (4) constructing the postorder listing of the nodes of a binary tree. The last problem arises from the need to incrementally recompute the Sethi-Ullman (SU) ordering [1] of the subtrees of a tree after it has undergone changes of a given type. These problems are among those that claimed our attention while we were designing algorithmic techniques for incremental code generation. BFS and DFS certainly have numerous other applications, but as far as our work is concerned, incremental code generation is the common thread linking these problems. The complexity of these problems is studied from two different perspectives. The theory of incremental relative lower bounds (IRLB) is given in [2]; we use this theory to derive the IRLBs of the first three problems. We then use the notion of a bounded incremental algorithm [4] to prove the unboundedness of the fourth problem with respect to the locally persistent model of computation. The lower bound result for lexicographic DFS is possibly the most interesting. In [5] the author considers lexicographic DFS to be a problem whose incremental version may require recomputing the entire solution from scratch. Our IRLB result provides further evidence for this possibility, provided the incremental DFS algorithms considered do not require too much preprocessing.
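To make the fourth problem concrete, here is the from-scratch (batch) postorder listing that an incremental algorithm would try to avoid recomputing after each tree change. The tree encoding is an assumption for illustration.

```python
# From-scratch postorder listing of a binary tree's nodes. The paper studies
# the incremental version of this problem; this sketch shows only the batch
# computation that a bounded incremental algorithm would avoid redoing in
# full after each tree modification.

class Node:
    def __init__(self, label, left=None, right=None):
        self.label, self.left, self.right = label, left, right

def postorder(root):
    out = []
    def visit(n):
        if n is not None:
            visit(n.left)          # left subtree first
            visit(n.right)         # then right subtree
            out.append(n.label)    # node itself last
    visit(root)
    return out

tree = Node("a", Node("b", Node("d"), Node("e")), Node("c"))
# postorder(tree) → ["d", "e", "b", "c", "a"]
```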

Relevance: 30.00%

Abstract:

Given an undirected unweighted graph G = (V, E) and an integer k ≥ 1, we consider the problem of computing the edge connectivities of all those (s, t) vertex pairs whose edge connectivity is at most k. We present an algorithm with expected running time Õ(m + nk^3) for this problem, where |V| = n and |E| = m. Our output is a weighted tree T whose nodes are the sets V_1, V_2, ..., V_l of a partition of V, with the property that the edge connectivity in G between any two vertices s ∈ V_i and t ∈ V_j, for i ≠ j, is equal to the weight of the lightest edge on the path between V_i and V_j in T. Also, two vertices s and t belong to the same set V_i if and only if they have edge connectivity greater than k. Currently, the best algorithm for this problem needs to compute all-pairs min-cuts in an O(nk)-edge graph; this takes Õ(m + n^{5/2} k · min{k^{1/2}, n^{1/6}}) time. Our algorithm is much faster for small values of k; in fact, it is faster whenever k is o(n^{5/6}). Our algorithm yields the useful corollary that in Õ(m + nc^3) time, where c is the size of the global min-cut, we can compute the edge connectivities of all those pairs of vertices whose edge connectivity is at most αc for some constant α. We also present an Õ(m + n) Monte Carlo algorithm for the approximate version of this problem; this algorithm is applicable to weighted graphs as well. Our algorithm, with some modifications, also solves the minimum T-cut problem: given T ⊆ V of even cardinality, we present an Õ(m + nk^3) algorithm to compute a minimum cut that splits T into two odd-cardinality components, where k is the size of this cut.
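To illustrate how the output tree T answers queries (a sketch over a made-up partition tree, not the paper's construction algorithm): the edge connectivity between s ∈ V_i and t ∈ V_j is the minimum edge weight on the unique path between V_i and V_j in T, which one tree traversal can read off.

```python
from collections import deque

def min_weight_on_path(adj, src, dst):
    """adj: block -> list of (neighbor_block, weight) for the tree T.
    Returns the lightest edge weight on the unique tree path src -> dst,
    which equals the edge connectivity between vertices in those blocks."""
    parent = {src: (None, None)}
    q = deque([src])
    while q:                           # BFS to record parent pointers
        u = q.popleft()
        for v, w in adj[u]:
            if v not in parent:
                parent[v] = (u, w)
                q.append(v)
    best = float("inf")
    node = dst
    while parent[node][0] is not None: # walk back up to src, tracking min
        node, w = parent[node]
        best = min(best, w)
    return best

# Toy partition tree: blocks A - B - C with edge weights 2 and 3.
T = {"A": [("B", 2)], "B": [("A", 2), ("C", 3)], "C": [("B", 3)]}
# connectivity between a vertex in A and one in C is min(2, 3) = 2
```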

Relevance: 30.00%

Abstract:

In this paper, several known computational solutions to the linear regulator, fixed end-point, and servo-mechanism problems are obtained in a very natural way using a certain framework from scattering theory. The relationships between the solutions to the linear regulator problem with different terminal costs, together with the interplay between the forward and backward equations, enable a concise derivation of the partitioned equations, the forward-backward equations, and the Chandrasekhar equations for the problem. These methods are then extended to the fixed end-point, servo, and tracking problems.
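For context, the linear regulator solutions mentioned here all revolve around the matrix Riccati equation; the standard finite-horizon form below is textbook background in standard notation, not this paper's own equations.

```latex
% Finite-horizon LQR: minimize
%   J = x(t_f)^T S\, x(t_f) + \int_0^{t_f} \bigl(x^T Q x + u^T R u\bigr)\, dt
% subject to \dot{x} = A x + B u.  The terminal cost S is the "terminal
% cost" whose variation the abstract refers to.
-\dot{P}(t) = A^T P(t) + P(t) A - P(t) B R^{-1} B^T P(t) + Q,
\qquad P(t_f) = S,
\qquad u^*(t) = -R^{-1} B^T P(t)\, x(t).
```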

Relevance: 30.00%

Abstract:

Reduction of carbon emissions is of paramount importance in the context of global warming and climate change. Countries and global companies are now engaged in understanding systematic ways of solving carbon economics problems, aimed ultimately at achieving well-defined emission targets. This paper proposes mechanism design as an approach to solving carbon economics problems. The paper first introduces carbon economics issues in the world today and then focuses on carbon economics problems facing global industries. It identifies four problems faced by global industries: carbon credit allocation (CCA), carbon credit buying (CCB), carbon credit selling (CCS), and carbon credit exchange (CCE), and argues that these problems are best addressed as mechanism design problems. The discipline of mechanism design is founded on game theory and is concerned with settings where a social planner faces the problem of aggregating the announced preferences of multiple agents into a collective decision when the actual preferences are not known publicly. The paper provides an overview of mechanism design and presents the challenges involved in designing mechanisms with desirable properties. To illustrate the application of mechanism design in carbon economics, the paper describes in detail one specific problem, the carbon credit allocation problem.
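As a minimal illustration of a mechanism with a desirable property (a generic textbook example, not one of the paper's four mechanisms): in a sealed-bid second-price (Vickrey) auction for a single block of carbon credits, bidding one's true value is a dominant strategy, i.e., the mechanism is truthful. The firms and bids below are hypothetical.

```python
# Sealed-bid second-price (Vickrey) auction: the highest bidder wins but
# pays only the second-highest bid, which makes truthful bidding a
# dominant strategy for every participant.

def vickrey_auction(bids):
    """bids: dict mapping bidder -> sealed bid. Returns (winner, price)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

winner, price = vickrey_auction({"firm_A": 120.0, "firm_B": 95.0, "firm_C": 110.0})
# → firm_A wins and pays 110.0 (the second-highest bid)
```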

Relevance: 30.00%

Abstract:

Transductive SVM (TSVM) is a well-known semi-supervised large-margin learning method for binary text classification. In this paper we extend this method to multi-class and hierarchical classification problems. We point out that determining the labels of the unlabeled examples with fixed classifier weights is a linear programming problem, and we devise an efficient technique for solving it. The method is applicable to general loss functions. We demonstrate the value of the new method using the large-margin loss on a number of multi-class and hierarchical classification datasets. For the maxent loss we show empirically that our method is better than expectation regularization/constraint and posterior regularization methods, and competitive with the version of the entropy regularization method that uses label constraints.
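The label-determination step can be made concrete: with classifier weights fixed, choose labels for the unlabeled examples so as to minimize total loss subject to a class-proportion constraint. The paper solves this at scale as a linear program; the toy below simply enumerates assignments to make the optimization explicit. The loss matrix and class counts are made-up values.

```python
from itertools import product

# loss[i][c]: loss of assigning class c to unlabeled example i (assumed)
loss = [[0.1, 0.9],
        [0.8, 0.2],
        [0.4, 0.5],
        [0.6, 0.3]]
target = {0: 2, 1: 2}   # required number of examples per class (assumed)

def best_assignment(loss, target):
    """Brute-force search over label assignments satisfying the class-count
    constraint; returns the minimizing labels and their total loss."""
    n, k = len(loss), len(loss[0])
    best, best_labels = float("inf"), None
    for labels in product(range(k), repeat=n):
        counts = {c: labels.count(c) for c in range(k)}
        if counts != target:          # enforce class-proportion constraint
            continue
        total = sum(loss[i][c] for i, c in enumerate(labels))
        if total < best:
            best, best_labels = total, list(labels)
    return best_labels, best

labels, total = best_assignment(loss, target)
# → labels [0, 1, 0, 1] with total loss 0.1 + 0.2 + 0.4 + 0.3 = 1.0
```

The enumeration is exponential in the number of unlabeled examples; the LP relaxation the paper uses is what makes this step tractable in practice.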

Relevance: 30.00%

Abstract:

The Exact Cover problem takes a universe U of n elements, a family F of m subsets of U, and a positive integer k, and decides whether there exists a subfamily (set cover) F' of size at most k such that each element is covered by exactly one set. The Unique Cover problem takes the same input and decides whether there is a subfamily F' ⊆ F such that at least k of the elements F' covers are covered uniquely (by exactly one set). Both problems are known to be NP-complete. In the parameterized setting, when parameterized by k, Exact Cover is W[1]-hard, while Unique Cover is FPT under the same parameter but known not to admit a polynomial kernel under standard complexity-theoretic assumptions. In this paper, we investigate these two problems under the assumption that every set satisfies a given geometric property Π. Specifically, we consider the universe to be a set of n points in a real space R^d, d being a positive integer. When d = 2, we consider the problem when Π requires all sets to be unit squares or lines. When d > 2, we consider the problem where Π requires all sets to be hyperplanes in R^d. These special versions of the problems are also known to be NP-complete. When parameterized by k, the Unique Cover problem has a polynomial-size kernel for all the above geometric versions. The Exact Cover problem turns out to be W[1]-hard for squares, but FPT for lines and hyperplanes. Further, we also consider the Unique Set Cover problem, which takes the same input and decides whether there is a set cover that covers at least k elements uniquely. To the best of our knowledge, this is a new problem, and we show that it is NP-complete (even for the case of lines). In fact, the problem turns out to be W[1]-hard in the abstract setting when parameterized by k. However, when we restrict ourselves to the lines and hyperplanes versions, we obtain FPT algorithms.
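A brute-force check of the Exact Cover condition makes the definition concrete: F' must cover every element exactly once. The search is exponential in |F|, which is why the parameterized complexity of the problem matters. The universe and family below are a made-up example.

```python
from itertools import combinations

def has_exact_cover(universe, family, k):
    """True iff some subfamily of size at most k covers each element of
    universe exactly once."""
    for size in range(1, k + 1):
        for sub in combinations(family, size):
            covered = [e for s in sub for e in s]   # with multiplicity
            # equal length + equal set => each element covered exactly once
            if len(covered) == len(universe) and set(covered) == universe:
                return True
    return False

U = {1, 2, 3, 4, 5, 6}
F = [{1, 2}, {3, 4}, {5, 6}, {1, 4, 5}, {2, 3}]
# {1,2}, {3,4}, {5,6} is an exact cover of size 3; no size-2 subfamily works
```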

Relevance: 30.00%

Abstract:

We give an overview of recent results and techniques in parameterized algorithms for graph modification problems.

Relevance: 30.00%

Abstract:

In this work, we study the well-known r-DIMENSIONAL k-MATCHING ((r, k)-DM) and r-SET k-PACKING ((r, k)-SP) problems. Given a universe U := U_1 ∪ ... ∪ U_r and an r-uniform family F ⊆ U_1 × ... × U_r, the (r, k)-DM problem asks if F admits a collection of k mutually disjoint sets. Given a universe U and an r-uniform family F ⊆ 2^U, the (r, k)-SP problem asks if F admits a collection of k mutually disjoint sets. We employ techniques based on dynamic programming and representative families. This leads to a deterministic algorithm with running time O(2.851^{(r-1)k} · |F| · n log^2 n · log W) for the weighted version of (r, k)-DM, where W is the maximum weight in the input, and a deterministic algorithm with running time O(2.851^{(r-0.5501)k} · |F| · n log^2 n · log W) for the weighted version of (r, k)-SP. Thus, we significantly improve the previous best known deterministic running times for (r, k)-DM and (r, k)-SP and the previous best known running times for their weighted versions. We rely on structural properties of (r, k)-DM and (r, k)-SP to develop algorithms that are faster than those obtainable by a standard use of representative sets. Incorporating the principle of iterative expansion, we obtain a better algorithm for (3, k)-DM, running in time O(2.004^{3k} · |F| · n log^2 n). We believe that this algorithm demonstrates an interesting application of representative families in conjunction with more traditional techniques. Furthermore, we present kernels of size O(e^r r (k-1)^r log W) for the weighted versions of (r, k)-DM and (r, k)-SP, improving the previous best known kernels of size O(r! r (k-1)^r log W) for these problems.
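The (3, k)-DM definition made concrete: given triples from U_1 × U_2 × U_3, decide whether k of them are mutually disjoint. The brute force below is exponential in k; the point of the paper's representative-family machinery is to do substantially better. The instance is a made-up example (coordinate universes use distinct element names, so disjointness can be checked on the flattened tuples).

```python
from itertools import combinations

def has_k_matching(triples, k):
    """True iff some k of the triples are pairwise disjoint."""
    for sub in combinations(triples, k):
        elems = [e for t in sub for e in t]
        if len(elems) == len(set(elems)):   # no element repeated
            return True
    return False

F = [("a1", "b1", "c1"), ("a2", "b2", "c2"), ("a1", "b2", "c2")]
# the first two triples are disjoint, so a 2-matching exists;
# any three triples share an element, so no 3-matching exists
```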

Relevance: 30.00%

Abstract:

In the POSSIBLE WINNER problem in computational social choice theory, we are given a set of partial preferences, and the question is whether a distinguished candidate could be made winner by extending the partial preferences to linear preferences. Previous work has provided, for many common voting rules, fixed-parameter tractable algorithms for the POSSIBLE WINNER problem, with the number of candidates as the parameter. However, the corresponding kernelization question is still open and, in fact, has been mentioned as a key research challenge [10]. In this paper, we settle this open question for many common voting rules. We show that the POSSIBLE WINNER problem for maximin, Copeland, Bucklin, ranked pairs, and a class of scoring rules that includes the Borda voting rule does not admit a polynomial kernel with the number of candidates as the parameter. We show, however, that the COALITIONAL MANIPULATION problem, an important special case of the POSSIBLE WINNER problem, does admit a polynomial kernel for maximin, Copeland, ranked pairs, and a class of scoring rules that includes the Borda voting rule, when the number of manipulators is polynomial in the number of candidates. A significant conclusion of our work is that the POSSIBLE WINNER problem is harder than the COALITIONAL MANIPULATION problem, since the latter admits a polynomial kernel whereas the former does not. (C) 2015 Elsevier B.V. All rights reserved.
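The POSSIBLE WINNER question made concrete for the Borda rule: can the partial votes be completed into linear orders so that a distinguished candidate is the unique winner? The brute force below enumerates all linear extensions, which is exactly what the parameterized algorithms discussed above avoid; the three-candidate instance is a toy example.

```python
from itertools import permutations, product

CANDIDATES = ["a", "b", "c"]

def extensions(partial):
    """All total orders over CANDIDATES consistent with the given
    (x before y) constraints."""
    return [p for p in permutations(CANDIDATES)
            if all(p.index(x) < p.index(y) for x, y in partial)]

def borda_winners(profile):
    m = len(CANDIDATES)
    score = {c: 0 for c in CANDIDATES}
    for vote in profile:
        for pos, c in enumerate(vote):    # position 0 earns m-1 points
            score[c] += m - 1 - pos
    top = max(score.values())
    return [c for c in CANDIDATES if score[c] == top]

def possible_winner(partial_votes, target):
    """True iff some completion makes target the unique Borda winner."""
    for profile in product(*(extensions(pv) for pv in partial_votes)):
        if borda_winners(profile) == [target]:
            return True
    return False

# Two partial votes: the first only fixes a > b, the second only c > b.
votes = [[("a", "b")], [("c", "b")]]
# "a" can be made the unique winner, but "b" cannot
```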

Relevance: 30.00%

Abstract:

The accuracy and precision of dolphin school size estimates based on aerial photograph counts were examined using data collected on recent aerial and ship surveys. These estimates were found to be accurate during a 1979 research cruise aboard a tuna purse-seiner: dolphin schools were photographed from the ship's helicopter, encircled with the purse seine, and then counted as the dolphins were released from the net. A comparison of photographic estimates with these counts indicated that the relationship was fairly close and gave no indication of differing significantly from 1:1. During a 1980 aerial study, photographic estimates from different schools, passes, and camera formats were compared and found to be quite precise, with a standard deviation of approximately 6% of school size. Photographic estimates were also compared with estimates made by aerial observers. Most observers tended to underestimate school size, particularly for large schools. The variability among observers was high, indicating that observers should be individually calibrated. (PDF contains 39 pages.)

Relevance: 30.00%

Abstract:

The 9th International Test Commission (ITC) Conference took place at the Miramar Palace in San Sebastian, Spain, from the 2nd to the 5th of July, 2014. The Conference was titled "Global and Local Challenges for Best Practices in Assessment." The International Test Commission (www.intestcom.org) is an association of national psychological associations, test commissions, publishers, and other organizations, as well as individuals, committed to the promotion of effective testing and assessment policies and to the proper development, evaluation, and use of educational and psychological instruments. The ITC facilitates the exchange of information among members and stimulates their cooperation on problems related to the construction, distribution, and use of psychological and educational tests and other psychodiagnostic tools. This volume contains the abstracts of the contributions presented at the 9th International Test Commission Conference. The four themes of the Conference were closely linked to the goals of the ITC:
- Challenges and Opportunities in International Assessment.
- Application of New Technologies and New Psychometric Models in Testing.
- Standards and Guidelines for Best Testing Practices.
- Testing in Multilingual and Multicultural Contexts.

Relevance: 30.00%

Abstract:

John Nathan Cobb (1868–1930) became the founding Director of the College of Fisheries, University of Washington, Seattle, in 1919 without the benefit of a college education. An inquisitive and ambitious man, he began his career in the newspaper business and was introduced to commercial fisheries when he joined the U.S. Fish Commission (USFC) in 1895 as a clerk; he was soon promoted to "Field Agent" in the Division of Statistics, Washington, D.C. During the next 17 years, Cobb surveyed commercial fisheries from Maine to Florida, Hawaii, the Pacific Northwest, and Alaska for the USFC and its successor, the U.S. Bureau of Fisheries. In 1913, he became editor of the prominent west coast trade magazine Pacific Fisherman, of Seattle, Wash., where he became known as a leading expert on the fisheries of the Pacific Northwest. He soon joined the campaign, led by his employer, to establish the nation's first fisheries school at the University of Washington. After a brief interlude (1917–1918) with the Alaska Packers Association in San Francisco, Calif., he was chosen as the School's founding director in 1919. Reflecting his experience and mindset, as well as the University's apparent initial desire, Cobb established the College of Fisheries primarily as a training ground for those interested in applied aspects of the commercial fishing industry. Cobb attracted sufficient students, was a vigorous spokesman for the College, and had ambitious plans for expanding the school's faculty and facilities. He became aware that the College was not held in high esteem by his faculty colleagues or by the University administration because of the school's failure to emphasize scholastic achievement, and he attempted to correct this deficiency. Cobb became ill with heart problems in 1929 and died on 13 January 1930. The University soon thereafter dissolved the College and dismissed all but one of its faculty.
A Department of Fisheries, in the College of Science, was then established in 1930 and was led by William Francis Thompson (1888–1965), who emphasized basic science and fishery biology. That emphasis continues to the present in the Department's successor, the School of Aquatic and Fishery Sciences.