958 results for Maximum Set Splitting Problem


Relevance: 30.00%

Abstract:

Research has identified a number of putative risk factors that place adolescents at incrementally higher risk for involvement in alcohol and other drug (AOD) use and sexual risk behaviors (SRBs). Such factors include personality characteristics such as sensation-seeking, cognitive factors such as positive expectancies and inhibition conflict, as well as peer norm processes. The current study was guided by a conceptual perspective supporting the notion that an integrative framework including multi-level factors has significant explanatory value for understanding processes associated with the co-occurrence of AOD use and sexual risk behavior outcomes. This study simultaneously evaluated the mediating role of AOD-sex related expectancies and inhibition conflict on antecedents of AOD use and SRBs, including sexual sensation-seeking and peer norms for condom use. The sample was drawn from the Enhancing My Personal Options While Evaluating Risk (EMPOWER; Jonathan Tubman, PI) data set (N = 396; aged 12-18 years). Measures used in the study included the Sexual Sensation-Seeking Scale, the Inhibition Conflict for Condom Use scale, and the Risky Sex Scale; all relevant measures had well-documented psychometric properties. A global assessment of alcohol, drug use, and sexual risk behaviors was used. Results demonstrated that AOD-sex related expectancies mediated the influence of sexual sensation-seeking on the co-occurrence of alcohol and other drug use and sexual risk behaviors. The evaluation of the integrative model also revealed that sexual sensation-seeking was positively associated with peer norms for condom use. Also, peer norms predicted inhibition conflict among this sample of multi-problem youth. This dissertation research identified mechanisms of risk and protection associated with the co-occurrence of AOD use and SRBs among a multi-problem sample of adolescents receiving treatment for alcohol or drug use and related problems.
This study is informative for adolescent-serving programs that address the individual and contextual characteristics that enhance treatment efficacy and effectiveness among adolescents receiving services for substance use and related problems.
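The product-of-coefficients logic behind such a mediation claim can be sketched numerically. The code below is a hedged illustration with synthetic data (variable names and effect sizes are invented, not drawn from the EMPOWER data set), estimating the indirect path a*b with simple OLS slopes:

```python
# Hedged illustration of product-of-coefficients mediation logic.
# Synthetic data only; NOT the EMPOWER sample or the study's models.
import random

random.seed(0)
n = 500
x = [random.gauss(0, 1) for _ in range(n)]              # e.g., sensation-seeking
m = [0.5 * xi + random.gauss(0, 1) for xi in x]         # mediator, e.g., expectancies
y = [0.4 * mi + 0.1 * xi + random.gauss(0, 1)           # outcome, e.g., risk behavior
     for xi, mi in zip(x, m)]

def slope(u, v):
    """OLS slope of v on u (simple regression via centered cross-products)."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    num = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v))
    den = sum((ui - mu) ** 2 for ui in u)
    return num / den

a = slope(x, m)   # path X -> M
b = slope(m, y)   # path M -> Y (crude: a full analysis would also control for X)
print(f"estimated indirect effect a*b = {a * b:.2f}")
```

A complete mediation analysis would estimate the b path controlling for the predictor and assess significance, e.g., with bootstrapped confidence intervals; this sketch only shows where the a*b product comes from.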

Relevance: 30.00%

Abstract:

*Designated as an exemplary master's project for 2015-16*

The American approach to disparities in educational achievement is deficit focused and based on false assumptions of equal educational opportunity and social mobility. The labels attached to children served by compensatory early childhood education programs have evolved, e.g., from “culturally deprived” into “at-risk” for school failure, yet remain rooted in deficit discourses and ideology. Drawing on multiple bodies of literature, this thesis analyzes the rhetoric of compensatory education as viewed through the conceptual lens of the deficit thinking paradigm, in which school failure is attributed to perceived genetic, cultural, or environmental deficiencies, rather than institutional and societal inequalities. With a focus on the evolution of deficit thinking, the thesis begins with late 19th century U.S. early childhood education as it set the stage for more than a century of compensatory education responses to the needs of children, inadequacies of immigrant and minority families, and threats to national security. Key educational research and publications on genetic, cultural, and environmental deficits are aligned with trends in achievement gaps and compensatory education initiatives, beginning mid-20th century following the Brown v. Board of Education decision of 1954 and continuing to the present. This analysis then highlights patterns in the oppression, segregation, and disenfranchisement experienced by low-income and minority students, largely ignored within the mainstream compensatory education discourse. This thesis concludes with a heterodox analysis of how the deficit thinking paradigm is dependent on assumptions of equal educational opportunity and social mobility, which helps perpetuate the cycle of school failure amid larger social injustices.

Relevance: 30.00%

Abstract:

My dissertation has three chapters which develop and apply microeconometric techniques to empirically relevant problems. All the chapters examine robustness issues (e.g., measurement error and model misspecification) in econometric analysis. The first chapter studies the identifying power of an instrumental variable in the nonparametric heterogeneous treatment effect framework when a binary treatment variable is mismeasured and endogenous. I characterize the sharp identified set for the local average treatment effect under the following two assumptions: (1) the exclusion restriction of an instrument and (2) deterministic monotonicity of the true treatment variable in the instrument. The identification strategy allows for general measurement error. Notably, (i) the measurement error is nonclassical, (ii) it can be endogenous, and (iii) no assumptions are imposed on the marginal distribution of the measurement error, so that I do not need to assume the accuracy of the measurement. Based on the partial identification result, I provide a consistent confidence interval for the local average treatment effect with uniformly valid size control. I also show that the identification strategy can incorporate repeated measurements to narrow the identified set, even if the repeated measurements themselves are endogenous. Using the National Longitudinal Study of the High School Class of 1972, I demonstrate that my new methodology can produce nontrivial bounds for the return to college attendance when attendance is mismeasured and endogenous.

The second chapter, which is a part of a coauthored project with Federico Bugni, considers the problem of inference in dynamic discrete choice problems when the structural model is locally misspecified. We consider two popular classes of estimators for dynamic discrete choice models: K-step maximum likelihood estimators (K-ML) and K-step minimum distance estimators (K-MD), where K denotes the number of policy iterations employed in the estimation problem. These estimator classes include popular estimators such as Rust (1987)’s nested fixed point estimator, Hotz and Miller (1993)’s conditional choice probability estimator, Aguirregabiria and Mira (2002)’s nested algorithm estimator, and Pesendorfer and Schmidt-Dengler (2008)’s least squares estimator. We derive and compare the asymptotic distributions of K-ML and K-MD estimators when the model is arbitrarily locally misspecified, and we obtain three main results. In the absence of misspecification, Aguirregabiria and Mira (2002) show that all K-ML estimators are asymptotically equivalent regardless of the choice of K. Our first result shows that this finding extends to a locally misspecified model, regardless of the degree of local misspecification. As a second result, we show that an analogous result holds for all K-MD estimators, i.e., all K-MD estimators are asymptotically equivalent regardless of the choice of K. Our third and final result is to compare K-MD and K-ML estimators in terms of asymptotic mean squared error. Under local misspecification, the optimally weighted K-MD estimator depends on the unknown asymptotic bias and is no longer feasible. In turn, feasible K-MD estimators could have an asymptotic mean squared error that is higher or lower than that of the K-ML estimators. To demonstrate the relevance of our asymptotic analysis, we illustrate our findings in a simulation exercise based on a misspecified version of Rust (1987)’s bus engine problem.

The last chapter investigates the causal effect of the Omnibus Budget Reconciliation Act of 1993, which caused the biggest change to the EITC in its history, on unemployment and labor force participation among single mothers. Unemployment and labor force participation are difficult to define for a few reasons, for example, because of marginally attached workers. Instead of searching for the unique definition for each of these two concepts, this chapter bounds unemployment and labor force participation by observable variables and, as a result, considers various competing definitions of these two concepts simultaneously. This bounding strategy leads to partial identification of the treatment effect. The inference results depend on the construction of the bounds, but they imply a positive effect on labor force participation and a negligible effect on unemployment. The results imply that the difference-in-difference result based on the BLS definition of unemployment can be misleading due to misclassification of unemployment.
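The bounding idea can be conveyed with a toy calculation; the category counts below are invented and this is not the chapter's actual estimator. Classifying an ambiguous group such as marginally attached workers as in or out of unemployment yields an interval rather than a point estimate:

```python
# Illustrative sketch (assumed counts, not the chapter's data or method):
# bound the unemployment rate across competing definitions of who counts
# as unemployed and who counts as in the labor force.

def unemployment_bounds(employed, searching, marginally_attached):
    """Return (lower, upper) bounds on the unemployment rate."""
    # Narrow definition: only active searchers are unemployed.
    narrow = searching / (employed + searching)
    # Broad definition: marginally attached workers count as unemployed
    # and as labor force participants.
    broad = (searching + marginally_attached) / (
        employed + searching + marginally_attached)
    return min(narrow, broad), max(narrow, broad)

lo, hi = unemployment_bounds(employed=900, searching=50, marginally_attached=30)
print(f"unemployment rate lies in [{lo:.3f}, {hi:.3f}]")
```

Any definition-specific point estimate, including the BLS one, falls inside this interval, which is why inference on the bounds covers all the competing definitions at once.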

Relevance: 30.00%

Abstract:

This work explores the use of statistical methods in describing and estimating camera poses, as well as the information feedback loop between camera pose and object detection. Surging development in robotics and computer vision has pushed the need for algorithms that infer, understand, and utilize information about the position and orientation of the sensor platforms when observing and/or interacting with their environment.

The first contribution of this thesis is the development of a set of statistical tools for representing and estimating the uncertainty in object poses. A distribution for representing the joint uncertainty over multiple object positions and orientations is described, called the mirrored normal-Bingham distribution. This distribution generalizes both the normal distribution in Euclidean space, and the Bingham distribution on the unit hypersphere. It is shown to inherit many of the convenient properties of these special cases: it is the maximum-entropy distribution with fixed second moment, and there is a generalized Laplace approximation whose result is the mirrored normal-Bingham distribution. This distribution and approximation method are demonstrated by deriving the analytical approximation to the wrapped-normal distribution. Further, it is shown how these tools can be used to represent the uncertainty in the result of a bundle adjustment problem.

Another application of these methods is illustrated as part of a novel camera pose estimation algorithm based on object detections. The autocalibration task is formulated as a bundle adjustment problem using prior distributions over the 3D points to enforce the objects' structure and their relationship with the scene geometry. This framework is very flexible and enables the use of off-the-shelf computational tools to solve specialized autocalibration problems. Its performance is evaluated using a pedestrian detector to provide head and foot location observations, and it proves much faster and potentially more accurate than existing methods.

Finally, the information feedback loop between object detection and camera pose estimation is closed by utilizing camera pose information to improve object detection in scenarios with significant perspective warping. Methods are presented that allow the inverse perspective mapping traditionally applied to images to be applied instead to features computed from those images. For the special case of HOG-like features, which are used by many modern object detection systems, these methods are shown to provide substantial performance benefits over unadapted detectors while achieving real-time frame rates, orders of magnitude faster than comparable image warping methods.
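A minimal sketch of the coordinate-remapping idea: apply the homography to the coordinates of feature cells rather than resampling the image. The matrix H and the 8x8 cell grid below are illustrative assumptions, not values or code from the thesis:

```python
# Sketch: warp feature-cell coordinates through a homography instead of
# warping the image itself. H and the grid size are illustrative.

def apply_homography(H, x, y):
    """Map point (x, y) through the 3x3 homography H (row-major nested lists)."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# Identity-plus-shear homography standing in for a ground-plane rectification.
H = [[1.0, 0.2, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.001, 1.0]]

# Remap the coordinates of each cell in an 8x8 feature grid; feature values
# would then be gathered at the warped cell positions rather than recomputed
# on a warped image.
warped_cells = [apply_homography(H, cx, cy)
                for cy in range(8) for cx in range(8)]
print(warped_cells[9])  # warped position of cell (cx=1, cy=1)
```

Because only a small grid of coordinates is transformed instead of every pixel, this style of warping is what makes real-time frame rates plausible relative to full image warping.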

The statistical tools and algorithms presented here are especially promising for mobile cameras, providing the ability to autocalibrate and adapt to the camera pose in real time. In addition, these methods have wide-ranging potential applications in diverse areas of computer vision, robotics, and imaging.

Relevance: 30.00%

Abstract:

The problem of immersing a simply connected surface with a prescribed shape operator is discussed. I show that, aside from some special degenerate cases, such as when the shape operator can be realized by a surface with one family of principal curves being geodesic, the space of such realizations is a convex set in an affine space of dimension at most 3. The cases where this maximum dimension of realizability is achieved are analyzed and it is found that there are two such families of shape operators, one depending essentially on three arbitrary functions of one variable and another depending essentially on two arbitrary functions of one variable. The space of realizations is discussed in each case, along with some of their remarkable geometric properties. Several explicit examples are constructed.

Relevance: 30.00%

Abstract:

A multiproxy data set from an AMS radiocarbon dated, 46 cm long sediment core from the continental margin off western Svalbard reveals multidecadal climatic variability during the past two millennia. Investigation of planktic and benthic stable isotopes, planktic foraminiferal fauna, and lithogenic parameters aims to reconstruct the intensity, temperature, and salinity of Atlantic Water advection to the eastern Fram Strait. Atlantic Water has been continuously present at the site over the last 2,000 years. Superimposed on the increase in sea ice/icebergs, a strengthened intensity of Atlantic Water inflow and seasonal ice-free conditions were detected at ~1000 to 1200 AD, during the well-known Medieval Climate Anomaly (MCA). However, temperatures of the MCA never exceeded those of the 20th century. Since ~1400 AD, significantly higher portions of ice rafted debris and high planktic foraminifer fluxes suggest that the site was located in the region of a seasonally highly fluctuating sea ice margin. A sharp reduction in planktic foraminifer fluxes around 800 AD and after 1730 AD indicates cool summer conditions with major influence of sea ice/icebergs. High amounts of the subpolar planktic foraminifer species Turborotalia quinqueloba in the 150-250 µm size fraction indicate strengthened Atlantic Water inflow to the eastern Fram Strait as early as ~1860 AD. Nevertheless, surface conditions stayed cold well into the 20th century, as indicated by low planktic foraminiferal fluxes. Most likely, at the beginning of the 20th century cold conditions of the terminating Little Ice Age persisted at the surface while warm and saline Atlantic Water inflow had already strengthened, subsiding below the cold upper mixed layer. Surface sediments with high abundances of subpolar planktic foraminifers indicate a strong inflow of Atlantic Water providing seasonal ice-free conditions in the eastern Fram Strait during the last few decades.

Relevance: 30.00%

Abstract:

Visual cluster analysis provides valuable tools that help analysts understand large data sets in terms of representative clusters and relationships thereof. Often, the found clusters are to be understood in the context of associated categorical, numerical, or textual metadata given for the data elements. While often not part of the clustering process, such metadata play an important role and need to be considered during interactive cluster exploration. Traditionally, linked views allow analysts to relate (or, loosely speaking, correlate) clusters with metadata or other properties of the underlying cluster data. Manually inspecting the distribution of metadata for each cluster in a linked-view approach is tedious, especially for large data sets, where a large search problem arises. Fully interactive search for potentially useful or interesting cluster-to-metadata relationships can be a cumbersome and long process. To remedy this problem, we propose a novel approach for guiding users in discovering interesting relationships between clusters and associated metadata. Its goal is to guide the analyst through the potentially huge search space. In our work we focus on metadata of categorical type, which can be summarized for a cluster in the form of a histogram. We start from a given visual cluster representation and compute certain measures of interestingness defined on the distribution of metadata categories for the clusters. These measures are used to automatically score and rank the clusters for potential interestingness regarding the distribution of categorical metadata. Identified interesting relationships are highlighted in the visual cluster representation for easy inspection by the user. We present a system implementing an encompassing, yet extensible, set of interestingness scores for categorical metadata, which can also be extended to numerical metadata. Appropriate visual representations are provided for showing the visual correlations as well as the calculated ranking scores. Focusing on clusters of time series data, we test our approach on a large real-world data set of time-oriented scientific research data, demonstrating how specific interesting views are automatically identified, supporting the analyst in discovering interesting and visually understandable relationships.
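As a concrete illustration of this kind of scoring (an assumed measure in the same spirit, not necessarily one of the paper's), a cluster whose categorical histogram is strongly skewed can be ranked above one with a near-uniform distribution using normalized entropy:

```python
# Illustrative interestingness score for a categorical metadata histogram:
# 1 - normalized Shannon entropy, so concentrated (skewed) distributions
# score high and uniform ones score near zero.
import math

def interestingness(histogram):
    """Score a category->count histogram; higher = more skewed."""
    total = sum(histogram.values())
    probs = [c / total for c in histogram.values() if c > 0]
    entropy = -sum(p * math.log2(p) for p in probs)
    max_entropy = math.log2(len(histogram)) if len(histogram) > 1 else 1.0
    return 1.0 - entropy / max_entropy

# Hypothetical clusters with a categorical field (e.g., observation platform).
clusters = {
    "cluster A": {"ship": 48, "buoy": 1, "mooring": 1},    # skewed -> interesting
    "cluster B": {"ship": 17, "buoy": 16, "mooring": 17},  # uniform -> not
}
ranked = sorted(clusters, key=lambda c: interestingness(clusters[c]), reverse=True)
print(ranked)  # ['cluster A', 'cluster B']
```

Ranking all clusters by such a score is what lets the system surface candidate views automatically instead of requiring manual inspection of every linked histogram.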

Relevance: 30.00%

Abstract:

If $\mathcal{C}$ is a stable model category with a monoidal product, then the set of homotopy classes of self-maps of the unit forms a commutative ring, $[\mathbb{S},\mathbb{S}]^{\mathcal{C}}$. An idempotent $e$ of this ring will split the homotopy category: $[X,Y]^{\mathcal{C}} \cong e[X,Y]^{\mathcal{C}} \oplus (1-e)[X,Y]^{\mathcal{C}}$. We prove that, provided the localised model structures exist, this splitting of the homotopy category comes from a splitting of the model category; that is, $\mathcal{C}$ is Quillen equivalent to $L_{e\mathbb{S}}\mathcal{C} \times L_{(1-e)\mathbb{S}}\mathcal{C}$ and $[X,Y]^{L_{e\mathbb{S}}\mathcal{C}} \cong e[X,Y]^{\mathcal{C}}$. This Quillen equivalence is strong monoidal and is symmetric when the monoidal product of $\mathcal{C}$ is.

Relevance: 30.00%

Abstract:

This paper presents an integer programming model for developing optimal shift schedules while allowing extensive flexibility in terms of alternate shift starting times, shift lengths, and break placement. The model combines the work of Moondra (1976) and Bechtold and Jacobs (1990) by implicitly matching meal breaks to implicitly represented shifts. Moreover, the new model extends the work of these authors to enable the scheduling of overtime and the scheduling of rest breaks. We compare the new model to Bechtold and Jacobs' model over a diverse set of 588 test problems. The new model generates optimal solutions more rapidly, solves problems with more shift alternatives, and does not generate schedules violating the operative restrictions on break timing.
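The flavor of such a formulation can be conveyed by a toy set-covering instance. The shifts and demands below are illustrative assumptions, and this brute force stands in for the paper's far more compact implicit shift and break representation:

```python
# Toy shift-scheduling sketch (illustrative instance, not the paper's model):
# choose how many staff to assign to each candidate shift so that per-period
# demand is covered with minimum total staff.
from itertools import product

periods = 4
demand = [2, 3, 3, 1]          # staff required in each planning period
shifts = [
    (1, 1, 0, 0),              # early shift covers periods 0-1
    (0, 1, 1, 0),              # mid shift covers periods 1-2
    (0, 0, 1, 1),              # late shift covers periods 2-3
]

best = None
for counts in product(range(5), repeat=len(shifts)):  # up to 4 staff per shift
    cover = [sum(c * s[p] for c, s in zip(counts, shifts))
             for p in range(periods)]
    if all(cover[p] >= demand[p] for p in range(periods)):
        if best is None or sum(counts) < sum(best):
            best = counts
print("staff per shift:", best, "| total staff:", sum(best))
```

An integer programming model expresses exactly this coverage constraint and staffing objective, but lets a solver handle thousands of implicitly represented shift, break, and overtime alternatives instead of enumerating them.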

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

There are fundamental spatial and temporal disconnects among the specific policies that have been crafted to address our wildfire challenges. The biophysical changes in fuels, wildfire behavior, and climate have created a new set of conditions that our wildfire governance system is poorly suited to address. To meet these challenges, a reorientation of goals is needed toward creating an anticipatory wildfire governance system focused on social and ecological resilience. Key characteristics of this system could include the following: (1) not taking historical patterns as givens; (2) identifying future social and ecological thresholds of concern; (3) embracing diversity/heterogeneity as principles in ecological and social responses; and (4) incorporating learning among different scales of actors to create a scaffolded learning system.

Relevance: 30.00%

Abstract:

We consider a second-order variational problem depending on the covariant acceleration, which is related to the notion of Riemannian cubic polynomials. This problem and the corresponding optimal control problem are described in the context of higher order tangent bundles using geometric tools. The main tool, a presymplectic variant of Pontryagin’s maximum principle, allows us to study the dynamics of the control problem.

Relevance: 30.00%

Abstract:

We say that a polygon inscribed in a circle is asymmetric if it contains no two antipodal points, i.e., no two vertices that are the endpoints of the same diameter. Given n diameters of a circle and a positive integer k < n, this paper addresses the problem of computing a maximum area asymmetric k-gon whose k vertices are chosen among the endpoints of the given diameters. The study of this type of polygon is motivated by ethnomusicological applications.
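A brute-force version of the problem is easy to state in code; the instance below (n = 4 equally spaced diameters of the unit circle, k = 3) is an illustrative assumption, not one of the paper's instances or its algorithm:

```python
# Brute-force sketch: among the 2n endpoints of n diameters of the unit
# circle, choose k endpoints with no two on the same diameter (asymmetric),
# maximizing the area of the resulting inscribed polygon.
from itertools import combinations
from math import cos, sin, pi

def shoelace(pts):
    """Area of a polygon whose vertices are listed in angular order."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

n, k = 4, 3
endpoints = []  # (diameter index, angle, point) for both ends of each diameter
for i in range(n):
    a = i * pi / n
    endpoints.append((i, a, (cos(a), sin(a))))
    endpoints.append((i, a + pi, (cos(a + pi), sin(a + pi))))

best_area = 0.0
for combo in combinations(endpoints, k):
    if len({i for i, _, _ in combo}) < k:
        continue  # two chosen endpoints share a diameter: not asymmetric
    pts = [p for _, _, p in sorted(combo, key=lambda t: t[1])]
    best_area = max(best_area, shoelace(pts))
print(f"best asymmetric {k}-gon area: {best_area:.4f}")
```

Enumeration is exponential in k, which is precisely why a dedicated geometric algorithm for this problem is worth having.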

Relevance: 30.00%

Abstract:

Elasticity is one of the best-known capabilities of cloud computing, and it is largely deployed reactively using thresholds. In this approach, maximum and minimum limits are used to drive resource allocation and deallocation actions, leading to the following problem statements: How can cloud users set the threshold values to enable elasticity in their cloud applications? And what is the impact of the application’s load pattern on elasticity? This article tries to answer these questions for iterative high performance computing applications, showing the impact of both thresholds and load patterns on application performance and resource consumption. To accomplish this, we developed a reactive, PaaS-based elasticity model called AutoElastic and employed it over a private cloud to execute a numerical integration application. Here, we present an analysis of best practices and possible optimizations regarding the elasticity and HPC pair. Considering the results, we observed that the maximum threshold influences the application time more than the minimum one. We concluded that threshold values close to 100% of CPU load are directly related to weaker reactivity, postponing resource reconfiguration when activating it in advance could be pertinent for reducing the application runtime.
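The reactive threshold loop described here can be sketched as follows; the thresholds, VM capacity, and load trace are illustrative assumptions, not AutoElastic's actual parameters or implementation:

```python
# Simplified reactive-elasticity loop in the spirit of threshold-driven
# scaling: scale out when average CPU load exceeds the maximum threshold,
# scale in when it drops below the minimum one. All values illustrative.

def simulate(load_trace, vms=2, upper=0.90, lower=0.40, vm_capacity=1.0):
    """Return the VM count after reacting to each observed total load."""
    history = []
    for total_load in load_trace:
        avg_cpu = min(total_load / (vms * vm_capacity), 1.0)
        if avg_cpu > upper:
            vms += 1                  # allocate a VM
        elif avg_cpu < lower and vms > 1:
            vms -= 1                  # deallocate a VM
        history.append(vms)
    return history

# An upper threshold of 90% reacts to the rising load at the second step;
# a threshold near 100% would postpone the same reconfiguration.
print(simulate([1.0, 1.9, 2.5, 2.5, 1.0, 0.5], upper=0.90))
```

Rerunning the same trace with `upper` close to 1.0 shows the weaker reactivity the article describes: the scale-out fires later, so the application runs under-provisioned for longer.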

Relevance: 30.00%

Abstract:

The Gulf of Carpentaria Finfish Trawl Fishery operates under developmental permits and harvests five main tropical snapper species. The fishery operates in eastern Gulf of Carpentaria waters and is managed by Fisheries Queensland on behalf of the Queensland Fishery Joint Authority. For the years 2004–2014, the fishery Total Allowable Commercial Catch (TACC) was fixed at 1250 t and substantially under-filled. In 2011 new stock analyses were published for the fishery. Results were presented to industry, including the estimated equilibrium maximum sustainable yield (MSY) of 450 t for east Gulf of Carpentaria waters. The MSY value represented the maximum average combined-species harvest that can be taken long term, combining the MSY harvests of the five main species. For the 2015 calendar year, a revised 450 t harvest quota was set for Crimson Snapper, Saddletail Snapper, Red Emperor, and other Emperor species, plus a tonnage allowance for other permitted species. The revised quota tonnage represented a considerable reduction from the 1250 t set in previous years. Industry raised questions about how the MSY was derived and why it was lower than early-1990s yield estimates. The purpose of this report is to explain the MSY estimates for east Gulf of Carpentaria waters. The 450 t MSY represents the best estimate currently available and is consistent with pre-2011 estimates.
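For readers unfamiliar with where an equilibrium MSY figure comes from, the classic Schaefer surplus production model gives MSY = rK/4. The parameter values below are purely illustrative (chosen so the result lands near 450 t) and do not reproduce the report's actual stock analyses:

```python
# Illustrative Schaefer surplus production calculation; r and K are
# assumed values, NOT parameters from the Gulf of Carpentaria analyses.

def schaefer_msy(r, K):
    """Schaefer model: MSY = r*K/4, achieved at biomass K/2."""
    return r * K / 4

r = 0.30      # intrinsic population growth rate (assumed)
K = 6000.0    # carrying capacity in tonnes (assumed)
print(f"equilibrium MSY = {schaefer_msy(r, K):.0f} t")
```

The point of the sketch is only that MSY is a model-derived equilibrium quantity, sensitive to the estimated growth rate and carrying capacity, which is why revised stock analyses can move it well below an earlier TACC.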