906 results for Multidimensional scaling
Abstract:
This thesis conducts a formal study of the poetry of Gloria Anzaldúa and Lorna Dee Cervantes, placing their work in dialogue with genre and style. These two Chicana poets are exemplary of politicised experimentation with poetics, underpinned by a keen awareness of the rich history of form, genre and style. In the work of each poet, two poetic modes are examined: one traditional, and one experimental. Anzaldúa’s uses of the dramatic monologue as a border genre, and her construction of [auto]poetics, stemming from her multi-genre, autobiographical approach to writing, are considered. Cervantes’s construction of a docupoetics that achieves depth of field by merging a multidimensional approach to aesthetics with highly politicised transnational content is investigated, as is her engagement with the long-standing poetic tradition of elegy via various formal points of entry. These poetic modes are explored primarily through close readings, supported by a multidisciplinary framework that includes Anzaldúa’s feminist theories of identity and writing, abjection theory, postcolonialism, and transnationalism. Overall, these four key areas demonstrate the ways in which aesthetics is a crucial consideration in exploring the broader issues of content and context in Chicana poetry.
Abstract:
The universality versus culture specificity of quantitative evaluations (negative-positive) of 40 events in world history was addressed using World History Survey data collected from 5,800 university students in 30 countries/societies. Multidimensional scaling using generalized Procrustes analysis indicated poor fit of data from the 30 countries to an overall mean configuration, indicating a lack of universal agreement as to the associational meaning of events in world history. Hierarchical cluster analysis identified one Western and two non-Western country clusters for which adequate multidimensional fit was obtained after item deletions. A two-dimensional solution for the three country clusters was identified, where the primary dimension was historical calamities versus progress and a weak second dimension was modernity versus resistance to modernity. Factor analysis further reduced the item inventory to identify a single concept with structural equivalence across cultures, Historical Calamities, which included man-made and natural, intentional and unintentional, predominantly violent but also nonviolent calamities. Less robust factors were tentatively named Historical Progress and Historical Resistance to Oppression. At the individual level, Historical Calamities and Historical Progress were both significant and independent predictors of willingness to fight for one’s country in a hierarchical linear model that also identified significant country-level variation in these relationships. Consensus around calamity but disagreement as to what constitutes historical progress is discussed in relation to the political culture of nations and lay perceptions of history as catastrophe.
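As a hedged illustration of the method this abstract describes (multidimensional scaling followed by Procrustes alignment across countries), the following Python sketch uses hypothetical rating data; the variable names and the simple rating-difference dissimilarity are assumptions, not taken from the World History Survey.

```python
# Hedged sketch (not the authors' code): per-country MDS configurations aligned by
# Procrustes analysis, with the disparity used as a rough measure of (mis)fit.
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial import procrustes

rng = np.random.default_rng(0)
n_countries, n_events = 30, 40

# Hypothetical data: mean evaluation of each of 40 events in each country.
ratings = rng.normal(size=(n_countries, n_events))

configs = []
for c in range(n_countries):
    # Dissimilarity between events within one country: absolute rating difference.
    d = np.abs(ratings[c][:, None] - ratings[c][None, :])
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    configs.append(mds.fit_transform(d))

# Procrustes-align every country's configuration to the first and record the misfit.
reference = configs[0]
disparities = [procrustes(reference, cfg)[2] for cfg in configs[1:]]
print("mean Procrustes disparity:", np.mean(disparities))
```

A poor overall fit, as reported above, would show up as large Procrustes disparities relative to the reference configuration.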
Abstract:
One of the fundamental findings in the congressional literature is that one or sometimes two dimensions can successfully describe roll-call voting. In this paper we investigate if we can reach the same conclusions about low dimensionality when we divide the roll-call agenda into subsets of relatively homogeneous subject matter. We are primarily interested in the degree to which the same ordering of representatives is yielded across these different groups of votes. To conduct our analysis we focus on all roll calls on the 13 annual appropriations bills across eight congresses. When we concentrate on these smaller issue areas, we find that voting is multidimensional and members do not vote in a consistent ideological fashion across all issue areas.
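The following sketch, assuming synthetic spatial-voting data rather than the actual appropriations roll calls, illustrates one common way to compare member orderings across issue areas: scale each subset of votes separately and correlate the recovered orderings.

```python
# Hedged, synthetic-data sketch: recover a legislator ordering from each vote subset
# via one-dimensional MDS on pairwise disagreement rates, then compare the orderings.
import numpy as np
from scipy.stats import spearmanr
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
n_members, n_votes = 100, 60

def simulate_votes(ideal_points, cutpoints):
    """Simple spatial voting: vote 'yea' (1) if the ideal point exceeds the cutpoint."""
    return (ideal_points[:, None] > cutpoints[None, :]).astype(int)

def recover_order(votes):
    """1-D MDS on the pairwise disagreement rate between members."""
    disagreement = np.abs(votes[:, None, :] - votes[None, :, :]).mean(axis=2)
    coords = MDS(n_components=1, dissimilarity="precomputed",
                 random_state=0).fit_transform(disagreement)
    return coords.ravel()

ideal = rng.normal(size=n_members)
order_issue_a = recover_order(simulate_votes(ideal, rng.normal(size=n_votes)))
# Second issue area: the same members, but with ideal points perturbed.
order_issue_b = recover_order(simulate_votes(ideal + 0.5 * rng.normal(size=n_members),
                                             rng.normal(size=n_votes)))

rho, _ = spearmanr(order_issue_a, order_issue_b)
# MDS orderings are only defined up to reflection, so compare absolute correlation.
print(f"rank correlation of member orderings across issue areas: {abs(rho):.2f}")
```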
Abstract:
Compressive sampling enables signal reconstruction using less than one measurement per reconstructed signal value. Compressive measurement is particularly useful in generating multidimensional images from lower-dimensional data. We demonstrate single-frame 3D tomography from 2D holographic data.
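A minimal sketch of the compressive-sampling principle mentioned here (reconstruction from fewer measurements than signal values), using a generic sparse 1-D signal and orthogonal matching pursuit rather than the authors' holographic 3D reconstruction:

```python
# Hedged sketch (not the authors' method): recover a sparse signal from random
# projections, with fewer measurements (m) than signal values (n).
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(2)
n, m, k = 256, 80, 8          # signal length, number of measurements, sparsity

# Sparse signal: k nonzero entries at random locations.
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

# Compressive measurements through a random Gaussian sensing matrix.
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x

# Reconstruction by orthogonal matching pursuit.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(A, y)
x_hat = omp.coef_

print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```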
Abstract:
Cognitive-emotional distinctiveness (CED), the extent to which an individual separates emotions from an event in the cognitive representation of the event, was explored in four studies. CED was measured using a modified multidimensional scaling procedure. The first study found that lower levels of CED in memories of the September 11 terrorist attacks predicted greater frequency of intrusive thoughts about the attacks. The second study revealed that CED levels are higher for negative events than for positive events, and that low CED levels in emotionally intense negative events are associated with a pattern of greater event-related distress. The third study replicated the findings of the previous study when examining CED levels in participants' memories of the 2004 Presidential election. The fourth study revealed that low CED in emotionally intense negative events is associated with worse mental health. We argue that CED is an adaptive and healthy coping feature of stressful memories.
Abstract:
Scheduling a set of jobs over a collection of machines to optimize a certain quality-of-service measure is one of the most important research topics in both computer science theory and practice. In this thesis, we design algorithms that optimize flow-time (or delay) of jobs for scheduling problems that arise in a wide range of applications. We consider the classical model of unrelated machine scheduling and resolve several long-standing open problems; we introduce new models that capture the novel algorithmic challenges in scheduling jobs in data centers or large clusters; we study the effect of selfish behavior in distributed and decentralized environments; and we design algorithms that strive to balance energy consumption and performance.
The technically interesting aspect of our work lies in the surprising connections we establish between approximation and online algorithms, economics, game theory, and queuing theory. It is the interplay of ideas from these different areas that lies at the heart of most of the algorithms presented in this thesis.
The main contributions of the thesis can be placed in one of the following categories.
1. Classical Unrelated Machine Scheduling: We give the first polylogarithmic approximation algorithms for minimizing the average flow-time and minimizing the maximum flow-time in the offline setting. In the online and non-clairvoyant setting, we design the first non-clairvoyant algorithm for minimizing the weighted flow-time in the resource augmentation model. Our work introduces the iterated rounding technique for offline flow-time optimization, and gives the first framework for analyzing non-clairvoyant algorithms for unrelated machines.
2. Polytope Scheduling Problem: To capture the multidimensional nature of the scheduling problems that arise in practice, we introduce the Polytope Scheduling Problem (PSP). The PSP generalizes almost all classical scheduling models, and also captures hitherto unstudied scheduling problems such as routing multi-commodity flows, routing multicast (video-on-demand) trees, and multidimensional resource allocation. We design several competitive algorithms for the PSP and its variants for the objectives of minimizing flow-time and completion time. Our work establishes many interesting connections between scheduling and market equilibrium concepts, fairness and non-clairvoyant scheduling, and the queuing-theoretic notion of stability and resource augmentation analysis.
3. Energy Efficient Scheduling: We give the first non-clairvoyant algorithm for minimizing the total flow-time + energy in the online and resource augmentation model for the most general setting of unrelated machines.
4. Selfish Scheduling: We study the effect of selfish behavior in scheduling and routing problems. We define a fairness index for scheduling policies called bounded stretch, and show that for the objective of minimizing the average (weighted) completion time, policies with small stretch lead to equilibrium outcomes with a small price of anarchy. Our work gives the first framework based on linear/convex programming duality for bounding the price of anarchy for general equilibrium concepts such as coarse correlated equilibrium.
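As a toy, hedged illustration of the flow-time objective defined at the start of this abstract (not code from the thesis), the following compares total flow-time on a single machine under first-come-first-served and shortest-processing-time-first orderings; the job data are made up.

```python
# Hedged illustration: total flow-time (completion time minus release time, summed
# over jobs) under two classic single-machine policies.
from dataclasses import dataclass

@dataclass
class Job:
    release: float   # arrival time
    size: float      # processing requirement

def total_flow_time_fcfs(jobs):
    """Process jobs in order of release (first come, first served)."""
    t, total = 0.0, 0.0
    for j in sorted(jobs, key=lambda j: j.release):
        t = max(t, j.release) + j.size
        total += t - j.release
    return total

def total_flow_time_spt(jobs):
    """Shortest-processing-time-first, for jobs all released at time 0."""
    t, total = 0.0, 0.0
    for j in sorted(jobs, key=lambda j: j.size):
        t += j.size
        total += t - j.release
    return total

jobs = [Job(0, 10), Job(0, 1), Job(0, 2)]
print(total_flow_time_fcfs(jobs), total_flow_time_spt(jobs))   # 34.0 vs 17.0
```

The gap between the two totals is the kind of delay that flow-time-minimizing schedulers, in far more general settings, try to close.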
Abstract:
Some theoretical notions from the onto-semiotic approach to mathematical knowledge and instruction (Godino, Contreras, Font, 2006) are applied to the analysis of a teaching experience on the concept of the limit of a function with upper-secondary (bachillerato) students. Within this theoretical framework, teaching-learning processes are modelled as a multidimensional stochastic process composed of six subprocesses (epistemic, teaching, learning, mediational, cognitive and emotional), each with its respective trajectory and potential states. In this work we focus on the epistemic dimension, showing some semiotic conflicts and limitations in the implemented institutional meaning.
Abstract:
The growth of computer power allows the solution of complex problems related to compressible flow, which is an important class of problems in modern-day CFD. Over the last 15 years or so, many review works on CFD have been published. This book concerns both mathematical and numerical methods for compressible flow. In particular, it provides a clear-cut introduction as well as an in-depth treatment of modern numerical methods in CFD. The book is organised in two parts. The first part consists of Chapters 1 and 2, and is mainly devoted to theoretical discussions and results. Chapter 1 concerns fundamental physical concepts and theoretical results in gas dynamics. Chapter 2 describes the basic mathematical theory of compressible flow using the inviscid Euler equations and the viscous Navier–Stokes equations. Existence and uniqueness results are also included. The second part consists of modern numerical methods for the Euler and Navier–Stokes equations. Chapter 3 is devoted entirely to the finite volume method for the numerical solution of the Euler equations and covers fundamental concepts such as the order of numerical schemes, stability and high-order schemes. The finite volume method is illustrated for the 1-D as well as the multidimensional Euler equations. Chapter 4 covers the theory of the finite element method and its application to compressible flow. A section is devoted to the combined finite volume–finite element method, and its background theory is also included. Throughout the book numerous examples are included to demonstrate the numerical methods. The book provides good insight into the numerical schemes, theoretical analysis, and validation of test problems. It is a very useful reference for applied mathematicians, numerical analysts, and practising engineers. It is also an important reference for postgraduate researchers in the field of scientific computing and CFD.
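For orientation only (standard textbook notation, assumed rather than quoted from the book), the 1-D Euler equations in conservation form and the generic finite volume cell update that Chapter 3 builds on can be written as:

```latex
% 1-D Euler equations in conservation form and the generic finite volume update
% (standard textbook form, not reproduced from the book under review).
\partial_t U + \partial_x F(U) = 0, \qquad
U = \begin{pmatrix} \rho \\ \rho u \\ E \end{pmatrix}, \quad
F(U) = \begin{pmatrix} \rho u \\ \rho u^2 + p \\ u\,(E + p) \end{pmatrix},
\qquad
U_i^{n+1} = U_i^{n} - \frac{\Delta t}{\Delta x}\left( F_{i+1/2}^{n} - F_{i-1/2}^{n} \right)
```

The choice of the numerical flux F_{i+1/2} and of the reconstruction determines the order and stability properties discussed in that chapter.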
Abstract:
This article takes a multidimensional or biopsychosocial conception of drug dependency as its starting point. Within this analytical framework, we advocate making the intercultural dimension more visible, since it is essential for the design and implementation of comprehensive intervention processes. We propose intercultural competence as a working model that can increase the capacities of institutions and professionals, a particularly important consideration in the case of social work, in order to address the aforementioned cultural dimension effectively. After an extensive review of the scientific literature, we have defined five processes that can contribute to strengthening an institution’s intercultural competence and four processes that can do the same for a professional’s intercultural competence. Although selected for application in the area of drug dependency, all of these processes can also prove useful in improving attention to any other kind of culturally diverse group or person.
Abstract:
The grading of crushed aggregate is usually carried out by sieving. We describe a new image-based approach to the automatic grading of such materials. The operational setting addressed is one in which the camera is located directly over a conveyor belt. Our approach characterizes the information content of each image, taking into account relative variation in the pixel data and resolution scale. In feature space, we find very good class separation using a multidimensional linear classifier. The innovation in this work includes (i) introducing an effective image-based approach into this application area, and (ii) our supervised classification using wavelet entropy-based features.
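A hedged sketch of the kind of pipeline this abstract describes, with hypothetical stand-in image patches and a plain linear discriminant in place of the authors' exact classifier; the function names and the synthetic "fine" versus "coarse" textures are illustrative assumptions.

```python
# Hedged sketch (not the published pipeline): wavelet entropy features per subband,
# followed by a linear classifier on labelled image patches.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def wavelet_entropy_features(patch, wavelet="db2", levels=3):
    """Shannon entropy of the normalised energy in each detail subband."""
    coeffs = pywt.wavedec2(patch, wavelet, level=levels)
    feats = []
    for band in coeffs[1:]:                       # detail subbands at each level
        for arr in band:
            e = arr.ravel() ** 2
            p = e / (e.sum() + 1e-12)
            feats.append(-(p * np.log(p + 1e-12)).sum())
    return np.array(feats)

rng = np.random.default_rng(3)
# Hypothetical stand-ins for conveyor-belt patches of fine vs coarse aggregate.
fine = [rng.normal(size=(64, 64)) for _ in range(40)]
coarse = [np.kron(rng.normal(size=(16, 16)), np.ones((4, 4))) for _ in range(40)]

X = np.array([wavelet_entropy_features(p) for p in fine + coarse])
y = np.array([0] * 40 + [1] * 40)
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```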
Abstract:
The greatest relaxation time for an assembly of three-dimensional rigid rotators in an axially symmetric bistable potential is obtained exactly in terms of continued fractions as a sum of the zero-frequency decay functions (averages of the Legendre polynomials) of the system. This is accomplished by studying the entire time evolution of the Green function (transition probability) by expanding the time-dependent distribution as a Fourier series and proceeding to the zero-frequency limit of the Laplace transform of that distribution. The procedure is entirely analogous to the calculation of the characteristic time of the probability evolution (the integral of the configuration-space probability density function with respect to the position co-ordinate) for a particle undergoing translational diffusion in a potential, a concept originally used by Malakhov and Pankratov (Physica A 229 (1996) 109). This procedure allowed them to obtain exact solutions of the Kramers one-dimensional translational escape rate problem for piecewise parabolic potentials. The solution was accomplished by posing the problem in terms of the appropriate Sturm-Liouville equation, which could be solved in terms of the parabolic cylinder functions. The method (as applied to rotational problems and posed in terms of recurrence relations for the decay functions, i.e., the Brinkman approach, cf. Blomberg, Physica A 86 (1977) 49, as opposed to the Sturm-Liouville one) demonstrates clearly that the greatest relaxation time, unlike the integral relaxation time, which is governed by a single decay function (albeit coupled to all the others in nonlinear fashion via the underlying recurrence relation), is governed by a sum of decay functions. The method is easily generalized to multidimensional state spaces by matrix continued fraction methods, allowing one to treat non-axially symmetric potentials, where the distribution function is governed by two state variables.
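For readers outside this literature, a hedged note in standard notation (assumed, not reproduced from the paper): the integral relaxation time of a normalised decay function f_1(t) and the greatest relaxation time are conventionally defined as

```latex
% Standard definitions assumed here, not quoted from the paper:
% f_1(t) is the normalised decay function (average of the first Legendre polynomial),
% \lambda_1 the smallest nonvanishing eigenvalue of the Fokker-Planck operator.
\tau_{\mathrm{int}} = \frac{1}{f_1(0)} \int_0^{\infty} f_1(t)\,\mathrm{d}t,
\qquad
\tau_{\mathrm{greatest}} = \frac{1}{\lambda_1}.
```

The abstract's point is that, for the rotational problem treated here, the latter is expressible as a sum over the zero-frequency decay functions rather than through a single one.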
Abstract:
There is abundant empirical evidence on the negative relationship between welfare effort and poverty. However, the poverty indicators traditionally used have been representative of the monetary approach, excluding its multidimensional reality from the analysis. Using three regression techniques for the period 1990-2010 and controlling for demographic and cyclical factors, this paper examines the relationship between social spending per capita, as the indicator of welfare effort, and poverty in up to 21 countries of the region. The proportion of the population with an income below the national basic basket of goods and services (PM1) and the proportion of the population with an income below 50% of the median income per capita (PM2) were the two poverty indicators taken from the monetary approach. From the capability approach, the proportion of the population with food inadequacy (PC1) and the proportion of the population without access to improved water sources or sanitation facilities (PC2) were used. The findings confirm that social spending is indeed useful for explaining changes in poverty (PM1, PC1 and PC2), as there is a strong negative and significant correlation between the variables both before and after controlling for demographic and cyclical factors. In two of the regression techniques, social spending per capita did not show a negative relationship with PM2. Countries with greater welfare effort over the period 1990-2010 were not necessarily those with the lowest level of poverty. Ultimately, social spending per capita was more useful for explaining changes in poverty under the capability approach.
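A hedged, self-contained sketch (synthetic data, illustrative variable names) of a pooled regression of a poverty indicator on social spending per capita with demographic and cyclical controls, in the spirit of the analysis summarised above:

```python
# Hedged sketch, synthetic data only: poverty rate regressed on social spending
# per capita plus demographic and cyclical controls.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 21 * 21   # e.g. 21 countries x 21 years (1990-2010)

df = pd.DataFrame({
    "social_spending_pc": rng.gamma(5.0, 100.0, n),
    "dependency_ratio": rng.normal(55, 8, n),        # demographic control
    "gdp_growth": rng.normal(3, 2, n),               # cyclical control
})
# Synthetic poverty rate that falls with spending, for illustration only.
df["poverty_rate"] = (40 - 0.02 * df.social_spending_pc
                      + 0.1 * df.dependency_ratio - 0.5 * df.gdp_growth
                      + rng.normal(0, 3, n))

model = smf.ols("poverty_rate ~ social_spending_pc + dependency_ratio + gdp_growth",
                data=df).fit()
print(model.params["social_spending_pc"])   # expected to be negative here
```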
Abstract:
The role played by public opinion in the development of current criminal policy justifies the growth of research aimed at assessing citizens' attitudes towards punishment. However, progress in this field has been limited by the use of rudimentary measurement instruments. The purpose of this study is therefore to explore the effect that certain variables relating to the offence and the offender have on public opinion, specifying their relative contribution and the interactions among them. To this end, a factorial survey design was used, creating a population of 256 case scenarios from the combination of four factors: the age of the young offender, their criminal record, their degree of involvement in the offence, and the type of offence committed. The scenarios were distributed in randomly ordered groups of eight cases and administered to 32 subjects. Binary logistic regression analyses were then applied. The results reveal that the violent nature of the offence, the active involvement of the young offender, and a prior criminal record are important predictors of punitive sentences, whereas age, a key variable in the configuration of juvenile justice, is not significant. The study thus shows the explanatory potential of this set of factors and discusses their theoretical and methodological implications for future research in this field.
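A hedged sketch of the analysis described above, with synthetic vignette responses in place of the study's data; the factor levels and coefficients are invented for illustration and are not the study's 256-scenario design.

```python
# Hedged sketch, synthetic vignettes: binary logistic regression of a punitive
# verdict on four offence/offender factors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 256
df = pd.DataFrame({
    "age": rng.choice([14, 15, 16, 17], n),
    "prior_record": rng.choice([0, 1], n),
    "active_role": rng.choice([0, 1], n),
    "violent_offence": rng.choice([0, 1], n),
})
# Synthetic outcome driven mainly by offence severity, record and role (not age).
logit = -1.0 + 1.5 * df.violent_offence + 1.0 * df.prior_record + 0.8 * df.active_role
df["punitive"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("punitive ~ age + prior_record + active_role + violent_offence",
                  data=df).fit(disp=0)
print(model.params)
```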
Abstract:
This article presents a novel classification of wavelet neural networks based on the orthogonality/non-orthogonality of neurons and the type of nonlinearity employed. On the basis of this classification, different network types are studied and their characteristics illustrated by means of simple one-dimensional nonlinear examples. For multidimensional problems, which are affected by the curse of dimensionality, the idea of spherical wavelet functions is considered. The behaviour of these networks is also studied for modelling a low-dimensional map.
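The following minimal sketch, not taken from the article, fits a one-dimensional wavelet network of the simplest kind (fixed Mexican-hat wavelet neurons, output weights fitted by least squares) to a toy nonlinear map:

```python
# Hedged sketch of a simple 1-D wavelet network: a fixed grid of dilated/translated
# Mexican-hat wavelets, with linear output weights fitted by least squares.
import numpy as np

def mexican_hat(x):
    """Ricker ('Mexican hat') mother wavelet."""
    return (1 - x**2) * np.exp(-x**2 / 2)

# Wavelet "neurons": a small grid of translations and dilations (assumed, not optimised).
translations = np.linspace(-3, 3, 9)
dilations = np.array([0.5, 1.0, 2.0])

def design_matrix(x):
    cols = [mexican_hat((x - t) / a) for a in dilations for t in translations]
    return np.column_stack(cols)

# Target: a simple one-dimensional nonlinear map, in the spirit of the article's examples.
x = np.linspace(-3, 3, 200)
y = np.sin(2 * x) * np.exp(-0.2 * x**2)

W = design_matrix(x)
weights, *_ = np.linalg.lstsq(W, y, rcond=None)
print("RMS fit error:", np.sqrt(np.mean((W @ weights - y) ** 2)))
```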
Abstract:
A scheme to obtain brilliant x-ray sources by coherent reflection of a counter-propagating pulse from laser-driven dense electron sheets is investigated theoretically and numerically in a self-consistent manner. A radiation pressure acceleration model for the dynamics of the electron sheets blown out from laser-irradiated ultrathin foils is developed and verified by PIC simulations. The first multidimensional and integral demonstration of the scheme by 2D PIC simulations is presented. It is found that the reflected pulse undergoes a Doppler upshift by a factor 4γ_z², where γ_z = (1 − v_z²/c²)^(−1/2) is the effective Lorentz factor of the electron sheet along its normal direction. Meanwhile, the pulse electric field is intensified by a factor depending on the electron density of the sheet in its moving frame, n_e/γ, where γ is the full Lorentz factor.