957 results for Speed up
Abstract:
This dissertation presents a study of the changes in the governance of higher education in Vietnam. The central aim of this research is to examine the origins of, and changes in, the power relationship between the Vietnamese state and the higher education institutions (HEIs), which results mainly from the interaction of these two actors. The power of both actors is socially constructed and is determined chiefly by their usefulness and their contributions to higher education; this work focuses in particular on the aspect of teaching quality. The study adopts a general governance perspective to explore the relationship between the state and the HEIs, and it uses Resource Dependence Theory (RDT) to examine how HEIs behave in a changing environment characterised by policy shifts and declining funding. Through an empirical investigation of government policy and of the internal governance and practices of four leading universities, the study concludes that, under the pressure to generate income, Vietnamese universities have developed both strategies and tactics to control resource flows and legitimacy. Committees responsible for decision-making and goal-setting, composed mostly of academics, are more powerful than the managers, so university initiatives largely involve academics. Based on the evolving patterns of resource contributions by academics and students to higher education, the study predicts an emerging governance configuration in which the dimensions of academic self-governance and the competitive market grow stronger and state regulation increases in a rational manner. The country's current institutional design and administrative system, the specific weighting and the coordination mechanisms, described as an effective oversight system between the three key actors (the state, the HEIs/academics and the students), will take a long time to identify and establish. In the current phase of searching for such a system, the government should strengthen management tools such as accreditation, reward-based and market-based instruments and information-based decision-making. It is also necessary to increase the transparency of policy and to disclose more information.
Abstract:
Optimal control theory is a powerful tool for solving control problems in quantum mechanics, ranging from the control of chemical reactions to the implementation of gates in a quantum computer. Gradient-based optimization methods are able to find high fidelity controls, but require considerable numerical effort and often yield highly complex solutions. We propose here to employ a two-stage optimization scheme to significantly speed up convergence and achieve simpler controls. The control is initially parametrized using only a few free parameters, such that optimization in this pruned search space can be performed with a simplex method. The result, considered now simply as an arbitrary function on a time grid, is the starting point for further optimization with a gradient-based method that can quickly converge to high fidelities. We illustrate the success of this hybrid technique by optimizing a geometric phase gate for two superconducting transmon qubits coupled with a shared transmission line resonator, showing that a combination of Nelder-Mead simplex and Krotov’s method yields considerably better results than either one of the two methods alone.
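A minimal sketch of the two-stage idea, assuming a toy infidelity functional and a three-Gaussian stage-1 parametrization rather than the actual transmon gate model; SciPy's L-BFGS-B stands in for Krotov's method, which SciPy does not provide:

```python
# Two-stage (simplex -> gradient) optimization sketch. The fidelity functional
# below is a toy stand-in, not the transmon geometric phase gate.
import numpy as np
from scipy.optimize import minimize

T, N = 10.0, 200
t = np.linspace(0.0, T, N)
target = np.sin(np.pi * t / T) ** 2          # hypothetical "optimal" pulse shape

def infidelity_from_field(u):
    """Toy objective: distance of the control field from the target shape."""
    return np.mean((u - target) ** 2)

def field_from_params(p):
    """Stage-1 parametrization: a few Gaussians described by 9 free parameters."""
    amps, centers, widths = np.split(p, 3)
    u = np.zeros_like(t)
    for a, c, w in zip(amps, centers, widths):
        u += a * np.exp(-((t - c) ** 2) / (2.0 * w ** 2))
    return u

# Stage 1: Nelder-Mead simplex in the pruned, low-dimensional search space.
p0 = np.concatenate([np.ones(3), [2.0, 5.0, 8.0], np.ones(3)])
res1 = minimize(lambda p: infidelity_from_field(field_from_params(p)),
                p0, method="Nelder-Mead")

# Stage 2: treat the resulting field simply as N free values on the time grid
# and refine with a gradient-based method.
u0 = field_from_params(res1.x)
res2 = minimize(infidelity_from_field, u0, method="L-BFGS-B")

print(f"stage 1 infidelity: {res1.fun:.3e}, stage 2 infidelity: {res2.fun:.3e}")
```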
Abstract:
Interviews with more than 40 leaders in the Boston area health care industry have identified a range of broadly felt critical problems. This document synthesizes these problems and places them in the context of work and family issues implicit in the organization of health care workplaces. It concludes with questions about possible ways to address such issues. The defining circumstance for the health care industry nationally as well as regionally at present is an extraordinary reorganization, not yet fully negotiated, in the provision and financing of health care. Hoped-for controls on the increased costs of medical care, specifically the widespread replacement of indemnity insurance by market-based managed care and business models of operation, have fallen far short of their promise. Pressures to limit expenditures have produced dispiriting conditions for the entire health care workforce, from technicians and aides to nurses and physicians. Under such strains, relations between managers and the workers providing care are uneasy, ranging from determined efforts to maintain respectful cooperation to adversarial negotiation. Taken together, the interviews identify five key issues affecting a broad cross-section of occupational groups, albeit in different ways: staffing shortages of various kinds throughout the health care workforce create problems for managers and workers and also for the quality of patient care; long work hours and inflexible schedules place pressure on virtually every part of the health care workforce, including physicians; degraded and unsupportive working conditions, often the result of workplace "deskilling" and "speed up," undercut previous modes of clinical practice; lack of opportunities for training and advancement exacerbates workforce problems in an industry where occupational categories and terms of work are in a constant state of flux; and professional and employee voices are insufficiently heard in conditions of rapid institutional reorganization and consolidation. Interviewees describe multiple impacts of these issues: on the operation of health care workplaces, on the well-being of the health care workforce, and on the quality of patient care. Also apparent in the interviews, but not clearly named and defined, is the impact of these issues on the ability of workers to attend well to the needs of their families, and the reciprocal impact of workers' family tensions on workplace performance. In other words, the same things that affect patient care also affect families, and vice versa. Some workers describe feeling both guilty about raising their own family issues when their patients' needs are at stake and resentful about the exploitation of these feelings by administrators making workplace policy. The different institutions making up the health care system have responded to their most pressing issues with a variety of specific stratagems, but few address the complexities connecting work and family. The MIT Workplace Center proposes a collaborative exploration of next steps to probe these complications and to identify possible locations within the health care system for workplace experimentation with outcomes benefiting all parties.
Abstract:
A key capability of data-race detectors is to determine whether one thread executes logically in parallel with another or whether the threads must operate in series. This paper provides two algorithms, one serial and one parallel, to maintain series-parallel (SP) relationships "on the fly" for fork-join multithreaded programs. The serial SP-order algorithm runs in O(1) amortized time per operation. In contrast, the previously best algorithm requires a time per operation that is proportional to Tarjan’s functional inverse of Ackermann’s function. SP-order employs an order-maintenance data structure that allows us to implement a more efficient "English-Hebrew" labeling scheme than was used in earlier race detectors, which immediately yields an improved determinacy-race detector. In particular, any fork-join program running in T₁ time on a single processor can be checked on the fly for determinacy races in O(T₁) time. Corresponding improved bounds can also be obtained for more sophisticated data-race detectors, for example, those that use locks. By combining SP-order with Feng and Leiserson’s serial SP-bags algorithm, we obtain a parallel SP-maintenance algorithm, called SP-hybrid. Suppose that a fork-join program has n threads, T₁ work, and a critical-path length of T∞. When executed on P processors, we prove that SP-hybrid runs in O((T₁/P + P·T∞) lg n) expected time. To understand this bound, consider that the original program obtains linear speed-up over a 1-processor execution when P = O(T₁/T∞). In contrast, SP-hybrid obtains linear speed-up when P = O(√(T₁/T∞)), but the work is increased by a factor of O(lg n).
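A minimal sketch of the English-Hebrew labeling idea, under the simplifying assumption that the labels are assigned offline for a fixed SP parse tree (the paper's contribution is maintaining the two orders on the fly with an order-maintenance structure); two threads are in series exactly when one precedes the other in both orders:

```python
class SPNode:
    """Node of an SP parse tree: leaves are threads, internal nodes are 'S' or 'P'."""
    def __init__(self, kind, children=(), name=None):
        self.kind, self.children, self.name = kind, list(children), name

def english_order(node, out):
    """Visit children of every node left to right."""
    if node.kind == "leaf":
        out.append(node.name)
    else:
        for c in node.children:
            english_order(c, out)

def hebrew_order(node, out):
    """Visit children of S-nodes left to right, of P-nodes right to left."""
    if node.kind == "leaf":
        out.append(node.name)
    else:
        kids = node.children if node.kind == "S" else reversed(node.children)
        for c in kids:
            hebrew_order(c, out)

def precedes(u, v, eng, heb):
    """u executes serially before v iff u precedes v in BOTH orders."""
    return eng[u] < eng[v] and heb[u] < heb[v]

def logically_parallel(u, v, eng, heb):
    """Threads are logically parallel iff the two orders disagree."""
    return not precedes(u, v, eng, heb) and not precedes(v, u, eng, heb)

# Example: S(a, P(b, c), d) -- b and c may race; a is in series with both.
leaf = lambda n: SPNode("leaf", name=n)
tree = SPNode("S", [leaf("a"), SPNode("P", [leaf("b"), leaf("c")]), leaf("d")])

e_list, h_list = [], []
english_order(tree, e_list)
hebrew_order(tree, h_list)
eng = {name: i for i, name in enumerate(e_list)}
heb = {name: i for i, name in enumerate(h_list)}

print(logically_parallel("b", "c", eng, heb))   # True: P-node siblings
print(precedes("a", "c", eng, heb))             # True: in series
```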
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable tracking task. The learning system is characterized by the use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robotics. In order to speed up the process, the learning phase has been carried out in a simulated environment and, in a second step, the policy has been transferred to and tested successfully on a real robot. Future work plans to continue the learning process on-line on the real robot while it performs the task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
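A hedged sketch of the simulate-then-transfer loop using a REINFORCE-style direct policy search; the ToyCableEnv dynamics, reward and linear Gaussian policy are illustrative assumptions, not the ICTINEU AUV controller:

```python
import numpy as np

class ToyCableEnv:
    """1-D stand-in: state = lateral offset from the cable, action = heading correction."""
    def __init__(self, rng):
        self.rng = rng
    def reset(self):
        self.x = self.rng.uniform(-1.0, 1.0)
        return self.x
    def step(self, a):
        self.x += -0.5 * a + 0.05 * self.rng.normal()     # simplified dynamics
        return self.x, -abs(self.x)                        # reward: stay over the cable

def rollout(env, theta, rng, sigma=0.1, horizon=50):
    """One episode with a linear-Gaussian policy; returns trajectory and return."""
    s, traj, ret = env.reset(), [], 0.0
    for _ in range(horizon):
        mu = theta[0] * s + theta[1]
        a = mu + sigma * rng.normal()
        traj.append((s, mu, a))
        s, r = env.step(a)
        ret += r
    return traj, ret

def train_in_simulation(episodes=2000, lr=1e-4, sigma=0.1, seed=0):
    """REINFORCE-style direct policy search, run entirely in the simulator."""
    rng = np.random.default_rng(seed)
    env, theta, baseline = ToyCableEnv(rng), np.zeros(2), 0.0
    for _ in range(episodes):
        traj, ret = rollout(env, theta, rng, sigma)
        baseline = 0.95 * baseline + 0.05 * ret            # running reward baseline
        for s, mu, a in traj:
            grad = (a - mu) / sigma**2 * np.array([s, 1.0])  # gradient of log-policy
            theta += lr * (ret - baseline) * grad
    return theta

# Stage 1: learn off-line in simulation.
theta = train_in_simulation()
# Stage 2 (not shown): transfer the learned state/action mapping to the real
# robot and keep refining it on-line with the same update rule.
print("policy parameters learned in simulation:", theta)
```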
Abstract:
The purpose of this case study is to analyse the UNFPA-UNICEF Joint Programme on FGM/C in Kenya in the light of postcolonial theory. Starting from the idea that FGM is a manifestation of gender inequalities, it is argued that the Joint Programme reproduces the image of the Kenyan woman as a victim of male power. On the basis of this image, the cultural order of the groups that follow this tradition is delegitimised, affecting the logics of unity and cohesion of the society. Analysing dynamics of this kind allows a better understanding of the intervention processes of international organisations in the social structures of fragile actors of the international system.
Abstract:
In this paper we extend the reuse of paths to shooting from a moving light source. In the classical algorithm, new paths have to be cast from each new position of the light source. We show that we can reuse all paths for all positions, obtaining in this way a theoretical maximum speed-up equal to the average length of the shooting path.
Abstract:
The doctoral thesis "Urbanismo Ambiental y Evaluación Estratégica" investigates and proposes a renewed role for urban planning, and especially for spatial planning, as instruments of value not only for the protection of the environment but also for the achievement of policies towards sustainable development. To this end, the thesis reflects on and strengthens the view of land as an environmental resource, also in its urban-planning function, and with it the consequences and interactions that such a conception may have in the European Community, given its competence, shared with the Member States, in environmental matters, and specifically its capacity to influence, through regulatory and non-regulatory techniques, urban and spatial planning as instruments for regulating and managing land as an environmental resource. In this new course of environmentally informed urban planning, or, if one prefers, in the race towards sustainable urban development, the technique of strategic environmental assessment of certain plans and programmes proves to be fundamental, especially if its transposition into the domestic legal systems of the Member States is carried out with full respect for the essential features defended in the thesis: a preventive rather than remedial technique, dynamic, capable of influencing the very choice of the plan model, and ultimately aimed at sustainability. With it, environmental urban planning can become widespread and the pending revision of urban and spatial planning can be accelerated, based on the new conception of land as an environmental resource and through the reception and integration of environmental principles.
Abstract:
This doctoral thesis, entitled "Computational development of quantum molecular similarity", deals essentially with the computation of similarity measures based on the comparison of electron density functions.

The first chapter, Quantum similarity, is introductory. It describes electron probability density functions and their significance within the framework of quantum mechanics. Their essential aspects and the mathematical conditions they must satisfy are made explicit, with a view to a better understanding of the electron density models proposed. Electron densities are presented, mentioning the Hohenberg-Kohn theorems and outlining Bader's theory, as fundamental quantities in the description of molecules and in the understanding of their properties.

The chapter Models of molecular electron densities presents original computational procedures for fitting density functions to models expanded in terms of 1s Gaussians centred on the nuclei. The physico-mathematical constraints associated with probability distributions are introduced rigorously in the procedure called the Atomic Shell Approximation (ASA). This procedure, implemented in the ASAC program, starts from a quasi-complete functional space from which the functions, or shells, of the expansion are selected variationally, in accordance with the non-negativity requirements. The quality of these densities and of the derived similarity measures is verified extensively. The ASA model is extended to dynamic representations, physically more accurate in that they are affected by nuclear vibrations, in order to explore the effect of the damping of the nuclear peaks on molecular similarity measures. The comparison of the dynamic densities with the static ones shows a rearrangement in the dynamic densities, consistent with what would constitute a manifestation of the quantum Le Chatelier principle. The ASA procedure, explicitly consistent with the N-representability conditions, is also applied to the direct determination of hydrogen-like electron densities in a density functional theory context.

The chapter Global maximization of the similarity function presents original algorithms for determining the maximum overlap of molecular electron densities. Quantum molecular similarity measures are identified with the maximum overlap, so that the distance between molecules is measured independently of the reference frames in which the electron densities are defined. Starting from the global solution in the limit of densities infinitely compacted onto the nuclei, three levels of approximation are proposed for the systematic, non-stochastic exploration of the similarity function, enabling the efficient identification of the global maximum as well as of the different local maxima. An original parametrization of the overlap integrals through fits to Lorentzian functions is also proposed, as a computational acceleration technique. In the practice of structure-activity relationships, these advances enable the efficient implementation of quantitative similarity measures and, in parallel, provide a fully automatic molecular alignment methodology.

The chapter Similarities of atoms in molecules describes an algorithm for comparing Bader atoms, that is, three-dimensional regions delimited by zero-flux surfaces of the electron density function. The quantitative character of these similarities makes it possible to measure rigorously the chemical notion of transferability of atoms and functional groups. The zero-flux surfaces and the integration algorithms used have been published recently and constitute the most accurate approach for computing atomic properties.

Finally, the chapter Similarities in crystal structures proposes an original definition of similarity, specific to the comparison of the notions of smoothness or softness in the phonon distribution associated with the crystal structure. These concepts appear in studies of superconductivity because of the influence of electron-phonon interactions on the transition temperatures to the superconducting state. Applying this methodology to the analysis of BEDT-TTF salts reveals structural correlations between superconducting and non-superconducting salts, in line with hypotheses put forward in the literature on the relevance of certain interactions.

The thesis concludes with an appendix containing the ASAC program, an implementation of the ASA algorithm, and a final chapter with bibliographic references.
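A minimal sketch of the core ASA fitting step, assuming a hydrogen-like reference density and a fixed set of trial exponents rather than the ASAC implementation; non-negativity of the shell coefficients is enforced here with non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

# Reference density: a hydrogen-like 1s density (atomic units), as a toy stand-in.
r = np.linspace(1e-3, 10.0, 400)
rho_ref = np.exp(-2.0 * r) / np.pi

# Candidate 1s Gaussian shells g_k(r) = (alpha_k/pi)^(3/2) exp(-alpha_k r^2),
# each normalized to unit integral over R^3; exponents on a geometric grid.
alphas = np.geomspace(0.05, 50.0, 12)
shells = np.stack([(a / np.pi) ** 1.5 * np.exp(-a * r**2) for a in alphas], axis=1)

# Multiplying rows by r weights the squared residual by r^2, mimicking the
# radial volume element; nnls enforces the non-negativity of the coefficients
# required for the fit to remain a valid probability density.
w = r
coeffs, residual = nnls(shells * w[:, None], rho_ref * w)

rho_fit = shells @ coeffs
print(f"shells retained: {np.count_nonzero(coeffs)} of {len(alphas)}, "
      f"weighted residual: {residual:.2e}")
```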
Abstract:
The work developed in this thesis deepens and contributes innovative solutions in the field of the correspondence problem in underwater images. In these environments, what really complicates the processing tasks is the lack of well-defined contours caused by blurred images, a fact that is mainly due to deficient lighting or to the lack of uniformity of artificial lighting systems. The objectives achieved in this thesis can be highlighted in two main directions. To improve the motion estimation algorithm, a new method was proposed that introduces texture parameters to reject false correspondences between image pairs. A series of tests on real underwater images was carried out to select the most suitable strategies. In order to achieve real-time results, an innovative VLSI architecture is proposed for the implementation of some parts of the motion estimation algorithm with high computational cost.
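A hedged sketch of the texture-based rejection idea in plain NumPy; the local-variance measure, window size and threshold are illustrative choices, not the thesis's texture parameters or its VLSI implementation:

```python
import numpy as np

def local_variance(img, y, x, half=7):
    """Variance of the grey levels in a (2*half+1)^2 window around (y, x)."""
    win = img[max(0, y - half): y + half + 1, max(0, x - half): x + half + 1]
    return float(win.var())

def filter_matches(img1, img2, matches, tex_threshold=25.0):
    """Keep a candidate correspondence only if both windows contain enough texture."""
    kept = []
    for (y1, x1), (y2, x2) in matches:
        if (local_variance(img1, y1, x1) >= tex_threshold and
                local_variance(img2, y2, x2) >= tex_threshold):
            kept.append(((y1, x1), (y2, x2)))
    return kept

# Usage sketch: 'matches' would come from a block-matching motion estimator.
rng = np.random.default_rng(0)
img1 = rng.integers(0, 256, (240, 320)).astype(np.float32)
img2 = np.roll(img1, (2, 3), axis=(0, 1))
matches = [((50, 60), (52, 63)), ((120, 200), (122, 203))]
print(len(filter_matches(img1, img2, matches)))
```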
Abstract:
We demonstrate that it is possible to link multi-chain molecular dynamics simulations with the tube model using a single-chain slip-link model as a bridge. This hierarchical approach allows a significant speed-up of simulations, permitting us to span the time scales relevant for a comparison with the tube theory. By fitting the mean-square displacement of individual monomers in molecular dynamics simulations with the slip-spring model, we show that it is possible to predict the stress relaxation. Then, we analyze the stress relaxation from slip-spring simulations in the framework of the tube theory. In the absence of constraint release, we establish that the relaxation modulus can be decomposed as the sum of contributions from fast and longitudinal Rouse modes and tube survival. Finally, we discuss some open questions regarding possible future directions that could be profitable in rendering the tube model quantitative, even for mildly entangled polymers.
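A hedged sketch of the MSD-matching step, assuming tabulated monomer MSD curves and a simple two-parameter (time-unit and length-unit) mapping fitted on log-log scales, rather than the paper's actual calibration:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical tabulated data: g1(t) from MD and from a slip-spring run,
# each in its own internal units (toy stand-in curves).
t_md  = np.geomspace(1.0, 1e5, 60)
g1_md = 0.8 * t_md**0.5 / (1.0 + (t_md / 3e3)**0.25)
t_ss  = np.geomspace(1.0, 1e5, 60)
g1_ss = 1.7 * t_ss**0.5 / (1.0 + (t_ss / 1e3)**0.25)

def mismatch(params):
    """Squared log-distance after rescaling the slip-spring time and length units."""
    log_tau, log_b2 = params
    g1_mapped = np.exp(log_b2) * np.interp(t_md / np.exp(log_tau), t_ss, g1_ss)
    return np.sum((np.log(g1_mapped) - np.log(g1_md)) ** 2)

res = minimize(mismatch, x0=[0.0, 0.0], method="Nelder-Mead")
tau_scale, b2_scale = np.exp(res.x)
print(f"time-unit ratio: {tau_scale:.3g}, length^2 ratio: {b2_scale:.3g}")
```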
Abstract:
The stratospheric sudden warming in the Southern Hemisphere (SH) in September 2002 was unexpected for two reasons. First, planetary wave activity in the Southern Hemisphere is very weak, and midwinter warmings have never been observed, at least not since observations of the upper stratosphere became regularly available. Second, the warming occurred in a west phase of the quasi-biennial oscillation (QBO) in the lower stratosphere. This is unexpected because warmings are usually considered to be more likely in the east phase of the QBO, when a zero wind line is present in the winter subtropics and hence confines planetary wave propagation to higher latitudes closer to the polar vortex. At first, this evidence suggests that the sudden warming must therefore be simply a result of anomalously strong planetary wave forcing from the troposphere. However, recent model studies have suggested that the midwinter polar vortex may also be sensitive to the equatorial winds in the upper stratosphere, the region dominated by the semiannual oscillation. In this paper, the time series of equatorial zonal winds from two different data sources, the 40-yr ECMWF Re-Analysis (ERA) and the Met Office assimilated dataset, are reviewed. Both suggest that the equatorial winds in the upper stratosphere above 10 hPa were anomalously easterly in 2002. Idealized model experiments are described in which the modeled equatorial winds were relaxed toward these observations for various years to examine whether the anomalous easterlies in 2002 could influence the timing of a warming event. It is found that the 2002 equatorial winds speed up the evolution of a warming event in the model. Therefore, this study suggests that the anomalous easterlies in the 1–10-hPa region may have been a contributory factor in the development of the observed SH warming. However, it is concluded that it is unlikely that the anomalous equatorial winds alone can explain the 2002 warming event.
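A minimal sketch of the Newtonian relaxation ("nudging") used to constrain the modeled equatorial winds toward observations; the timestep, relaxation timescale and wind profiles are illustrative assumptions:

```python
import numpy as np

def nudge_step(u_model, u_obs, dynamics_tendency, dt=1800.0, tau=5 * 86400.0):
    """One timestep: free dynamics plus a relaxation term -(u - u_obs)/tau."""
    return u_model + dt * (dynamics_tendency - (u_model - u_obs) / tau)

# Toy vertical profile of equatorial zonal wind (m/s) on a few pressure levels.
levels_hpa = np.array([1.0, 3.0, 10.0, 30.0, 100.0])
u_model = np.array([10.0, 5.0, 0.0, -5.0, 8.0])
u_obs_2002 = np.array([-25.0, -20.0, -10.0, -5.0, 8.0])   # anomalous easterlies aloft

for _ in range(48 * 30):                                   # one month of half-hour steps
    u_model = nudge_step(u_model, u_obs_2002, dynamics_tendency=0.0)
print(np.round(u_model, 1))                                # close to the observed profile
```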
Abstract:
Visual exploration of scientific data in the life science area is a growing research field due to the large amount of available data. Kohonen's Self-Organizing Map (SOM) is a widely used tool for the visualization of multidimensional data. In this paper we present a fast learning algorithm for SOMs that uses a simulated annealing method to adapt the learning parameters. The algorithm has been adopted in a data analysis framework for the generation of similarity maps. Such maps provide an effective tool for the visual exploration of large and multidimensional input spaces. The approach has been applied to data generated during the High Throughput Screening of molecular compounds; the generated maps allow a visual exploration of molecules with similar topological properties. The experimental analysis on real-world data from the National Cancer Institute shows the speed-up of the proposed SOM training process in comparison to a traditional approach. The resulting visual landscape groups molecules with similar chemical properties in densely connected regions.
Abstract:
The Self-Organizing Map (SOM) algorithm has been used extensively for analysis and classification problems. For this kind of problem, datasets are becoming larger and larger, and it is necessary to speed up SOM learning. In this paper we present an application of the simulated annealing (SA) procedure to the SOM learning algorithm. The goal of the algorithm is to obtain fast learning and better performance in terms of the matching of input data and the regularity of the obtained map. An advantage of the proposed technique is that it preserves the simplicity of the basic algorithm. Several tests, carried out on different large datasets, demonstrate the effectiveness of the proposed algorithm in comparison with the original SOM and with some of its modifications introduced to speed up learning.
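A hedged sketch of one way to couple SOM training with a simulated-annealing acceptance rule for the learning parameters; the specific perturbation and acceptance scheme below is an illustration, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def som_epoch(weights, data, lr, radius, grid):
    """One pass of the standard SOM update over the dataset."""
    for x in data:
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))        # best matching unit
        dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
        h = np.exp(-dist2 / (2.0 * radius ** 2))                 # neighbourhood function
        weights += lr * h[:, None] * (x - weights)
    return weights

def quantization_error(weights, data):
    d2 = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return np.sqrt(d2.min(axis=1)).mean()

# 10x10 map trained on 3-D toy data.
side, data = 10, rng.random((500, 3))
grid = np.array([(i, j) for i in range(side) for j in range(side)], dtype=float)
weights = rng.random((side * side, 3))

lr, radius, temp = 0.5, side / 2.0, 1.0
err = quantization_error(weights, data)
for epoch in range(30):
    # Propose perturbed learning parameters and accept with an SA rule.
    lr_new = np.clip(lr + 0.1 * rng.normal(), 0.01, 1.0)
    radius_new = np.clip(radius + 0.5 * rng.normal(), 0.5, side)
    w_new = som_epoch(weights.copy(), data, lr_new, radius_new, grid)
    err_new = quantization_error(w_new, data)
    if err_new < err or rng.random() < np.exp(-(err_new - err) / temp):
        weights, err, lr, radius = w_new, err_new, lr_new, radius_new
    temp *= 0.9                                                  # cooling schedule
print(f"final quantization error: {err:.4f}")
```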
Abstract:
A study was conducted in the forest-steppe region of the Loess Plateau to provide insight into the factors affecting the process of vegetation establishment, and to provide recommendations for the selection of indigenous species in order to speed up the succession process and to allow the establishment of vegetation more resistant to soil erosion. Four distinctive vegetation types were identified, and their distribution was affected not only by the time since abandonment but also by other environmental factors, mainly soil water and total P in the upper soil layers. One of the vegetation types, dominated by Artemisia scoparia, formed the early successional stage after abandonment while the other three types formed later successional stages with their distribution determined by the soil water content and total P. It can be concluded that the selection of appropriate species for introduction to accelerate succession should be determined by the local conditions and especially the total P concentration and soil water content.