510 results for Guil unification


Relevance:

10.00%

Publisher:

Abstract:

Discussions around proposals for university reform in Brazil include, among other topics, the conception of the university titled "New University", whose structural origin lies in the higher-education reform bill and in the unification of the foundations of European higher education (the Bologna Process). At its core, the Bologna Process has imposed a series of transformations, among them the promotion of mobility as a stimulus to inter-institutional cooperation, intended to enable a better and broader qualification of students. Nevertheless, this is precisely one of the main points handled poorly by the Brazilian institutions that have adopted this model of higher education. An example is the Bachelor of Science and Technology (BC&T) at the Federal University of Rio Grande do Norte (UFRN), where there are problems of an internal order, represented by the difficulty of reusing completed course credits, as well as of an external order, in cases of inter-institutional transfers. Because of this, and since this is a typical problem involving multiple criteria, the aim of this study is to propose a multicriteria model for the inter-cycle selection process of the BC&T at UFRN that addresses the issue of mobility. The study is exploratory and of a case-study nature, using bibliographic and documentary research as well as semi-structured interviews as data-collection tools. To elaborate the model, the five phases most commonly used in operational-research problem modeling were applied to a sample of 91 BC&T students. As a result, we obtained a model that addresses both the internal and the external mobility of the school and that, moreover, proved more robust and fair than the current BC&T model and than the one used in other UFRN courses, taking into account the results expected by the decision makers.
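The abstract does not disclose which multicriteria method was actually adopted; purely as an illustration, an inter-cycle selection of this kind is often sketched as a weighted-sum ranking over normalized criteria. All names, weights, and student data below are hypothetical.

```python
# Illustrative weighted-sum multicriteria ranking (hypothetical criteria
# and weights; the study's actual model is not specified in the abstract).

def rank_candidates(candidates, weights):
    """Score each candidate by a weighted sum of min-max normalized criteria."""
    criteria = weights.keys()
    # Per-criterion bounds across the candidate pool, for normalization to [0, 1].
    bounds = {c: (min(x[c] for x in candidates), max(x[c] for x in candidates))
              for c in criteria}

    def norm(value, c):
        lo, hi = bounds[c]
        return 0.0 if hi == lo else (value - lo) / (hi - lo)

    scored = [(sum(weights[c] * norm(cand[c], c) for c in criteria), cand["name"])
              for cand in candidates]
    return sorted(scored, reverse=True)  # best score first

students = [
    {"name": "A", "gpa": 8.5, "credits": 1800, "mobility": 1},
    {"name": "B", "gpa": 7.0, "credits": 2100, "mobility": 0},
    {"name": "C", "gpa": 9.1, "credits": 1500, "mobility": 1},
]
weights = {"gpa": 0.5, "credits": 0.3, "mobility": 0.2}
print(rank_candidates(students, weights))
```

A real model would also need the decision makers' preference elicitation that the study mentions; the normalization and weighting here only stand in for that step.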

Relevance:

10.00%

Publisher:

Abstract:

In the context of a renormalizable supersymmetric SO(10) Grand Unified Theory, we consider the fermion mass matrices generated by the Yukawa couplings to a 10 ⊕ 120 ⊕ 126-bar representation of scalars. We perform a complete investigation of the possibilities of imposing flavour symmetries in this scenario; the purpose is to reduce the number of Yukawa coupling constants in order to identify potentially predictive models. We have found that there are only 14 inequivalent cases of Yukawa coupling matrices, out of which 13 cases are generated by Z(n) symmetries, with suitable n, and one case is generated by a Z(2) x Z(2) symmetry. A numerical analysis of the 14 cases reveals that only two of them, dubbed A and B in the present paper, allow good fits to the experimentally known fermion masses and mixings. (C) 2016 The Authors. Published by Elsevier B.V.
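For orientation: in renormalizable SO(10) models of this type, the fermion mass matrices are linear combinations of three Yukawa matrices, H (coupling to the 10, symmetric), F (to the 126-bar, symmetric), and G (to the 120, antisymmetric). A commonly used parametrization takes a form like the one below; the precise coefficients and signs depend on conventions and on the vacuum expectation values, so this is only a sketch, not the paper's own formulas.

```latex
\begin{aligned}
M_d    &= H + F + G, \\
M_u    &= r \left( H + s\,F + t_u\,G \right), \\
M_\ell &= H - 3F + t_\ell\,G, \\
M_D    &= r \left( H - 3s\,F + t_D\,G \right).
\end{aligned}
```

Imposing a flavour symmetry forces texture zeros or relations among the entries of H, F, and G, which is what reduces the number of independent Yukawa couplings and makes some cases predictive.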

Relevance:

10.00%

Publisher:

Abstract:

Ozonation process and device for the treatment of water contaminated with pesticides and other organic pollutants of agricultural origin, in which a volume of water is introduced into a reactor and a mixture of air and ozone is bubbled through it. In the aqueous phase, specific pH and hydrogen peroxide concentration conditions are maintained automatically, generating a high concentration of radical species that oxidize the pollutants. In this way, both the original pollutants and the intermediate products are eliminated, reaching either total mineralization or reduction to small, highly oxidized organic molecules of low hazard, such as oxalic acid and formic acid, which allows the water to be reused or subsequently discharged.
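The patent states that pH and hydrogen peroxide concentration are maintained automatically, but gives no control details; as a sketch only, the simplest such scheme is an on/off (bang-bang) dosing loop that keeps each variable inside a target band. Sensor readings, set-points, and dosing actions below are all hypothetical.

```python
# Toy on/off dosing controller keeping pH and H2O2 concentration inside
# target bands, as a sketch of the automated control the patent describes.
# The set-point bands and action names are invented for illustration.

def control_step(ph, h2o2_mgL, ph_band=(2.8, 3.2), h2o2_band=(40.0, 60.0)):
    """Return the dosing actions for one control cycle."""
    actions = []
    if ph > ph_band[1]:
        actions.append("dose_acid")        # lower pH back toward the band
    elif ph < ph_band[0]:
        actions.append("dose_base")        # raise pH back toward the band
    if h2o2_mgL < h2o2_band[0]:
        actions.append("dose_peroxide")    # replenish consumed H2O2
    elif h2o2_mgL > h2o2_band[1]:
        actions.append("hold_peroxide")    # stop dosing until it is consumed
    return actions

print(control_step(ph=3.6, h2o2_mgL=35.0))  # pH too high, peroxide too low
```

An industrial reactor would use a smoother PI/PID loop rather than on/off dosing, but the band-keeping logic is the same idea.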

Relevance:

10.00%

Publisher:

Abstract:

Procedure and device for the integrated treatment of plant waste from agricultural holdings and of the wastewater it generates. Procedure and device for the treatment of plant waste in which a volume of waste undergoes a process integrating the following operations: (i) shear cutting, (ii) homogenization by vibration, (iii) washing with surfactants in a rotating cylinder, (iv) drying on a continuous belt with a hot-air jet, and (v) optional drying in a rotating tank fitted with a hot-air jet, when the moisture content must be further reduced. The process finally yields a stable residue, smaller in volume than the initial one and suitable for industrial use, together with a stream of wash water with a high concentration of pesticides, microorganisms, surfactants and particles. The wash-water stream is filtered and taken to a tank reactor where oxidative treatment with ozone and hydrogen peroxide mineralizes the pollutants present, allowing the water to be reused.

Relevance:

10.00%

Publisher:

Abstract:

Procedure for the purification of triglycerides containing gamma-linolenic acid in the sn-2 position. In order to purify triglycerides containing gamma-linolenic acid from natural sources, a gravimetric normal-phase chromatographic column is used, working with a polarity gradient of biocompatible solvents. This achieves the purification of triglycerides that carry one or more gamma-linolenic acid molecules in their structure, which can then be used for various purposes. This methodology can be applied at industrial scale, since it is easily scalable, unlike other techniques that work at analytical scale but present serious drawbacks in cost and staff training when used for industrial purposes, such as high-performance liquid chromatography (HPLC).

Relevance:

10.00%

Publisher:

Abstract:

Procedure for the purification of triglycerides containing stearidonic acid in the sn-2 position. The invention relates to a procedure for the purification of triglycerides (TGs) rich in stearidonic acid (SDA) in the sn-2 position by gravimetric column chromatography, to an extract of sn-2 SDA-rich TGs obtained by this procedure, and to its use in industry.

Relevance:

10.00%

Publisher:

Abstract:

The paper analyzes the development of cultural policy in Italy from the first sixteenth-century experiences to the beginning of the twenty-first century, with the institutional reform initiated by Minister Dario Franceschini. In the pre-unification states, several local rulers made important contributions by imposing conservation policies to prevent the dispersal of works of art. After the unification of Italy (1861), the laws protecting the national heritage helped launch the first important initiatives, which developed in practice only at the end of the twentieth century. Major institutional and regulatory innovations were activated in the twenty-first century, and this paper provides some important insights into and deeper analysis of them.

Relevance:

10.00%

Publisher:

Abstract:

The dynamization of integration processes in Europe has generated numerous research topics for political analysis. Border integration is an expression of the broader unification processes of certain structures. It is also a manifestation of the observation that people think globally, but function locally. The European integration perspective is therefore practically implemented in micro structures, exemplified by border twin towns. The objective of this paper is to revive the micro perspective as a useful approach in the investigation of integration processes. This perspective is applied in the field of border studies, which focus on research into the transformation of European borders resulting from integration processes, as well as on the transformations of the concepts of statehood, territoriality and sovereignty. It is assumed that these phenomena are definitely more observable at the outskirts of states than in their centers. Theoretical and empirical considerations are based on the example of border twin towns, as the European units of local government that integrate across borders. The main differences between the integration of towns in Western Europe and Central and Eastern Europe are also indicated in the analysis.

Relevance:

10.00%

Publisher:

Abstract:

Research into borders and frontiers in the context of European integration has evolved, leading to the question of the shape of the external borders of the EU and their organization in relation to the external surroundings. The approach to how the unification processes of the continent are presented has recently changed, and the Union is being perceived through its peripheries. The one-way model of the flow of ideas from the center to the peripheries has been replaced by a two-way one. This allows us to use the Westphalian, imperial and neo-medieval geopolitical models to analyze the EU and, consequently, the four geo-strategies that are regionally diversified in the northern, eastern and southern peripheries of the Union. Nevertheless, it is the periphery that plays the key role and initiates certain types of relations with neighbors, whereas the center approves of them and modifies them according to its own requirements.

Relevance:

10.00%

Publisher:

Abstract:

Observing system experiments (OSEs) are carried out over a 1-year period to quantify the impact of Argo observations on the Mercator Ocean 0.25° global ocean analysis and forecasting system. The reference simulation assimilates sea surface temperature (SST), SSALTO/DUACS (Segment Sol multi-missions dALTimetrie, d'orbitographie et de localisation précise/Data unification and Altimeter combination system) altimeter data and Argo and other in situ observations from the Coriolis data center. Two other simulations are carried out in which all Argo data and half of the Argo data, respectively, are withheld. Assimilating Argo observations has a significant impact on analyzed and forecast temperature and salinity fields at different depths. Without Argo data assimilation, large errors occur in analyzed fields, as estimated from the differences when compared with in situ observations. For example, in the 0–300 m layer, RMS (root mean square) differences between analyzed fields and observations reach 0.25 psu and 1.25 °C in the western boundary currents and 0.1 psu and 0.75 °C in the open ocean. The impact of the Argo data in reducing observation–model forecast differences is also significant from the surface down to a depth of 2000 m. Differences between in situ observations and forecast fields are thus reduced by 20 % in the upper layers and by up to 40 % at a depth of 2000 m when Argo data are assimilated. At depth, the most impacted regions in the global ocean are the Mediterranean outflow, the Gulf Stream region and the Labrador Sea. A significant degradation can be observed when only half of the data are assimilated. Therefore, Argo observations matter to constrain the model solution, even for an eddy-permitting model configuration. The impact of the Argo floats' data assimilation on other model variables is briefly assessed: the improvement of the fit to Argo profiles does not lead globally to unphysical corrections to the sea surface temperature and sea surface height.
The main conclusion is that the performance of the Mercator Ocean 0.25° global data assimilation system is heavily dependent on the availability of Argo data.
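The RMS differences quoted above are the standard misfit statistic between analyzed model fields and collocated in situ observations. As a sketch only (the profile values below are invented, not Argo data), the computation is:

```python
# Sketch of the RMS misfit statistic: root-mean-square difference between
# analyzed model fields and in situ observations at collocated points.
# The temperature values below are made up for illustration.
import math

def rms_difference(analysis, observations):
    """RMS of analysis-minus-observation differences at collocated points."""
    assert len(analysis) == len(observations)
    return math.sqrt(sum((a - o) ** 2 for a, o in zip(analysis, observations))
                     / len(analysis))

analysis_T = [18.2, 17.9, 16.5, 15.1]   # degC, model analysis at profile points
observed_T = [18.0, 18.3, 16.0, 15.2]   # degC, hypothetical float observations
print(round(rms_difference(analysis_T, observed_T), 3))
```

In the OSEs, this statistic is accumulated over depth layers (e.g., 0–300 m) and regions, which is how the 0.25 psu / 1.25 °C figures for the western boundary currents are obtained.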

Relevance:

10.00%

Publisher:

Abstract:

Objectives: To list the essential variables that balanced scorecards must include in order to cover all basic areas of work in a hospital radiopharmacy unit, whose proper management can be key to optimizing the available resources; and, secondly, to enumerate the benefits for daily working practice after their integration. Methods: Review of the published literature on balanced scorecards, selecting the variables so that the radiopharmacist can take an active role in improving his or her area of work. Programs built in Microsoft Access are used for integrated management. Several modules administer all the information, from the prescription and scheduling of patients (each assigned a code) until the examination is performed. Other variables are also recorded, such as: deadline date and time for ordering each radiopharmaceutical from the supplier; date of the medical test; management of generator elutions and cold kits; staff work shifts; logging of typified incidents and of reception, labeling, quality-control and dispensing data for each radiopharmaceutical (ensuring traceability); detection of deviations between calibrated and measured activity; dispensed activity and activity available in real time; management of radioactive-waste disposal, stocks and expiry dates; dates of upcoming equipment inspections; archiving of standard operating procedures; unit-conversion tools; and the recording of clinical reports. Results: The specialized programs manage the information handled in the radiopharmacy unit, facilitating cost-effective decisions.
The parameters analyzed are: number of preparations made and activity handled; possible incidents in any of the daily processes; percentage of satisfactory resolutions that do not result in lack of availability; correct traceability of the radiopharmaceuticals; percentage of satisfactory quality controls; evolution of consumption by type of radiopharmaceutical, etc. The improved order management ensures that the radiopharmaceutical needed for each examination is available. Conclusions: These new balanced scorecards are useful for optimizing orders and radiopharmaceuticals, ensuring traceability, managing inventory, clinical reports and radioactive waste, and evaluating the efficiency of the radiopharmacy unit, allowing these data to be integrated with other healthcare-management software. This methodology can be applied in primary-care health centers to focus staff on their care and operational duties.
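One of the scorecard checks mentioned above is the detection of deviations between the calibrated (ordered) activity of a radiopharmaceutical and the activity actually measured on reception. A minimal sketch of such a check, with a hypothetical tolerance and invented activity values, could look like:

```python
# Sketch of one scorecard check: flagging deviations between the calibrated
# activity of a received radiopharmaceutical and the activity measured in the
# dose calibrator. The 10% tolerance and the MBq figures are hypothetical.

def activity_deviation(calibrated_MBq, measured_MBq, tolerance_pct=10.0):
    """Return (deviation in percent, True if within tolerance)."""
    deviation = 100.0 * (measured_MBq - calibrated_MBq) / calibrated_MBq
    return round(deviation, 1), abs(deviation) <= tolerance_pct

# A vial ordered at 740 MBq but measured at 703 MBq on reception:
print(activity_deviation(calibrated_MBq=740.0, measured_MBq=703.0))
```

A production system would additionally decay-correct the measured activity to the calibration time before comparing; that correction is omitted here for brevity.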

Relevance:

10.00%

Publisher:

Abstract:

Humans use their grammatical knowledge in more than one way. On one hand, they use it to understand what others say. On the other hand, they use it to say what they want to convey to others (or to themselves). In either case, they need to assemble the structure of sentences in a systematic fashion, in accordance with the grammar of their language. Despite the fact that the structures that comprehenders and speakers assemble are systematic in an identical fashion (i.e., obey the same grammatical constraints), the two 'modes' of assembling sentence structures might or might not be performed by the same cognitive mechanisms. Currently, the field of psycholinguistics implicitly adopts the position that they are supported by different cognitive mechanisms, as evident from the fact that most psycholinguistic models seek to explain either comprehension or production phenomena. The potential existence of two independent cognitive systems underlying linguistic performance doubles the problem of linking the theory of linguistic knowledge and the theory of linguistic performance, making the integration of linguistics and psycholinguistics harder. This thesis thus aims to unify the structure building system in comprehension, i.e., the parser, and the structure building system in production, i.e., the generator, into one, so that the linking theory between knowledge and performance can also be unified into one. I will discuss and unify both existing and new data pertaining to how structures are assembled in understanding and speaking, and attempt to show that the unification between parsing and generation is at least a plausible research enterprise. In Chapter 1, I will discuss the previous and current views on how parsing and generation are related to each other. I will outline the challenges for the current view that the parser and the generator are the same cognitive mechanism. This single system view is discussed and evaluated in the rest of the chapters.
In Chapter 2, I will present new experimental evidence suggesting that the grain size of the pre-compiled structural units (henceforth simply structural units) is rather small, contrary to some models of sentence production. In particular, I will show that the internal structure of the verb phrase in a ditransitive sentence (e.g., The chef is donating the book to the monk) is not specified at the onset of speech, but is specified before the first internal argument (the book) needs to be uttered. I will also show that this timing of structural processes with respect to the verb phrase structure is earlier than the lexical processes of verb internal arguments. These two results in concert show that the size of structure building units in sentence production is rather small, contrary to some models of sentence production, yet structural processes still precede lexical processes. I argue that this view of generation resembles the widely accepted model of parsing that utilizes both top-down and bottom-up structure building procedures. In Chapter 3, I will present new experimental evidence suggesting that the structural representation strongly constrains the subsequent lexical processes. In particular, I will show that conceptually similar lexical items interfere with each other only when they share the same syntactic category in sentence production. The mechanism that I call syntactic gating, will be proposed, and this mechanism characterizes how the structural and lexical processes interact in generation. I will present two Event Related Potential (ERP) experiments that show that the lexical retrieval in (predictive) comprehension is also constrained by syntactic categories. I will argue that the syntactic gating mechanism is operative both in parsing and generation, and that the interaction between structural and lexical processes in both parsing and generation can be characterized in the same fashion. 
In Chapter 4, I will present a series of experiments examining the timing at which verbs’ lexical representations are planned in sentence production. It will be shown that verbs are planned before the articulation of their internal arguments, regardless of the target language (Japanese or English) and regardless of the sentence type (active object-initial sentence in Japanese, passive sentences in English, and unaccusative sentences in English). I will discuss how this result sheds light on the notion of incrementality in generation. In Chapter 5, I will synthesize the experimental findings presented in this thesis and in previous research to address the challenges to the single system view I outlined in Chapter 1. I will then conclude by presenting a preliminary single system model that can potentially capture both the key sentence comprehension and sentence production data without assuming distinct mechanisms for each.

Relevance:

10.00%

Publisher:

Abstract:

The performance, energy efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects regarding size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone, and must involve improvements to design tools and development of new disruptive technologies such as 3D integration. 3D integration presents potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and facilitates transistor density scaling independent of technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously-integrated high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory and communication walls. However, the promising improvements to performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investments to develop the technology and in the increased complexity of design. Two main limitations of 3D IC technology have been heat removal and TSV reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical through-silicon vias (TSVs) that create new points of failure in the chip and require the development of new BEOL technologies.
Although these issues can be controlled to some extent using thermal-reliability aware physical and architectural 3D design techniques, high performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs. A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impacts on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and higher degree of connectivity between components exacerbates the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e. power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally we propose an approach for design space exploration and modeling within the new Co-Design paradigm, and discuss the possible avenues for improvement of this work in the future.
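The core of the co-design argument above is that optimizing each domain in isolation misses cross-domain interactions: a configuration that wins on performance may be infeasible thermally. As a toy illustration only (all configurations, power, temperature and performance numbers are invented), a co-design selection step can be sketched as filtering the design space by a thermal constraint and then optimizing a cross-domain metric:

```python
# Toy co-design selection: filter a small 3D-CPU design space by a thermal
# constraint, then pick the best performance-per-watt point. All numbers
# (configurations, perf, power, temperature) are invented for illustration.

def co_design_select(configs, t_max_C):
    """Best feasible configuration under a peak-temperature constraint."""
    feasible = [c for c in configs if c["temp_C"] <= t_max_C]
    # Optimize across domains jointly rather than per-domain in isolation.
    return max(feasible, key=lambda c: c["perf"] / c["power_W"], default=None)

design_space = [
    {"name": "2-layer, air", "perf": 1.0, "power_W": 10.0, "temp_C": 85.0},
    {"name": "4-layer, air", "perf": 1.8, "power_W": 14.0, "temp_C": 112.0},
    {"name": "4-layer, MF",  "perf": 1.7, "power_W": 16.0, "temp_C": 78.0},
]
best = co_design_select(design_space, t_max_C=90.0)
print(best["name"])
```

Note how the nominally fastest point ("4-layer, air") is excluded by the thermal constraint, so the micro-fluidically cooled stack wins: exactly the kind of feasibility-region interaction that a per-domain optimization would miss.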


Relevance:

10.00%

Publisher:

Abstract:

With interest we read the article by Khosroshahi et al. about a novel method for quantification of left ventricular hypertrabeculation/noncompaction (LVHT) using two-dimensional echocardiography in children (1). We appreciate their efforts to contribute to an improvement and unification of echocardiographic diagnostic criteria for LVHT, which is urgently needed. Concerning their proposed method, we have the following questions and concerns: