820 results for Aleph Analytics
Abstract:
In recent decades there has been renewed interest in the early Argentine reception of Antonio Gramsci's analytical categories. Sharing that interest, this paper analyses the innovations introduced by the Gramscian reading of the University Reform proposed by Juan Carlos Portantiero in the 1970s. To that end, the paper begins by situating the theses of Estudiantes y política en América Latina. El proceso de la reforma universitaria (1918-1938) within the framework of interpretations of the Reform; it then lays out the political-intellectual project in which Portantiero was taking part when he developed his reading, and finally concentrates on the author's main theses.
Abstract:
Both the model of the intellectual observed here and the aesthetics alluded to correspond to, or had their heyday in, an apparently earlier period. In 1942, the Comisión Nacional de Cultura awarded the Premio Nacional de Literatura to Eduardo Acevedo Díaz for Cancha larga. The event had great resonance, so much so that issue 94 of the journal Sur was devoted to making amends to Borges for the failure to award the prize to El jardín de senderos que se bifurcan. Shortly afterwards, an unsigned article entitled "Los premios nacionales de literatura" appeared in issue 76 of Nosotros. Segunda Época, justifying the decision. In 1945, Borges published "El Aleph" in Sur, where he parodies the situation under the heading "Postdata del primero de marzo de 1943". Analysis of this polemic sheds light on a crucial moment in which the intellectual field and the canon of Argentine literature began to be restructured: nationalism was displaced from its hegemonic position, and "national writing" began to be conceived in a less mimetic way. In this way, regionalist nativism, tied to that tendency, came to occupy a very marginal place in literary history, close to oblivion, despite the important role it played in the first decades of the twentieth century.
Abstract:
The Tara Oceans Expedition (2009-2013) sampled the world's oceans on board a 36 m long schooner, collecting environmental data and organisms from viruses to planktonic metazoans for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data set provides continuous measurements of the partial pressure of carbon dioxide (pCO2), using a ProOceanus CO2-Pro instrument mounted on the flow-through system. This automatic sensor is fitted with an equilibrator made of a gas-permeable silicone membrane and an internal detection loop with the non-dispersive infrared detector of a PPSystems SBA-4 CO2 analyzer. A zero-CO2 baseline for the subsequent measurements is provided by circulating the internal gas through a CO2 absorption chamber containing soda lime or Ascarite. The frequency of this automatic zero-point calibration was set to 24 hours. All data recorded during zeroing, together with the data from the 15 minutes following each calibration, were discarded. The output of the CO2-Pro is the mole fraction of CO2 in the measured water, and pCO2 is obtained using the measured total pressure of the internal wet gas. The fugacity of CO2 (fCO2) in the surface seawater, whose difference from the atmospheric CO2 fugacity is proportional to the air-sea CO2 flux, is obtained by correcting pCO2 for non-ideal CO2 gas behaviour according to Weiss (1974). The fCO2 computed from the CO2-Pro measurements was corrected to sea surface conditions by accounting for the temperature effect on fCO2 (Takahashi et al., 1993). The surface seawater observations, initially acquired at a 15-second frequency, were averaged over 5-minute cycles. The performance of the CO2-Pro was adjusted by comparing the sensor outputs against thermodynamic carbonate calculations of pCO2 using the carbonic acid system constants of Millero et al. (2006) from determinations of total inorganic carbon (CT) and total alkalinity (AT) in discrete samples collected at the sea surface. AT was determined using automated open-cell potentiometric titration (Haraldsson et al., 1997). CT was determined with automated coulometric titration (Johnson et al., 1985; 1987), using the MIDSOMMA system (Mintrop, 2005). fCO2 data are flagged according to the WOCE guidelines following Pierrot et al. (2009), identifying recommended values and questionable measurements, with additional information given about the reasons why values are questionable.
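As a rough numerical sketch of the corrections described above (not the expedition's actual processing code), the following Python snippet converts a measured CO2 mole fraction to pCO2, applies a simplified fugacity correction in the spirit of Weiss (1974) using the commonly quoted virial-coefficient polynomials, and normalizes fCO2 from the equilibrator temperature to sea surface temperature with the Takahashi et al. (1993) temperature coefficient; all sample values are invented.

import math

def fco2_at_sst(x_co2_ppm, p_total_atm, t_equ_c, sst_c):
    """Convert a measured CO2 mole fraction to fCO2 at sea surface temperature.

    x_co2_ppm   : CO2 mole fraction in the internal wet gas [ppm]
    p_total_atm : total pressure of the internal wet gas [atm]
    t_equ_c     : temperature at the equilibrator/sensor [deg C]
    sst_c       : sea surface temperature [deg C]
    """
    t_k = t_equ_c + 273.15
    p_co2 = x_co2_ppm * 1e-6 * p_total_atm            # pCO2 [atm]

    # Simplified fugacity correction (Weiss, 1974); virial coefficients in cm3/mol
    b = -1636.75 + 12.0408 * t_k - 3.27957e-2 * t_k**2 + 3.16528e-5 * t_k**3
    delta = 57.7 - 0.118 * t_k
    r = 82.0578                                        # gas constant [cm3 atm / (mol K)]
    f_co2 = p_co2 * math.exp((b + 2.0 * delta) * p_total_atm / (r * t_k))

    # Normalization to sea surface temperature (Takahashi et al., 1993)
    f_co2_sst = f_co2 * math.exp(0.0423 * (sst_c - t_equ_c))
    return f_co2_sst * 1e6                             # [micro-atm]

# Example with invented numbers: ~400 micro-atm at the sea surface
print(round(fco2_at_sst(x_co2_ppm=410.0, p_total_atm=1.0, t_equ_c=20.5, sst_c=20.0), 1))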
Abstract:
Alexander von Humboldt published around 800 papers, articles, and essays across numerous fields of knowledge and in various languages. The distribution of their places of publication reflects the reach of his travels and the global perspective of his research. The variety of his co-authors and collaborators mirrors his multidisciplinarity. Humboldt's extensive journalistic oeuvre documents his international significance as a scientist, travel writer, and cultural mediator. The first complete edition of this Humboldtian oeuvre is being prepared at the University of Bern and is due to appear in 2019, for the 250th anniversary of the author's birth. Its aim is the systematization, documentation, and scholarly indexing of the corpus, in a book edition with text and apparatus volumes and in a digital edition with computer-philological tools. Keywords: Alexander von Humboldt (1769–1859), papers and essays, international journalism, history of science, scholarly editing.
Abstract:
This paper describes the potential impact of social media and new technologies in secondary education. The case study was designed for the drama and theatre subject. A wide set of tools, such as social networks, blogs, the internet, multimedia content, the local press, and other promotional tools, is used to increase students' motivation. The experiment was carried out at the high school IES Al-Satt, located in Algete in the Comunidad de Madrid. The students in the theatre group had a low academic level: 80% of them had previously repeated at least one grade, and half of them came from programmes for students with learning difficulties and were at risk of social exclusion. This action is supported by higher and secondary education professors and teachers who seek to introduce networked media technologies as new tools to improve students' academic results and degree of involvement. The results of the experiment were excellent, based on the satisfactory opinions obtained from a survey answered by students at the end of the course, and also revealed by the analytics taken from the different social networks. This project is a pioneer in the introduction and use of new technologies in secondary schools in Spain.
Abstract:
Fission product yields are fundamental parameters for several nuclear engineering calculations and in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices in order to produce inventory calculation results that take the complete uncertainty data into account. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties with the original data stored in the JEFF-3.1.2 library. We then focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235U. Calculations were carried out using different codes (ACAB and ALEPH-2) after introducing the new covariance values, and the results were compared with those obtained using the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of the decay heat.
Introduction. Nowadays, any engineering calculation performed in the nuclear field should be accompanied by an uncertainty analysis, in which different sources of uncertainty are taken into account. Works such as those performed under the UAM project (Ivanov et al., 2013) treat nuclear data as a source of uncertainty, in particular cross-section data, for which uncertainties in the form of covariance matrices are already provided in the major nuclear data libraries. Meanwhile, fission yield uncertainties were often neglected or treated superficially, because their effects were considered of second order compared to cross-sections (Garcia-Herranz et al., 2010). However, the Working Party on International Nuclear Data Evaluation Co-operation (WPEC)
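As a minimal sketch of the Monte Carlo propagation step mentioned above (not the actual ACAB/ALEPH-2 workflow, and not JEFF-3.1.2 data), the following Python snippet samples correlated fission yields from a multivariate normal defined by a hypothetical covariance matrix and propagates each sample through a toy decay-heat summation; all numbers are invented for illustration.

import numpy as np

# Hypothetical mean independent yields and covariance matrix (3 nuclides only)
mean_yields = np.array([0.031, 0.058, 0.062])
cov = np.array([[ 2.0e-6,  5.0e-7, -3.0e-7],
                [ 5.0e-7,  4.0e-6,  1.0e-7],
                [-3.0e-7,  1.0e-7,  3.0e-6]])

decay_const = np.array([1.2e-3, 4.5e-4, 8.0e-5])   # toy decay constants [1/s]
energy_per_decay = np.array([0.8, 1.5, 0.6])       # toy mean energies per decay [MeV]

def decay_heat(yields, t):
    """Toy decay-heat summation at cooling time t for one yield sample."""
    activity = yields * decay_const * np.exp(-decay_const * t)
    return np.sum(activity * energy_per_decay)

rng = np.random.default_rng(42)
samples = rng.multivariate_normal(mean_yields, cov, size=10000)   # correlated sampling
heats = np.array([decay_heat(s, t=100.0) for s in samples])

print("mean decay heat :", heats.mean())
print("relative std dev:", heats.std() / heats.mean())

Repeating the calculation with the off-diagonal covariance terms set to zero shows how the yield correlations change the spread of the decay-heat distribution, which is the effect the abstract refers to.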
Abstract:
Power generation based on small hydraulic turbines has often been a model example within the field of renewable energy since its emergence at the end of the nineteenth century, although its precursors appeared much earlier. Among the oldest examples are the saqia, or Persian wheel, and the Roman water wheel, which had previously been implemented in the Far East and later reached Europe via Egypt. Later, during the Middle Ages and the Renaissance, the use of water mills, as well as windmills, became widespread; examples include the norias of Aleppo (Syria) and Córdoba (Spain). Another interesting case is that of the regolfo mills of the Iberian Peninsula and Ibero-America, very close in form and principle to modern hydraulic turbines. Some of these devices are still operating on Spanish rivers. Subsequently, the studies of Euler, Burdin, and Fourneyron laid the groundwork for the decisive advances of Pelton, Kaplan, Francis, and others, now as prime movers for electrical generators. Today, plants with an installed capacity of up to 5,000 kW are generally classed as mini-hydro, while installations below 100 kW are referred to as micro-hydro, although in Latin America this threshold is set at 20 kW. The study of the water resource, supported by geotechnical studies, forms the basis on which a hydroelectric development can be designed: selection of the plant type from the available typology, turbine design and calculations, the necessary civil works (weir, dam, canal, penstock, powerhouse, draft tube, tailrace, etc.), and the electromechanical equipment. The engineering project is complemented by an environmental impact study and an economic feasibility study. Many of these projects aim to reduce the lack of access to energy in disadvantaged communities, a deprivation understood as a determining factor of poverty. Mini- and micro-hydro power thus acquires new value as a technology for human development.
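As a minimal worked sketch of how the installed power relates to the mini/micro classification thresholds mentioned above, the standard hydraulic power formula P = η·ρ·g·Q·H can be evaluated as follows; the flow, head, and efficiency values are invented and do not refer to any specific project.

# Rough sizing sketch with the hydraulic power formula P = eta * rho * g * Q * H
eta = 0.85      # assumed overall turbine + generator efficiency
rho = 1000.0    # water density [kg/m^3]
g = 9.81        # gravitational acceleration [m/s^2]
Q = 0.5         # assumed design flow [m^3/s]
H = 20.0        # assumed net head [m]

P = eta * rho * g * Q * H            # power in watts
print(f"Installed power: {P/1000:.1f} kW")   # ~83 kW -> micro-hydro (< 100 kW)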
Abstract:
Secure access to patient data is becoming of increasing importance, as medical informatics grows in significance, to both assist with population health studies, and patient specific medicine in support of treatment. However, assembling the many different types of data emanating from the clinic is in itself a difficulty, and doing so across national borders compounds the problem. In this paper we present our solution: an easy to use distributed informatics platform embedding a state of the art data warehouse incorporating a secure pseudonymisation system protecting access to personal healthcare data. Using this system, a whole range of patient derived data, from genomics to imaging to clinical records, can be assembled and linked, and then connected with analytics tools that help us to understand the data. Research performed in this environment will have immediate clinical impact for personalised patient healthcare.
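The abstract does not describe the pseudonymisation mechanism itself; as a minimal sketch of one common approach, keyed hashing of patient identifiers so that records can be linked without exposing identities, the following Python snippet uses an HMAC over the identifier. The function name, identifier format, and key handling are purely illustrative assumptions, not the platform's actual design.

import hmac, hashlib

def pseudonymise(patient_id: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from a patient identifier using HMAC-SHA256.

    The same patient_id always maps to the same pseudonym, so records from
    different sources (genomics, imaging, clinical) can still be linked,
    while the original identifier cannot be recovered without the secret key.
    """
    return hmac.new(secret_key, patient_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative usage; in practice the key would be held by a trusted party
key = b"replace-with-secret-key-held-by-trusted-party"
print(pseudonymise("patient-12345", key))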
Abstract:
Video analytics plays a critical role in most recent traffic monitoring and driver assistance systems. In this context, the correct detection and classification of surrounding vehicles through image analysis has been the focus of extensive research in recent years. Most of the work reported on image-based vehicle verification makes use of supervised classification approaches and resorts to techniques such as histograms of oriented gradients (HOG), principal component analysis (PCA), and Gabor filters, among others. Unfortunately, existing approaches are lacking in two respects: first, comparison between methods using a common body of work has not been addressed; second, no study of the combination potential of popular features for vehicle classification has been reported. In this study the performance of the different techniques is first reviewed and compared using a common public database. Then, the combination capabilities of these techniques are explored and a methodology is presented for the fusion of classifiers built upon them, also taking the vehicle pose into account. The study reveals the limitations of single-feature-based classification and makes clear that fusion of classifiers is highly beneficial for vehicle verification.
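As an illustrative sketch of single-feature classifiers and simple score-level fusion (not the paper's exact pipeline, pose handling, or database), the following Python snippet trains one SVM on HOG features and another on PCA-reduced pixels, then averages their class probabilities; synthetic random patches stand in for real vehicle/non-vehicle images.

import numpy as np
from skimage.feature import hog
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
images = rng.random((200, 64, 64))          # placeholder 64x64 grayscale patches
labels = rng.integers(0, 2, size=200)       # placeholder vehicle / non-vehicle labels

# Feature channel 1: histograms of oriented gradients (HOG)
X_hog = np.array([hog(im, orientations=9, pixels_per_cell=(8, 8),
                      cells_per_block=(2, 2)) for im in images])

# Feature channel 2: PCA over raw pixels
X_pca = PCA(n_components=30).fit_transform(images.reshape(len(images), -1))

idx_train, idx_test = train_test_split(np.arange(len(images)),
                                       test_size=0.3, random_state=0)

clf_hog = SVC(probability=True).fit(X_hog[idx_train], labels[idx_train])
clf_pca = SVC(probability=True).fit(X_pca[idx_train], labels[idx_train])

# Score-level fusion: average the per-class probabilities of both classifiers
p_fused = (clf_hog.predict_proba(X_hog[idx_test]) +
           clf_pca.predict_proba(X_pca[idx_test])) / 2.0
pred = p_fused.argmax(axis=1)
print("fused accuracy on toy data:", (pred == labels[idx_test]).mean())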
Abstract:
The complex event processing (CEP) paradigm addresses the challenge of analysing large volumes of data in real time, for example stock market monitoring or road traffic monitoring. In this paradigm, incoming events must be processed without being stored, because the data volume is too high and low latency is required. Highly scalable distributed systems with high throughput and low latency are used for this purpose. Such systems are usually complex, and the time required to learn to use them is considerable; moreover, many of them lack a declarative query language in which to express the computation to be performed over incoming events. In this work, an SQL-like declarative query language and a compiler that translates it into the native language of the distributed event-processing system have been developed. Because the language is similar to SQL, with which a large number of developers are already familiar, learning it requires little effort. Its use thus reduces execution failures in queries deployed on the distributed system, while abstracting the programmer from the details of that system.
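The abstract does not show the query language's syntax or the target system's native API; the following is a purely hypothetical illustration of the idea, with a made-up SQL-like continuous query shown as a comment and a minimal Python sketch of the filter-window-aggregate dataflow such a compiler might emit. All names and the query syntax are invented.

from collections import deque

# Hypothetical SQL-like continuous query a compiler of this kind might accept:
#   SELECT symbol, AVG(price)
#   FROM StockTicks [RANGE 10 EVENTS]
#   WHERE symbol = 'ACME'
#   GROUP BY symbol
# A compiler would typically translate it into a filter -> window -> aggregate
# dataflow expressed in the distributed system's native operators.

def avg_price_over_window(events, symbol="ACME", window_size=10):
    """Process events on the fly: filter by symbol, keep a sliding window of
    the last window_size matching events, and emit a running average."""
    window = deque(maxlen=window_size)
    for event in events:                      # events arrive one at a time
        if event["symbol"] != symbol:         # filter operator
            continue
        window.append(event["price"])         # window operator
        yield sum(window) / len(window)       # aggregate operator

# Toy usage with an in-memory stream:
ticks = [{"symbol": "ACME", "price": p} for p in (10, 11, 12, 13)]
print(list(avg_price_over_window(ticks)))     # [10.0, 10.5, 11.0, 11.5]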