368 results for Apache Cordova
Abstract:
The Chachá project deals with the recovery of obsolete computer equipment for use as a pedagogical tool. The aim is not to fill the school with computers, but to find locations with suitable characteristics so that students can access the Internet or run common office applications. The equipment comes from various sources: some machines are donated by private individuals and others by public and private institutions and bodies. The problem then arises of installing an operating system on the machines. Free, open systems and programs are chosen, such as the Linux operating system. In this way, very low-cost Internet access points are obtained that meet both the school's needs and its budget. These new computers share folders with the rest of the network and run web browsers that can potentially become educational tools. The project also includes the deployment of an Apache server with PHP and MySQL hosting a Moodle content management system, which puts the school's own educational platform into operation. In this way, students learn computing concepts and procedures, as well as values such as teamwork, care and maintenance of the school's resources, the importance of recycling and reuse, and the determination to overcome economic difficulties through effort and perseverance in their work. The activities and methodology involve all the students in the process: obtaining the computers and transporting them to the school; physically refurbishing some of the machines; designing their new location within the school; fitting the classrooms with data and electrical networks; assembling the desks, also recycled; connecting the equipment; and configuring the operating system. Teaching and learning strategies such as reading comprehension, vocabulary acquisition and written expression are also reinforced, especially through the Internet: students take part in forums and write a virtual logbook, hosted on the school's server and written by the students themselves, recording all the events affecting the development of the project. Evaluation is present throughout the project, since it is experimental work and is therefore under constant review so that errors can be corrected. Thirteen annexes and a CD-ROM accompanying the report complete the information.
Abstract:
Abstract based on the one in the publication.
Abstract:
Development of a computer application hosted on a web server whose objectives included: installing an Apache web server with PHP and XML support; building PHP classes and interfaces with AJAX support; building an application on top of the generated classes to produce teaching programmes dynamically from a web page; and producing web-based class record books from a programme and a group of students entered into the system. The procedure unfolded in several phases: analysis of how the teachers at the school work, development of the modules, and use of the program by several teachers, who reported the difficulties that arose and their solutions. The information gathered was then used to produce the program's online help. The project was evaluated by means of a self-assessment test. The reported results were the creation of a program that has made teaching easier in a friendlier, more accessible way than earlier methods, with timely, detailed and up-to-date information and the immediate generation of reports.
Abstract:
In 2009 the Municipal Territorial Analysis Unit (UMAT) of the Ajuntament de Girona decided to add a set of free-software tools for publishing cartography to its existing territorial information system. A year later, more than half of its web mapping services are running on this new system. The new system, called Coloma, is built on a PostGIS database, the Mapnik rendering library, the Ogcserver WMS server, the OpenLayers JavaScript library, the Django framework, the Python programming language and the Apache web server with the WSGI module. Coloma includes a web administration tool to simplify configuration and day-to-day management. It contributes small improvements to the individual projects and adds value to the whole, yielding a map server built from successful projects, robust and with many possibilities. Coloma has proved reliable and its components keep improving day by day. Being free software has given us access to the knowledge, code and documentation available on the Internet for all of this software. In the coming months UMAT will release the Coloma project so that more people can benefit from the work done so far.
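To make the rendering layer of such a stack concrete, here is a minimal Python sketch (Python being the stack's own language) that draws a map image with Mapnik's Python bindings. The stylesheet file name, image size and output path are hypothetical placeholders, not details taken from the Coloma project.

    # Minimal sketch: render a map to a PNG with Mapnik's Python bindings.
    # "coloma_style.xml" is a hypothetical stylesheet name; a real deployment
    # would point at its own Mapnik XML style defining layers and symbolizers.
    import mapnik

    m = mapnik.Map(800, 600)                    # target image size in pixels
    mapnik.load_map(m, "coloma_style.xml")      # load layers/styles from XML
    m.zoom_all()                                # fit the view to the layers' extent
    mapnik.render_to_file(m, "map.png", "png")  # rasterize the map to a PNG file

A WMS front end such as Ogcserver essentially wraps this same rendering call behind HTTP request handling.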
Abstract:
The National Center for Information Technologies (Centro Nacional de Tecnologías de la Información), in collaboration with the Instituto Geográfico de Venezuela Simón Bolívar, is building the Spatial Data Infrastructure of Venezuela, whose first phase, carried out by the gvSIG Association, is now complete. This first phase consists of implementing the system architecture and building a geoportal that provides access to the country's reference cartography through WMS, WFS, WCS and CSW services. The whole system uses free technology. This first phase is complemented by an online training module to transfer the knowledge generated within the project itself. The main components used are: gvSIG as an advanced GIS/SDI client, PostgreSQL+PostGIS as the spatial database, MapServer and GeoServer as map servers, GeoNetwork as the catalogue server, OpenLayers+MapFish as the lightweight client (geoportal), Debian as the operating system, Apache as the web server, and Tomcat as the application server.
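As a concrete illustration of the WMS interface such a geoportal exposes, the following Python sketch issues a standard OGC WMS 1.1.1 GetMap request using only the standard library. The base URL and layer name are hypothetical; the query parameters themselves are the standard WMS ones.

    # Sketch of a standard WMS GetMap request against a (hypothetical) endpoint.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    BASE_URL = "http://geoportal.example.org/wms"  # hypothetical service URL
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "reference_cartography",         # hypothetical layer name
        "STYLES": "",                              # empty = default style (mandatory parameter)
        "SRS": "EPSG:4326",
        "BBOX": "-73.4,0.6,-59.8,12.2",            # approximate extent of Venezuela
        "WIDTH": "800",
        "HEIGHT": "600",
        "FORMAT": "image/png",
    }
    with urlopen(BASE_URL + "?" + urlencode(params)) as response:
        with open("map.png", "wb") as out:
            out.write(response.read())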
Abstract:
CartoCiudad is a database of road networks, urban cartography and census and postal information, generated from official-body data covering all of Spain, that supports address geocoding and the computation of routes and service areas. The Instituto Geográfico Nacional (IGN), together with the autonomous communities with which collaboration agreements have been signed, is responsible for integrating information from the Dirección General del Catastro, the Instituto Nacional de Estadística, the Sociedad Estatal de Correos y Telégrafos and the cartographic agencies of the autonomous communities with information from its own databases to build CartoCiudad. From the start of the project there has been a particular interest in using free Geographic Information System (GIS) software. This is demonstrated by the use of free, open-source tools to publish CartoCiudad's standard web services: WMS with GeoServer, WFS with Deegree, WPS with 52°North and WMS-C with TileCache, all deployed on Apache and Apache Tomcat servers. After several years using the same quality-control tools, and owing to a change in the data model in response to user needs, a new CartoCiudad editing and quality-control tool is being developed on top of gvSIG in collaboration with proDEVELOP. This amounts to an update of the CartoCiudad production methodology that takes advantage of the dynamism and flexibility of open-source solutions. Finally, the advantages and drawbacks of migrating the CartoCiudad data repository from an Oracle (Spatial) database to PostgreSQL (PostGIS) combined with the pgRouting module are being analysed, with the goal of retaining the same functionality that is available today.
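To illustrate what the WMS-C/TileCache part involves, the sketch below shows the standard tile-grid arithmetic a tile cache performs: mapping a coordinate to the tile that covers it, given the grid origin, the resolution and the tile size. The grid values used in the example are common global geodetic defaults, not CartoCiudad's actual configuration.

    # Sketch of WMS-C tile addressing: which tile covers a given point?
    from math import floor

    def tile_index(x, y, origin_x, origin_y, resolution, tile_size=256):
        """Return (col, row) of the tile containing point (x, y).
        resolution is in map units per pixel, so one tile spans
        resolution * tile_size map units."""
        span = resolution * tile_size
        return floor((x - origin_x) / span), floor((y - origin_y) / span)

    # Example: level-5 resolution on a global geodetic grid anchored at
    # (-180, -90), halving the level-0 resolution of 0.703125 deg/pixel 5 times.
    res_z5 = 0.703125 / 2 ** 5
    print(tile_index(-3.7, 40.4, -180.0, -90.0, res_z5))  # -> (31, 23)

The cache turns such an index into a stored image file and only falls back to the underlying WMS server on a miss.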
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
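The abstract specifies only that G-Rex is REST-styled and usable from any basic HTTP client; it does not document resource paths. The following Python sketch therefore uses invented endpoint and status names purely to illustrate the submit-then-poll interaction pattern described above; the real client is the GRexRun command-line program.

    # Hypothetical sketch of a REST-style submit-and-poll client in the spirit
    # of G-Rex. All URLs, paths and response formats below are invented for
    # illustration and are not G-Rex's actual API.
    import time
    from urllib.request import Request, urlopen

    SERVICE = "http://cluster.example.ac.uk/grex/services/nemo"  # hypothetical

    def submit_run(input_archive: bytes) -> str:
        """Create a new run instance; return the instance URL."""
        req = Request(SERVICE + "/instances", data=input_archive, method="POST")
        with urlopen(req) as resp:
            return resp.headers["Location"]

    def wait_for_run(instance_url: str, poll_seconds: float = 30.0) -> None:
        """Poll until the run reports completion. The real middleware also
        streams output files back (and deletes them remotely) during the run,
        which is what keeps output from accumulating on the remote system."""
        while True:
            with urlopen(instance_url + "/status") as resp:
                if resp.read().decode().strip() == "FINISHED":
                    return
            time.sleep(poll_seconds)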
Abstract:
New reconstructions of changing vegetation patterns in the Mediterranean-Black Sea Corridor since the Last Glacial Maximum are being produced by an improved biomisation scheme that uses pollen and plant macrofossil data in conjunction. Changes in fire regimes over the same interval will also be reconstructed using both microscopic and macroscopic charcoal remains. These reconstructions will allow a diagnosis of the interactions between climate, fire and vegetation on millennial timescales, and will also help to clarify the role of coastline and other geomorphic changes, salinity, and the impacts of human activities in this region. These new data sets are being produced through collaboration between the Palynology Working Group (WG-2) of the IGCP-521 project and the international Palaeovegetation Mapping Project (BIOME 6000). The main objective of this paper is to present the goals of this cooperation and its methodology, including limitations and planned improvements, and to show the initial results of some applications.
Abstract:
Background—A major problem in procurement of donor hearts is the limited time a donor heart remains viable. After cardiectomy, ischemic hypoxia is the main cause of donor heart degradation. The global myocardial ischemia causes a cascade of oxygen radical formation that culminates in an elevation in hydrogen ions (decrease in pH), irreversible cellular injury, and potential microvascular changes in perfusion. Objective—To determine the effects of prolonged storage times on donor heart microvasculature and the effects of intermittent antegrade perfusion. Materials and Methods—Using porcine hearts flushed with a Ribosol-based cardioplegic solution, we examined how storage time affects microvascular myocardial perfusion by using contrast-enhanced magnetic resonance imaging at a mean (SD) of 6.1 (0.6) hours (n=13) or 15.6 (0.6) hours (n=11) after cardiectomy. Finally, to determine whether administration of cardioplegic solution affects pH and microvascular perfusion, isolated hearts (group 1, n=9) given a single antegrade dose were compared with hearts (group 2, n=8) given intermittent antegrade cardioplegia (150 mL every 30 min, 150 mL/min) by a heart preservation device. Khuri pH probes in left and right ventricular tissue continuously measured hydrogen ion levels, and perfusion intensity on magnetic resonance images was plotted against time. Results—Myocardial perfusion measured via magnetic resonance imaging at 6.1 hours was significantly greater than at 15.6 hours (67% vs 30%, P = .00008). In group 1 hearts, the mean (SD) pH at the end of 6 hours had decreased to 6.2 (0.2). In group 2 hearts, which received intermittent antegrade cardioplegia, pH at the end of 6 hours was higher, at 6.7 (0.3) (P = .0005). Magnetic resonance imaging showed no significant differences between the 2 groups in contrast enhancement (group 1, 62%; group 2, 40%) or in the wet/dry weight ratio. Conclusion—Intermittent perfusion maintains a significantly higher myocardial pH than does a conventional single antegrade dose. This difference may translate into an improved quality of donor hearts procured for transplantation, allowing longer-distance procurement, tissue matching, improved outcomes for transplant recipients, and ideally a decrease in transplant-related costs.
Abstract:
Fire activity has varied globally and continuously since the last glacial maximum (LGM) in response to long-term changes in global climate and shorter-term regional changes in climate, vegetation, and human land use. We have synthesized sedimentary charcoal records of biomass burning since the LGM and present global maps showing changes in fire activity for time slices during the past 21,000 years (as differences in charcoal accumulation values compared to pre-industrial). There is strong broad-scale coherence in fire activity after the LGM, but spatial heterogeneity in the signals increases thereafter. In North America, Europe and southern South America, charcoal records indicate less-than-present fire activity during the deglacial period, from 21,000 to ∼11,000 cal yr BP. In contrast, the tropical latitudes of South America and Africa show greater-than-present fire activity from ∼19,000 to ∼17,000 cal yr BP and most sites from Indochina and Australia show greater-than-present fire activity from 16,000 to ∼13,000 cal yr BP. Many sites indicate greater-than-present or near-present activity during the Holocene with the exception of eastern North America and eastern Asia from 8,000 to ∼3,000 cal yr BP, Indonesia and Australia from 11,000 to 4,000 cal yr BP, and southern South America from 6,000 to 3,000 cal yr BP where fire activity was less than present. Regional coherence in the patterns of change in fire activity was evident throughout the post-glacial period. These complex patterns can largely be explained in terms of large-scale climate controls modulated by local changes in vegetation and fuel load.
Abstract:
Background and objectives: There have been few studies investigating acute kidney injury (AKI) in patients infected with the 2009 pandemic influenza A (H1N1) virus. Therefore, the objective of this study was to identify the factors associated with AKI in H1N1-infected patients. Design, setting, participants, & measurements: This was a study of 47 consecutive critically ill adult patients with reverse transcriptase-PCR-confirmed H1N1 infection in Brazil. Outcome measures were AKI (as defined by the Risk, Injury, Failure, Loss, and End-stage renal failure [RIFLE] criteria) and in-hospital death. Results: AKI was identified in 25 (53%) of the 47 H1N1-infected patients. AKI was associated with vasopressor use, mechanical ventilation, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores, and severe acidosis as well as with higher levels of C-reactive protein and lactic dehydrogenase upon intensive care unit (ICU) admission. A nephrology consultation was requested for 16 patients (64%), and 8 (50%) required dialysis. At ICU admission, 7 (15%) of the 25 AKI patients had not yet progressed to AKI. However, by 72 hours after ICU admission, no difference in RIFLE score was found between AKI survivors and nonsurvivors. Of the 47 patients, 9 (19%) died, all with AKI. Mortality was associated with mechanical ventilation, vasopressor use, dialysis, high APACHE II score, high bilirubin levels, and a low RIFLE score at ICU admission. Conclusions: Among critically ill H1N1-infected patients, the incidence of AKI is high. In such patients, AKI is mainly attributable to shock. Clin J Am Soc Nephrol 5: 1916-1921, 2010. doi: 10.2215/CJN.00840110
Abstract:
Accurately estimating the energy expenditure of critically ill patients is increasingly important for planning adequate nutrition. It is well established that both underfeeding and overfeeding impair these patients' recovery, especially when they are under mechanical ventilation. The aim of the present study was to compare the total energy expenditure (TEE) of mechanically ventilated patients in controlled and assisted modes by means of indirect calorimetry, measured with the TEEM-100 and DATEX-OHMEDA gas monitors, to determine whether the caloric supply needs adjusting in each mode, and to correlate the results with the Harris-Benedict (H-B) equation. One hundred patients were studied; exhaled gases (CO2 and O2) were measured for 20 minutes in each ventilatory mode (controlled and assisted) and the energy expenditure was calculated with the Weir formula, determining the 24-hour TEE, which was compared with the TEE estimated by the H-B equation. The mean APACHE II score was 21.1 ± 8.3. The mean value estimated by the H-B equation was 1853.87 ± 488.67 kcal/24 h, taking activity and stress factors into account. The mean values obtained by indirect calorimetry (IC) were 1712.76 ± 491.95 kcal/24 h in the controlled mode and 1867.33 ± 542.67 kcal/24 h in the assisted mode. The mean TEE in the assisted mode was 10.71% higher than in the controlled mode (p<0.001). Comparing the mean TEE obtained by IC with the H-B equation adjusted for activity and stress factors showed that the equation overestimated by 141.10 kcal/24 h (8.2%) (p=0.012) in the controlled mode. With the activity factor removed, there was a non-significant tendency to underestimate by 44.28 kcal/24 h (2.6%) (p=0.399). In the assisted mode, compared with the IC measurement, the H-B equation without the activity factor underestimated by 198.84 kcal/24 h (10.71%) (p=0.001), and with the activity factor it also underestimated, but by 13.46 kcal/24 h (0.75%), without statistical significance (p=0.829). The differences observed with or without vasopressor use, with or without infection and sepsis as the reason for ICU admission, with parenteral or enteral feeding or no feeding, and across age groups were not statistically significant, and therefore did not influence energy expenditure between the ventilatory modes. Men ventilated in the controlled mode expended more energy than women (1838.08 vs. 1577.01; p=0.007). In conclusion, the data suggest that the 10% activity factor should be applied only during assisted mechanical ventilation, since this activity factor leads to overfeeding in the controlled mode.
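For orientation, the two formulas the study compares can be written out. These are the standard abbreviated Weir equation and the classic Harris-Benedict equations with their published coefficients, quoted here as a reference; the exact variants and the activity/stress factors applied are those of the study, which the abstract does not reproduce.

    % Abbreviated Weir equation: energy expenditure from gas exchange
    % (\dot{V}O_2 and \dot{V}CO_2 in L/min; 1440 minutes per day)
    \mathrm{TEE}\;[\mathrm{kcal}/24\,\mathrm{h}]
        = \bigl(3.941\,\dot{V}\mathrm{O}_2 + 1.106\,\dot{V}\mathrm{CO}_2\bigr) \times 1440

    % Harris-Benedict basal energy expenditure (weight W in kg, height H in cm,
    % age A in years), multiplied in practice by activity and stress factors:
    \mathrm{BEE}_{\mathrm{men}}   = 66.47 + 13.75\,W + 5.00\,H - 6.76\,A
    \mathrm{BEE}_{\mathrm{women}} = 655.10 + 9.56\,W + 1.85\,H - 4.68\,A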