935 results for face recognition, face detection, face verification, web application
Abstract:
Given our country's position in worldwide Uchuva (cape gooseberry) production, and the quality with which the product is obtained, a business opportunity has been identified to increase exports, expand the market, and raise the product's profile, bringing benefits and profits to the country and its producers. This project aims to propose a viable and successful export plan, evaluating the technical and economic factors that allow its development. However, although the Uchuva is important within the country's agricultural production, it is not exploited as it should be. The international market is very broad, and since the fruit is exotic and appealing for its properties, it attracts worldwide interest, allowing the market to expand and exports to grow. For this reason, the project proposes guidelines to make such exports viable and to bring the best possible benefits and returns to the fruit's producers. It also seeks to implement strategies for successful exports that allow the agricultural sector to increase its production and its development, generating economic growth and more jobs, two factors that foster development not only for the people who work with the product but for the country as a whole. All of this is aimed at a country with low agricultural production and a high level of imports, Japan, whose characteristics make it well suited to an export plan for a fresh, exotic fruit of excellent quality and reputation such as the Colombian Uchuva.
Abstract:
The research topic of this undergraduate monograph is an original study, never carried out before, that will analyse for the first time in a university setting the process under way in this country, and will show how a State with a tradition of conflict with its neighbours is able to neutralise that situation in favour of cooperation that serves its national interests. For this study it will therefore be necessary to specify the economic and political consequences, both positive and negative, that this alliance has triggered. It will also be necessary to analyse Pakistan's relationship with Islamic countries such as Iran, Saudi Arabia and Afghanistan, in order to assess how the alliance has been received by the Muslim world, and finally to establish, by way of conclusion, a comparison of costs and benefits that allows us to determine whether Pakistan has gained more than it has lost from this alliance, and whether this partnership, formed in 2001, is the engine of stability in the South Asian region. As already mentioned, this monograph deals with a little-studied topic that has not been examined before, which makes exploratory research necessary in order to become familiar with relatively unknown phenomena. For this reason, the bibliographic sources for this work will include electronic journals, newspaper articles, the internet, and official documents of various governments available on the web.
Abstract:
In this undergraduate thesis we study the usefulness of Prior Consultation (Consulta Previa) as a political instrument, based on an analysis of the process conducted in 2010 between the Santa Teresita del Tuparro indigenous reservation and the oil company BHP Billiton in the municipality of Cumaribo, Vichada, identifying whether there are strengths and good practices that help balance the general interest with that of minorities, to the benefit of governability in the Colombian context. To meet this objective, we analyse the different roles of the actors involved in this case study through the concepts of legitimacy and representativeness, and examine whether Prior Consultation, when properly conducted through the application of principles such as good faith, transparency, assertive communication, and mutual recognition among the actors, can be an effective tool within participatory democracy.
Abstract:
The implementation of molecular biology methods such as the polymerase chain reaction (PCR) has made sensitive and specific diagnosis possible for many diseases, among which infectious diseases are of particular interest. To date, identification methods have relied mainly on culture and serology because of their sensitivity and specificity, but these are time-consuming and expensive. Urine samples have become a non-invasive alternative source of DNA for molecular biology analyses. Methodology: a strategy was implemented to obtain DNA from urine samples. Samples were taken from daycare children in order to document the presence or absence of PCR inhibitors through the amplification of human cytomegalovirus (HCMV) genes. Results: 27.1% of the samples analysed showed specific amplification for HCMV; no significant differences in the presence of the virus were found among the three strata, but there were differences in band intensity. Conclusion: the absence of PCR inhibitors was verified by amplification of the β-globin gene. A molecular method for the identification of HCMV was standardised, which can be applied
Abstract:
The first national digital teaching archive of breast-specific pathology in the mammography modality was assembled. It will allow radiologists and radiology residents to be trained according to the BI-RADS reading system, seeking to unify criteria and improve image-interpretation skills, with the aim of increasing early detection of breast carcinoma.
Abstract:
The main contribution of this work is the study of how to represent tourist information so that it can be easily interpreted by route-planning programs. This will make it possible to implement assistants that build tourist routes according to the user's preferences and provide information about points of interest transparently, regardless of their format or location on the web.
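To make the idea concrete, a minimal sketch of what "easily interpretable by route-planning programs" could look like: points of interest carried as structured records that a planner can filter by user preference. The field names and sample data are invented for illustration, not taken from the thesis.

```python
import json
from dataclasses import dataclass, asdict, field

# Hypothetical machine-readable representation of a point of interest (POI).
@dataclass
class PointOfInterest:
    name: str
    lat: float
    lon: float
    categories: list = field(default_factory=list)  # e.g. ["museum", "art"]

def matching_pois(pois, preferences):
    """Return the POIs whose categories overlap the user's preferences."""
    return [p for p in pois if set(p.categories) & set(preferences)]

pois = [
    PointOfInterest("Museu Nacional", 41.37, 2.15, ["museum", "art"]),
    PointOfInterest("Parc de la Ciutadella", 41.39, 2.19, ["park", "nature"]),
]

# A route assistant would first select POIs matching the user's tastes,
# then order them into a route; serialising as JSON keeps the data
# independent of its original format or web location.
selected = matching_pois(pois, {"art"})
print(json.dumps([asdict(p) for p in selected], indent=2))
```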
Abstract:
Excerpt of the news items about defended theses published on the web over the course of 2011.
Abstract:
"Student's Watcher" is a small Web application intended to show the evolution of students in a visual, simple and fast way. The main project table displays data such as marks and comments about students; a comment can be attached to each mark to explain it. The objective is to be able to tell whether a student has a problem, how his or her year is going, what marks he or she has in other courses, or even whether he or she has had a bad week across several subjects. The evolution of students in past years can also be reviewed for an objective comparison. The application further allows global comments on a student to be inserted; these are kept in a list to which all professors can add new entries, giving a place for more general assessments. "Student's Watcher" was begun in ASP.NET, but the project was finally developed in PHP, HTML and CSS. The project is also intended as a comparison between two of the most widely used languages today, ASPX and PHP.
Abstract:
G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command-line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows for new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy to install and maintain by system administrators. G-Rex is employed as the middleware for the NERC1 Cluster Grid, a small grid of HPC2 clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al, 2008) ocean models are installed, and there are plans to install the Hadley Centre’s HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al, 2008), which aims to simulate the world’s coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows :(1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid’s Ganglia3 monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with “mpirun”) are simply replaced with calls to "GRexRun" (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. 
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
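Step (3) of the workflow above amounts to a one-token substitution in an existing shell script: the local launch command is swapped for the G-Rex client. A minimal sketch of that substitution; the workflow script text here is invented for illustration, not taken from the NEMO deployment.

```python
# Sketch of adapting an existing model workflow script for G-Rex:
# the only change is replacing the local launch command ("mpirun")
# with the G-Rex command-line client ("GRexRun").
def adapt_for_grex(script: str, local_cmd: str = "mpirun",
                   grex_cmd: str = "GRexRun") -> str:
    return "\n".join(
        line.replace(local_cmd, grex_cmd, 1)
        if line.lstrip().startswith(local_cmd) else line
        for line in script.splitlines()
    )

# Invented example workflow: pre-process, run the model, post-process.
workflow = """#!/bin/sh
./preprocess_inputs.sh
mpirun -np 40 ./nemo.exe
./postprocess_outputs.sh"""

print(adapt_for_grex(workflow))
```

The rest of the script is untouched, which is what lets G-Rex slot into existing pre- and post-processing stages without reworking them.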
Abstract:
Web Services for Remote Portlets (WSRP) is gaining attention among portal developers and vendors as a way to enable easy development, increased richness in functionality, pluggability, and flexibility of deployment. Whilst they do not currently support all WSRP functionality, open-source portal frameworks could in future use WSRP Consumers to access remote portlets found through a WSRP Producer registry service. This implies that we need a central registry for the remote portlets and a more expressive WSRP Consumer interface to implement the remote portlet functions. This paper reports on an investigation into a new system architecture, which includes a Web Services repository, registry, and client interface. The Web Services repository holds portlets as remote resource producers. A new data structure for describing remote portlets is defined and published by populating a Universal Description, Discovery and Integration (UDDI) registry. A remote portlet publish and search engine for UDDI has also been developed. Finally, a remote portlet client interface was developed as a Web application. The client interface supports remote portlet features, as well as window status and mode functions. Copyright (c) 2007 John Wiley & Sons, Ltd.
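The publish-then-search cycle described above can be illustrated with a toy in-memory registry: producers publish portlet metadata, and consumers look portlets up by keyword. A real system would use a UDDI registry and WSRP endpoints; the entries and field names here are invented.

```python
# Toy stand-in for the registry idea: remote portlets published with
# searchable metadata, then discovered by consumers. A real deployment
# would use UDDI; names and URLs below are invented.
portlet_registry = []

def publish(name, producer_url, keywords):
    """A producer registers a remote portlet with descriptive keywords."""
    portlet_registry.append({
        "name": name,
        "producer": producer_url,
        "keywords": set(keywords),
    })

def search(keyword):
    """A consumer searches the registry for portlets matching a keyword."""
    return [p["name"] for p in portlet_registry if keyword in p["keywords"]]

publish("WeatherPortlet", "http://producer.example.com/wsrp", ["weather", "forecast"])
publish("NewsPortlet", "http://producer.example.com/wsrp", ["news"])
print(search("weather"))
```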
Abstract:
It has been proposed that growing crop varieties with higher canopy albedo would lower summer-time temperatures over North America and Eurasia and provide a partial mitigation of global warming ('bio-geoengineering') (Ridgwell et al 2009 Curr. Biol. 19 1–5). Here, we use a coupled ocean–atmosphere–vegetation model (HadCM3) with prescribed agricultural regions to investigate to what extent the regional effectiveness of crop albedo bio-geoengineering might be influenced by a progressively warming climate, as well as assessing the impacts on regional hydrological cycling and primary productivity. Consistent with previous analysis, we find that the averted warming due to increasing crop canopy albedo by 0.04 is regionally and seasonally specific, with the largest cooling of ~1 °C for Europe in summer, whereas in the low-latitude monsoonal SE Asian regions of high-density cropland the greatest cooling is experienced in winter. In this study we identify potentially important positive impacts of increasing crop canopy albedo on soil moisture and primary productivity in European cropland regions, due to seasonal increases in precipitation. We also find that the background climate state has an important influence on the predicted regional effectiveness of bio-geoengineering on societally relevant timescales (ca. 100 years). The degree of natural climate variability, and its dependence on greenhouse forcing, evident in our simulations highlights the difficulties faced in the detection and verification of climate mitigation in geoengineering schemes. However, despite the small global impact, regionally focused schemes such as crop albedo bio-geoengineering have detection advantages.
Abstract:
The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as its programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using the eXtensible Markup Language (XML). Communication between clients and servers uses remote procedure calls based on XML (XML-RPC). The integration of the Java language, XML and XML-RPC makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides simple access through a graphical user interface (GUI). The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
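The client-server mechanism described above, remote procedure calls carried as XML over HTTP, can be demonstrated with Python's standard-library XML-RPC modules. This is a generic illustration of the protocol, not the TCABR system itself; the service method and its behaviour are invented.

```python
# Minimal XML-RPC round trip: a server registers a function, a client
# calls it, and the request and reply travel as XML payloads over HTTP.
# The method name and logic are invented placeholders.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def get_signal_length(shot_number):
    # Stand-in for a data-access method a laboratory server might expose.
    return shot_number * 2

# Bind to an ephemeral port and serve requests in a background thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(get_signal_length)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.get_signal_length(21)  # marshalled to XML, sent over HTTP
server.shutdown()
print(result)  # → 42
```

Because the wire format is plain XML over HTTP, clients in any language with an XML-RPC library can use the same access layer, which is the portability argument the abstract makes.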
Abstract:
In today's society direct marketing is becoming more and more important. Some of it is done through e-mail, which companies see as an easy way to advertise themselves. I did this thesis work at WebDoc Systems. They have a product that creates web documents directly in the browser, i.e. a content management system (CMS). The CMS has a module for sending mass e-mail, but this module did not function properly, and WebDoc Systems' customers were dissatisfied with that part of the product. The problems with the module were that it sometimes failed to send the e-mail, and that no follow-up information on the mailing could be obtained. The goal of this work was to develop a Web service that could easily send e-mail to many receivers and, just as easily, display statistics on how a mailing has gone. The first step was a literature review, to get a good picture of the available programming platforms and to be able to design a good application infrastructure. The next step was to implement this design and improve it over time using an iterative development methodology. The result is an application infrastructure that consists of three main parts and a plugin interface. The parts implemented were a Web service application, a Web application and a Windows service application. The three parts cooperate with each other and share a database and plugins.
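One concrete piece of such a mass-mail service is building a separate, trackable message per receiver, so that follow-up statistics can later be tied back to a mailing. A sketch using Python's standard `email` library; the recipients, addresses, and the tracking header name are invented, and actual SMTP delivery is omitted.

```python
# Per-recipient message construction for a mass mailing. The custom
# "X-Mailing-ID" header is an invented illustration of how delivery and
# bounce statistics could be tied back to a specific mailing.
from email.message import EmailMessage

def build_message(sender, recipient, subject, body, mailing_id):
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg["X-Mailing-ID"] = mailing_id  # lets a follow-up job aggregate stats
    msg.set_content(body)
    return msg

recipients = ["a@example.com", "b@example.com"]
messages = [
    build_message("news@example.com", r, "Newsletter", "Hello!", "mailing-42")
    for r in recipients
]
print(len(messages))  # one message per receiver
# Actual delivery would hand each message to an SMTP server via smtplib.
```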
Abstract:
The report analyses whether some common problems can be avoided by using modern technology. "Fartygsrapporteringssystemet" (the ship reporting system) is used as a reference system. It is an n-tier web application built with what was modern technology at the time, 2003-2004. The aim is to examine whether ASP.NET MVC, Windows Communication Foundation, Workflow Foundation and SQL Server 2005 Service Broker can be used to create an n-tier web application that also communicates with other systems and facilitates automated testing. The report describes the construction of a prototype in which the presentation layer uses ASP.NET MVC to separate presentation and business logic. Communication with the business layer is done through Windows Communication Foundation. Hard-coded processes are broken out and handled by Workflow Foundation. Asynchronous communication with other systems is done using Microsoft SQL Server 2005 Service Broker. The result of the analysis is that these techniques can be used to create an n-tier web application, but that ASP.NET MVC, which at the time of writing is only available in a preview release, is not yet sufficiently mature.
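The layering the prototype applies, presentation separated from business logic, which is in turn separated from data access, is language-agnostic and can be sketched outside the .NET stack. A minimal illustration of the idea (here in Python rather than ASP.NET MVC); all class and field names are invented:

```python
# Three-tier separation: each tier talks only to the one below it, which
# is what makes the business rules testable without a web front-end.
class ShipReportRepository:
    """Data tier: stores reports (in memory here, a database in practice)."""
    def __init__(self):
        self._reports = []

    def save(self, report):
        self._reports.append(report)
        return len(self._reports)  # invented report id

class ReportingService:
    """Business tier: validation and rules, independent of presentation."""
    def __init__(self, repo):
        self._repo = repo

    def submit(self, ship, cargo):
        if not ship:
            raise ValueError("ship name required")  # a business rule
        return self._repo.save({"ship": ship, "cargo": cargo})

def handle_post(service, form):
    """Presentation tier: a controller mapping a request to the service."""
    report_id = service.submit(form.get("ship", ""), form.get("cargo", ""))
    return {"status": 201, "id": report_id}

service = ReportingService(ShipReportRepository())
print(handle_post(service, {"ship": "MS Freja", "cargo": "timber"}))
```

Automated tests can then target `ReportingService` directly, without spinning up the presentation layer, which is the testability benefit the report attributes to the MVC separation.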