45 results for introductory programming, learning to program, programming pedagogy, collaborative learning, pair-programming


Relevance:

100.00%

Publisher:

Abstract:

To improve collaboration within working teams, collaborative tools are being deployed in most modern organizations; for this improvement to materialize, however, employees must also exhibit appropriate collaborative behavior. The objective of this exploratory study is to measure and characterize this collaborative behavior, taking several factors into account. To assess the predictive ability of the research model, we developed a theoretical model and validated it with data from 86 employees of the Information Systems department of a large industrial company based in Spain. The results show that altruism, common objectives, and mutual trust positively predict collaborative behavior, whereas the sense of belonging to a community, reputation, and reciprocity do not.

Relevance:

100.00%

Publisher:

Abstract:

Some verification and validation techniques have been evaluated both theoretically and empirically. Most empirical studies, however, have been conducted without human subjects, overlooking any effect testers have when they apply the techniques. We ran an experiment with students to evaluate the effectiveness of three verification and validation techniques (equivalence partitioning, branch testing, and code reading by stepwise abstraction), studying how well the techniques reveal defects in three programs. We replicated the experiment eight times at different sites. Our results show that equivalence partitioning and branch testing are equally effective, and both are better than code reading by stepwise abstraction. The effectiveness of code reading by stepwise abstraction varies significantly from program to program. Finally, we identified project context variables that should be considered when applying any verification and validation technique or when choosing one particular technique.
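For readers unfamiliar with the first of these techniques: equivalence partitioning divides a program's input domain into classes expected to behave uniformly and tests one representative per class, usually together with the class boundaries. A minimal, hypothetical sketch (this example is not from the experiment materials):

```python
# Hypothetical function under test: classifies an age into a fare.
def ticket_price(age):
    if age < 0:
        raise ValueError("age must be non-negative")
    if age < 18:
        return 5    # child fare
    if age < 65:
        return 10   # adult fare
    return 7        # senior fare

# Equivalence partitioning: one representative value per class...
assert ticket_price(10) == 5    # class: 0 <= age < 18
assert ticket_price(40) == 10   # class: 18 <= age < 65
assert ticket_price(70) == 7    # class: age >= 65

# ...plus the boundary values between classes:
assert ticket_price(0) == 5
assert ticket_price(17) == 5 and ticket_price(18) == 10
assert ticket_price(64) == 10 and ticket_price(65) == 7
```

Branch testing, by contrast, derives the test set from the code's control flow rather than from its specification.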

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this report is to build a model that represents, as faithfully as possible, the seismic behavior of a pile-cap bridge foundation using a nonlinear static (pushover) analysis procedure. The model reproduces a specimen already built in the laboratory. It is subjected to a pseudo-static lateral pushover test, applying load to the pile cap until the structure fails, i.e., until a plastic hinge forms in the piles due to horizontal deformation. The pushover test consists of increasing the horizontal load on the pile cap until the target horizontal displacement at the height of the pile cap is reached. The output of the model is a skeleton curve plotting lateral load (kN) against displacement (m), from which the maximum movement the pile-cap foundation can undergo before failure can be determined. Failure is taken to occur when the load at a given displacement has dropped to 85% of the maximum. The finite element model of the pile-cap foundation is based on a pile cap built for a laboratory experiment already carried out by the master's student Deming Zhang at Tongji University. Two pile caps differing in height above ground level were tested: one rises 0.3 m and the other 0.8 m above ground level. The computer model was calibrated using the experimental results. The pile-cap foundation is programmed in a finite element environment called OpenSees (Open System for Earthquake Engineering Simulation [28]). This environment is free software developed at the University of California, Berkeley, specialized, as its name suggests, in the study of earthquakes and their effects on structures. This specialization is the main reason it is used for building this model, as it makes it possible to build any finite element model and perform the analyses needed to obtain the desired results.
The development of OpenSees is sponsored by the Pacific Earthquake Engineering Research Center through the National Science Foundation engineering and education centers program. OpenSees models are scripted in the Tcl language, while the framework itself is written in C++.
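The 85% failure criterion described above is straightforward to post-process from the skeleton-curve data. The following sketch (in Python, with made-up numbers; it is not part of the OpenSees/Tcl model) illustrates locating the failure displacement as the first post-peak point where the load falls to 85% of the peak:

```python
# Illustrative post-processing of a skeleton curve: find the "failure"
# displacement, defined as the first post-peak point where the load has
# dropped to 85% of the maximum load.

def failure_displacement(displacements, loads, ratio=0.85):
    """Return the first post-peak displacement whose load is at or
    below `ratio` times the peak load, or None if never reached."""
    peak_index = max(range(len(loads)), key=loads.__getitem__)
    threshold = ratio * loads[peak_index]
    for d, f in zip(displacements[peak_index:], loads[peak_index:]):
        if f <= threshold:
            return d
    return None

# Example skeleton curve: load rises to 100 kN, then degrades.
disp = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05]   # m
load = [0.0, 60.0, 100.0, 95.0, 84.0, 70.0]   # kN
print(failure_displacement(disp, load))        # 0.04 (load 84 <= 85% of 100)
```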

Relevance:

100.00%

Publisher:

Abstract:

Social enterprise is an organizational model with interesting potential for solving social problems. It has attracted attention in both industrialized countries and developing economies because it represents a model within capitalism that pursues social goals through market activities (mainly the buying and selling of products and/or services). Despite its distant historical roots, it is a relatively young field of knowledge, and the academic literature offers few empirical studies. The theoretical quest for conceptual clarity has been the main battleground of recent years, so little attention has been paid to generating evidence on how social enterprises work and on the keys to their success. Improving our understanding of this organizational model requires building tools with which academics and practitioners can deepen their knowledge of the internal mechanisms of social enterprises. This doctoral thesis on social enterprise arises in that context; its objective is to create an analytical framework for studying social enterprises from an organizational perspective, i.e., one that addresses the key elements describing how this type of organization works. To that end, the framework for the organizational analysis of social enterprises is built from a semantic analysis of the 45 main definitions of social enterprise. This analysis identifies two dimensions of analysis of the social enterprise: four principles, common to all manifestations of the phenomenon, which capture the essence of the concept; and eight organizational elements specific to the social enterprise, which describe how each initiative is implemented in a given context.
That is, these are design elements, present to varying degrees, that give rise to different typologies of social enterprise: the social value proposition, the pursuit of long-term impact, the organizational culture, the connection with beneficiaries, entrepreneurial leadership and governance mechanisms, the collaborative ecosystem, the business strategy, and the orientation toward economic self-sufficiency. Building on this framework, two diagnostic tools are constructed for the study of social enterprises: a table of indicators for external analysis (by a researcher outside the organization) and a diagnostic questionnaire for internal analysis (answered by the staff of the social enterprise under study). These tools respond to the need to develop constructs for the empirical study of social enterprises. To assess the usefulness of the model and the tools, three case studies were carried out: the social enterprise ACCIONA Microenergía Perú, which supplies electricity to isolated rural communities in the Peruvian region of Cajamarca; the social enterprise Integra-e, which offers a mechanism for the social and labor-market insertion of young people at risk of exclusion in Madrid through training in Information and Communication Technologies (ICT); and a set of telecentre networks belonging to the LAC network of the Telecentre.org foundation, which provide access to information services (including the Internet) in several Latin American countries. Applying the tools proved useful in all three case studies for obtaining a set of evidence with which to analyze how close an organization is to the ideal of a social enterprise. The analysis also proved valuable as a reflective exercise for the participating organizations.
The questionnaire results were especially interesting for the telecentres of the Telecentre.org foundation: as a multi-case study, it allowed a rich statistical analysis of how the telecentres operate and perform. The study identified interesting relationships between the eight design elements of the proposed model and organizational performance. In particular, for all the cases studied: the economic dimension is the component of performance that poses the greatest challenges; there is a high correlation between performance and seven of the eight organizational elements of the model; and organizational culture stands out as an element that explains overall organizational performance and employee satisfaction. The field of social enterprise faces important challenges for the future, such as conceptual clarity, the development of empirical studies, and the measurement of its social impact. Knowledge of the organizational keys can help in designing more robust social enterprises, and may lead social-purpose organizations that are not based on market mechanisms to consider incorporating them into their strategy. ABSTRACT Social enterprise is an organizational model with a strong potential to help solve social problems. Recently, interest in the model has risen in both industrialized and developing countries because it is organized to achieve altruistic or social goals through market activities (mainly sales of products and services). Despite its historic roots, it is a relatively young field of research, where academic literature has little empirical data to accompany the theoretical development of social enterprise. Conceptual clarification has been the main challenge in recent years, and little attention has been given to generating evidence on how social enterprises operate and their keys to success.
Progress in empirical study involves the construction of tools for researchers, in order to increase understanding of the internal mechanisms of social enterprises. This thesis aims to create a conceptual framework to study social enterprises from an organizational point of view, addressing the key elements that explain the operation of this organizational model. The framework for the organizational analysis of social enterprises was built on a semantic analysis of the 45 main definitions of social enterprise. The framework is divided into two dimensions: four principles, which capture the essence of the social enterprise concept and are present in all manifestations of the phenomenon; and eight design elements, which help analyze the characteristics of each particular social enterprise initiative: the social value proposition, social impact orientation, organizational culture, links to beneficiaries, entrepreneurial leadership, collaborative ecosystem, entrepreneurial strategy, and orientation to economic self-sufficiency. Two diagnostic tools were developed to apply the framework to case studies: a scoreboard of indicators (to be used by the researcher during external analysis of the organization) and a questionnaire (to be answered by the social enterprise staff). The dissertation presents three case studies: ACCIONA Microenergia Peru, a social enterprise that provides electricity to isolated rural communities in the Peruvian region of Cajamarca; Integra-e, a social enterprise located in Madrid that promotes the socio-professional integration of young people through training in ICT; and a sample of telecentres of the LAC network that provide access to information services (such as the Internet) in Latin America. Applying the tools proved useful in all three cases, because it helped to obtain evidence with which to compare the proximity of an organization to an ideal type of social enterprise.
In all the cases studied, economic sustainability proved to be the biggest challenge for the organizations. The application of the questionnaire to the telecentres was especially informative because, as a multi-case study, it allowed a rich statistical analysis of the performance of the telecentres. The study identified interesting relationships between the model's design elements and organizational performance. A statistical analysis shows a high correlation between performance and seven of the eight organizational elements described in the model. Organizational culture appears to be an important factor in explaining overall organizational performance and employee satisfaction. The field of social enterprise faces significant future challenges, such as conceptual clarity, the development of empirical studies, and social impact assessment. A deep understanding of the key organizational aspects of social enterprises can help in the design of more robust organizations and bring success to social-purpose organizations.

Relevance:

100.00%

Publisher:

Abstract:

Carbon (C) and nitrogen (N) process-based models are important tools for estimating and reporting greenhouse gas emissions and changes in soil C stocks. There is a need for continuous evaluation, development and adaptation of these models to improve scientific understanding, national inventories and assessment of mitigation options across the world. To date, much of the information needed to describe processes such as transpiration, photosynthesis, plant growth and maintenance, above- and below-ground carbon dynamics, decomposition and nitrogen mineralization in ecosystem models remains inaccessible to the wider community, being stored within model source code or held internally by modelling teams. Here we describe the Global Research Alliance Modelling Platform (GRAMP), a web-based modelling platform to link researchers with appropriate datasets, models and training material. It will provide access to model source code and an interactive platform for researchers to form a consensus on existing methods and to synthesize new ideas, which will help to advance progress in this area. The platform will eventually support a variety of models, but to trial the platform and test the architecture and functionality, it was piloted with variants of the DNDC model. The intention is to form a worldwide collaborative network (a virtual laboratory) via an interactive website with access to models and best-practice guidelines; appropriate datasets for testing, calibrating and evaluating models; on-line tutorials; and links to modelling and data-provider research groups and their associated publications. A graphical user interface has been designed to view the model development tree and access all of the above functions.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this doctoral thesis is the investigation of the new concept of photovoltaic tweezers, i.e., the trapping, arrangement, and manipulation of particles in the structures generated on the surface of ferroelectric materials by photovoltaic fields or their gradients. Photovoltaic tweezers are a promising tool for trapping and moving particles on the surface of a photovoltaic material in a controlled way. To exploit this new technique, it is necessary to know precisely the electric field created by a specific illumination at the crystal surface and above it. This objective was divided into the stages described below. The first stage consisted of modelling the photovoltaic field generated by inhomogeneous illumination in substrates and waveguides according to the one-center model. In the second stage, the electrophoretic and dielectrophoretic fields and forces appearing on the surface of inhomogeneously illuminated substrates were studied. In the third stage, their effects on microparticles and nanoparticles were studied; in particular, surface trapping was studied, determining the conditions under which it can be exploited as photovoltaic tweezers. In the fourth and final stage, the most efficient configurations in terms of spatial resolution were studied. Different inhomogeneous illumination patterns were used, and illumination patterns were proposed to the experimental team. To achieve these objectives, calculation tools were developed with which the time evolution of all the quantities involved in the problem can be obtained. With these tools we can abstract away from the complicated trapping mechanisms and obtain the trapping directly from a light pattern.
All the work was carried out in two crystal configurations: X-cut (trapping surface parallel to the optic axis) and Z-cut (trapping surface perpendicular to the optic axis). The interpretation of the differences in the results between the two crystal configurations is examined in depth. All simulations and experiments were performed on the same material, lithium niobate (LiNbO3), to facilitate the comparison of results. This does not limit the results, since the models are not restricted to this material. Regarding the structure of the work, it is divided into three distinct parts: the introduction (I), the modelling of electrophoretic and dielectrophoretic trapping (II), and the numerical simulations and comparison with experiments (III). The first part lays the foundations on which the remaining parts rest. It describes the electromagnetic and optical effects referred to in the rest of the chapters, either because they are needed to describe the experiments or, in other cases, to record that these effects do not appear in the case at hand and to justify the simplifications of the problem that are often made. This part mainly describes electrophoretic and dielectrophoretic trapping, the photovoltaic effect, and the properties of lithium niobate, the material used in the experiments and simulations. Likewise, as no investigation should lack, the state of the art has been analyzed, reviewing what other scientists in the field have done and written so that it serves as a foundation for the research.
Chapter 3 closes this first part by describing the experimental techniques currently used in laboratories to trap particles by means of the photovoltaic effect, since the results differ slightly depending on the trapping technique used. Part II, devoted to the modelling of trapping, begins with Chapter 4, where the electric field inside the sample is modelled, followed by the electric field, potentials, and forces outside the sample. Chapter 5 presents a simple model for understanding the problem at hand, which we call the Steady-State Charge Separation Model. Despite its simplicity, this model gives very good results. In Chapter 6, the equations governing the internal physics of the sample are discretized by the finite-difference method, developing the Spatial Charge Distribution Model. To conclude this part, Chapter 8 addresses the programming of the models presented in the preceding chapters, providing tools to run the simulations quickly. In the last part, III, we present the results of the numerical simulations performed with the tools developed and compare them with the experimental results; the results in the two crystal configurations, X-cut and Z-cut, can be readily compared. The thesis ends with a final chapter of conclusions, summarizing the results obtained in each section and giving an overall view of the research carried out.
ABSTRACT The aim of this thesis is the research of the new concept of photovoltaic or optoelectronic tweezers, i.e., the trapping, arrangement and manipulation of particles in structures generated by photovoltaic fields or their gradients on the surface of ferroelectric materials. Photovoltaic tweezers are a promising tool to trap and move particles on the surface of a photovoltaic material in a controlled way. To take advantage of this new technique, it is necessary to know accurately the electric field created by a specific illumination at the crystal surface and above it. For this purpose, the work was divided into the stages described below. The first stage consisted of modeling the photovoltaic field generated by inhomogeneous illumination in substrates and waveguides according to the one-center model. In the second stage, the electrophoretic and dielectrophoretic fields and forces appearing on the surface of inhomogeneously illuminated substrates and waveguides were studied. In the third stage, their effects on microparticles and nanoparticles were studied; in particular, surface trapping was studied, identifying the conditions that allow its use as photovoltaic tweezers. In the fourth and final stage, the most efficient configurations in terms of spatial resolution were studied. Different patterns of inhomogeneous illumination were tested, and illumination patterns were proposed to the laboratory team. To achieve these objectives, calculation tools were developed to obtain the time evolution of all the quantities involved in the problem. With these tools, the complex trapping mechanisms can be abstracted away, obtaining the trapping pattern directly from a light pattern. All research was carried out in two configurations of the crystal: X-cut (trapping surface parallel to the optic axis) and Z-cut (trapping surface perpendicular to the optic axis). The differences in the results depending on the configuration of the crystal were studied in depth.
All simulations and experiments were made using the same supporting material, lithium niobate (LiNbO3), to facilitate the comparison of results. This does not limit the results, since the models are not restricted to this material. Regarding the structure of this work, it is divided into three clearly differentiated sections, namely: Introduction (I), Electrophoretic and Dielectrophoretic Trapping Modeling (II), and Numerical Simulations and Comparison with Experiments (III). The first section sets the foundations on which the remaining sections are based. The electromagnetic and optical effects referred to in the remaining chapters are described, either because they are needed to explain the experiments or, in other cases, to note the non-appearance of these effects in the present case and to justify the simplifications of the problem that are made in many places. This section mainly describes electrophoretic and dielectrophoretic trapping, the photovoltaic effect, and the properties of lithium niobate, the material used in experiments and simulations. Likewise, as required in this kind of research, the state of the art has been analyzed, reviewing what other scientists working in this field have done and written, so that it serves as a foundation for the research. Chapter 3 closes the first section by describing the experimental techniques currently used in laboratories for trapping particles by the photovoltaic effect, because the results differ slightly depending on the trapping technique in use. Section II, dedicated to trapping modeling, begins with Chapter 4, where the internal electric field of the sample is modeled, followed by the electric field, potentials and forces external to the sample. Chapter 5 presents a simple model to understand the problem at hand, which we call the Steady-State Charge Separation Model. This model gives very good results despite its simplicity.
In Chapter 6, the equations involved in the internal physics of the sample are discretized by the finite-difference method, developing the Spatial Charge Distribution Model. To close this section, Chapter 8 addresses the programming of the models presented in the previous chapters, providing tools to perform simulations quickly. In the last section, III, the results of the numerical simulations performed with the developed tools are presented and compared with the experimental results, making it easy to compare outcomes in the two configurations of the crystal, X-cut and Z-cut. The final chapter collects the conclusions, summarizing the results obtained in the previous sections and giving an overview of the research.
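As a generic illustration of the finite-difference approach mentioned for the Spatial Charge Distribution Model, the sketch below (a minimal 1D example, not the thesis code) solves Poisson's equation for the potential produced by a given charge density:

```python
# Minimal 1D finite-difference sketch (illustrative only): solve
# Poisson's equation  d2(phi)/dx2 = -rho/eps0  for the potential phi
# created by a charge density rho, with grounded boundaries
# phi(0) = phi(L) = 0 implied by the matrix formulation.
import numpy as np

def solve_poisson_1d(rho, dx, eps0=8.854e-12):
    """Return phi on the interior grid nodes for charge density rho."""
    n = len(rho)
    # Tridiagonal Laplacian: (phi[i-1] - 2*phi[i] + phi[i+1]) / dx^2
    laplacian = (np.diag(-2.0 * np.ones(n))
                 + np.diag(np.ones(n - 1), 1)
                 + np.diag(np.ones(n - 1), -1))
    return np.linalg.solve(laplacian, -rho * dx**2 / eps0)

# Localized positive charge in the middle of the domain:
n, dx = 101, 1e-6                  # 101 interior nodes, 1 micron step
rho = np.zeros(n)
rho[n // 2] = 1e-3                 # C/m^3 at the center node
phi = solve_poisson_1d(rho, dx)
print(phi.max())                   # peak potential, located at the center
```

The same idea, extended to two or three dimensions and coupled to the charge-transport equations, underlies the kind of discretization described for the internal physics of the sample.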

Relevance:

100.00%

Publisher:

Abstract:

The aim of this project is to create a website useful to both employees and students of a university: employees can add information after logging in with a username and password, and students can view that information. Employees may edit and display information such as their title, office, and faculty (chosen from a list defined by the administrator) and, most importantly, their schedule, whether classes, tutoring, free time, or any of the task types the administrator defines. There is an administrator responsible for managing employees, the available faculties, and the task types employees can use in their schedules. Students can see the employees' schedules and offices on the homepage. They can distinguish between the employees' different tasks because each is shown in a different color. They can also filter the information by faculty, employee, or day. To achieve our goal, we decided to program in Java using servlets, which we use to generate responses to the requests arriving from users of the website. We also use JSP, which lets us create the different web pages. We use JSP files rather than HTML because the pages must be dynamic: we do not only want to show fixed information, since the information can change depending on user requests. A JSP file lets us generate HTML while also using the Java language, which is necessary for our purpose. Since the information we store is not fixed, and must be modifiable at any time by employees and the administrator, we need a database that can be accessed from anywhere. We chose SQLite databases because they integrate well into our application and offer fast responses. To access the database from our program, we simply connect to it and, with very few lines of code, add, delete, or modify entries in the tables of the database.
To facilitate the initial creation of the database and the first tables, we use a Mozilla Firefox browser plugin called SQLite Manager, which lets us do this from a friendlier interface. Finally, we need a server that supports and implements the Servlet and JSP specifications. We chose the Tomcat server, a servlet container, because it is free, easy to use, and compatible with our program. The whole project was developed in the Eclipse environment, also a free program, which integrates the database, the server, and the programming of the JSPs and servlets. Having presented all the tools we used, we must first organize the structure of the site, relating each servlet to its JSP files. Next, we create the database and the different servlets, and verify the database accesses to make sure they work correctly. From here on, it is simply a matter of building up the site step by step, showing what is needed in each place and redirecting to the different pages. In this way, we can build a complex website, for free, without being an expert in the field. ABSTRACT. The objective of this project is to create a web page that serves both the employees and the students of a university: employees can add information after logging in with a username and password, and students can view this information. Employees can edit and display information such as their title, office, and the faculty they belong to (from a list defined by the administrator) and, most importantly, their schedules, whether classes, tutoring, free time, or any of the task types defined by the administrator. There is an administrator in charge of managing the existing employees, the available faculties, and the task types employees may use in their schedules. Students can view the employees' schedules and offices on the main page. They can distinguish between the professors' different tasks because these are shown in different colors.
They can also filter the information by faculty, employee, or day of the week. To achieve our goal, we decided to program in Java using servlets, which we use to generate responses to the requests arriving from users of the web page. We also use JSP files, which allow us to create the different web pages. We use JSP files rather than HTML because the pages must be dynamic: we do not only want to show fixed information, since the information may vary according to user requests. A JSP file lets us generate HTML while also using the Java language, which is necessary for our purpose. Since the information we want to store is not fixed, but must be modifiable at any time by employees and the administrator, we need a database that can be accessed from the web. We chose SQLite databases because they integrate quite well into our application and offer fast responses. To access the database from our program, we simply connect to it and, with very few lines of code, add, delete, or modify entries in the different tables of the database. To facilitate the initial creation of the database and the first tables, we use a Mozilla Firefox browser plugin called SQLite Manager, which lets us do this from a friendlier interface. Finally, we need a server that supports and implements the Servlet and JSP specifications. We chose the Tomcat server, a free, easy-to-use servlet container compatible with our program. The whole project was developed in the Eclipse environment, also a free program, which integrates the database, the server, and the programming of the JSPs and servlets.
Once all the tools we used have been presented, we must first organize the structure of the site, relating each JSP file to the servlets it must access. We then create the database and the different servlets, and fine-tune the database accesses to check that they work correctly. From here on, it is simply a matter of building the page step by step, showing what is needed in each place and redirecting to the different pages. In this way, we can build a complex web page, for free, without being an expert in the field.
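The database operations described above take only a few lines regardless of language. The sketch below illustrates the same add/modify/query pattern using Python's sqlite3 module (the project itself uses Java; the table and column names here are hypothetical):

```python
# Illustrative sketch of the kind of SQLite access the project describes
# (the project itself uses Java; the schedule table below is hypothetical).
import sqlite3

conn = sqlite3.connect(":memory:")          # use a file path in practice
conn.execute("""CREATE TABLE schedule (
                    employee TEXT,
                    day      TEXT,
                    hour     INTEGER,
                    task     TEXT)""")

# Add, modify, and query entries with very few lines of code:
conn.execute("INSERT INTO schedule VALUES (?, ?, ?, ?)",
             ("Alice", "Monday", 9, "class"))
conn.execute("UPDATE schedule SET task = ? WHERE employee = ? AND hour = ?",
             ("tutoring", "Alice", 9))
rows = conn.execute("SELECT task FROM schedule WHERE employee = ?",
                    ("Alice",)).fetchall()
print(rows)                                  # [('tutoring',)]
conn.close()
```

Parameterized queries (the `?` placeholders) keep the same brevity while avoiding SQL injection, a concern for any web-facing database access.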

Relevance:

100.00%

Publisher:

Abstract:

This project concerns the development of software to control the measurement of the luminous intensity distribution of LED luminaires. In the course of the project, theoretical foundations of basic photometry are presented, from which the basic conditions for performing such a measurement are derived. A brief description is also given of the hardware used in the machine, which is based on an Arduino Mega 2560 development board that, thanks to the LabVIEW package LIFA (LabVIEW Interface For Arduino), can be used as a data-acquisition card with which to drive both sensors and actuators for the control tasks. The measuring instrument used in this project is the BTS256 from GigaHertz-Optik, for which a development kit is available in both C++ and LabVIEW, making it possible to program applications based on this software and adapt it to the needs of the project. The software is developed on the LabVIEW 2013 platform, which is possible thanks to the availability of the measuring instrument's development kit and the LIFA package. The overall objective of the project is to characterize LED luminaires so as to obtain sufficient measurements of the luminous intensity distribution. The data are collected in a specific photometric file, following the IESNA 2002 standard on photometric file formats, which is later used in the simulation and study of real installations of the luminaire. The system proposed in this project is based on type B photometry using VH coordinates, with a measurement algorithm in which the luminaire sweeps an angle of 180° on both axes, at a resolution of 5° on the vertical axis and 22.5° on the horizontal axis, storing the data in an array that is written in the format required by the standard.
Once the data have been obtained with the instrument developed, the file generated by the measurement is simulated with the DIALux software, and the illumination values obtained in the simulation are compared with the real measurements, trying to reproduce the real measurement conditions in the simulation. ABSTRACT. The project involves the development of software for controlling the measurement of the luminous intensity distribution of LED luminaires. In the course of the project, theoretical foundations of basic photometry are presented, from which the basic conditions for such a measurement are derived. A brief description is also given of the hardware used in the machine, which is based on an Arduino Mega 2560 board that, through the LabVIEW package LIFA (LabVIEW Interface For Arduino), can be used as a data-acquisition card with which to handle both sensors and actuators for the control tasks. The instrument used in this project is the BTS256 from GigaHertz-Optik, for which a development kit is available in both C++ and LabVIEW, making it possible to program applications based on this software for any kind of adaptation to the project's needs. The software is developed on the LabVIEW 2013 platform, thanks to the availability of the SDK of the measuring instrument and the LIFA package. The overall objective of the project is the characterization of LED luminaires, so that sufficient measurements of the luminous intensity distribution are obtained. Data are collected in a specific photometric file, following the IESNA 2002 standard on photometric file formats, which is then used in the simulation and study of real installations of the luminaire.
The proposed in this project is a system based on photometry type B system using VH coordinates, developing an algorithm as the fixture describe an angle of 180 ° in both axes, with a resolution of 5 ° to the vertical axis and 22.5º for the Horizontal axis, storing data in an array to be written in the format required by the regulations. After obtaining the data with the instrument developed, the file generated by the measure, is simulated with DIALux software, obtaining measures of lighting in the simulation will be compared with the actual measurements, trying to play in the simulation the actual measurement conditions .

Relevância:

100.00% 100.00%

Publicador:

Resumo:

It has been proposed that the use of self-assembled quantum dot (QD) arrays can break the Shockley-Queisser efficiency limit by extending the absorption of solar cells into the low-energy photon range while preserving their output voltage. This would be possible if the infrared photons were absorbed in the two sub-bandgap QD transitions simultaneously and the energy of two photons were added up to produce one single electron-hole pair, as described by the intermediate band model. Here, we present an InAs/Al0.25Ga0.75As QD solar cell that exhibits such electrical up-conversion of low-energy photons. When the device is monochromatically illuminated with 1.32 eV photons, open-circuit voltages as high as 1.58 V are measured (for a total gap of 1.8 eV). Moreover, the photocurrents produced by illumination with photons exciting the valence band to intermediate band (VB-IB) and the intermediate band to conduction band (IB-CB) transitions can both be spectrally resolved. The former corresponds to the QD inter-band transition and is observable for photons of energy greater than 1 eV; the latter corresponds to the QD intra-band transition and peaks around 0.5 eV. The voltage up-conversion process reported here for the first time is the key to using the low-energy end of the solar spectrum to increase the conversion efficiency, and not only the photocurrent, of single-junction photovoltaic devices. In spite of the low absorption threshold measured in our devices (0.25 eV), we report open-circuit voltages at room temperature as high as 1.12 V under concentrated broadband illumination.

Relevância:

70.00% 70.00%

Publicador:

Resumo:

The purpose of this document is to serve as the printed material for the seminar "An Introductory Course on Constraint Logic Programming". The intended audience of this seminar is industrial programmers with a degree in Computer Science but little previous experience with constraint programming. The seminar itself was field-tested, prior to the writing of this document, with a group of application programmers of Esprit project P23182, "VOCAL", aimed at developing an application for scheduling field maintenance tasks in the context of an electric utility company. The contents of this paper essentially follow the flow of the seminar slides. However, there are some differences, which stem from our perception, gained from teaching the seminar, that the technical aspects are the ones that need more attention and clearer explanations in the written version. Thus, this document includes more examples than the slides, more exercises (and their solutions), as well as four additional programming projects, with which we hope the reader will obtain a clearer view of the process of developing and tuning programs using CLP. On the other hand, several parts of the seminar have been left out: those recounting the fields and applications in which C(L)P is useful, and the enumerations of available C(L)P tools. We feel that the slides are clear enough, and that for more up-to-date information on available tools the interested reader is better served by browsing the Web or asking the vendors directly. More detail in this direction would in fact boil down to summarizing a user manual, which is not the aim of this document.
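As a loose illustration of the kind of problem the seminar targets, the following sketch solves a tiny, invented scheduling puzzle by naive generate-and-test in Python. A real CLP system would instead prune the search space by constraint propagation; the tasks and constraints here are purely illustrative, not taken from the VOCAL application.

```python
from itertools import permutations

# Invented toy instance: assign three maintenance tasks to three time slots
# so that "repair" comes before "inspect" and "paint" is not scheduled first.
TASKS = ["repair", "inspect", "paint"]

def satisfies(schedule):
    # schedule[i] is the task performed in slot i.
    return (schedule.index("repair") < schedule.index("inspect")
            and schedule[0] != "paint")

# Naive generate-and-test: enumerate all assignments, keep the feasible ones.
solutions = [s for s in permutations(TASKS) if satisfies(s)]
```

Generate-and-test explores all 3! assignments; a constraint solver would rule out every schedule starting with "paint" before enumerating it.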

Relevância:

70.00% 70.00%

Publicador:

Resumo:

This paper presents an online C compiler designed so that students can program their practical assignments in programming courses. Its real innovation is the self-assessment of exercises based on black-box tests, which also trains students' software-testing skills. Moreover, the tool lets instructors not only propose and classify practical exercises, but also automatically evaluate the effort invested and the results obtained by the students. The system has been applied to first-year students of the Industrial Engineering specialization at the Universidad Politécnica de Madrid. Results show that the students obtained better academic performance, considerably reducing the failure rate in the practical exam with respect to previous years; in addition, an anonymous survey showed that students are satisfied with the system because they get instant feedback on their programs.
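A minimal sketch of the black-box self-assessment idea (not the tool's actual implementation): run a compiled program against input/expected-output pairs and report the fraction of cases passed. The executable path and test cases below are illustrative assumptions.

```python
import subprocess

def grade(executable, cases):
    """Run `executable` on each (stdin, expected stdout) pair; return pass ratio."""
    passed = 0
    for stdin_data, expected in cases:
        result = subprocess.run([executable], input=stdin_data,
                                capture_output=True, text=True, timeout=5)
        if result.stdout.strip() == expected.strip():
            passed += 1
    return passed / len(cases)

# Example: "grade" the Unix cat utility, which simply echoes its input,
# against one matching and one non-matching case.
score = grade("/bin/cat", [("hi", "hi"), ("a", "b")])
```

Because the check inspects only stdin and stdout, students are graded on observable behavior, exactly the black-box discipline the paper aims to teach.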

Relevância:

60.00% 60.00%

Publicador:

Resumo:

We show a method for parallelizing top-down dynamic programs in a straightforward way by a careful choice of a lock-free shared hash table implementation and by randomizing the order in which the dynamic program computes its subproblems. This generic approach is applied to dynamic programs for knapsack, shortest paths, and RNA structure alignment, as well as to a state-of-the-art solution for minimizing the maximum number of open stacks. Experimental results are provided on three different modern multicore architectures, showing that this parallelization is effective and reasonably scalable. In particular, we obtain over 10 times speedup with 32 threads on the open stacks problem.

Relevância:

60.00% 60.00%

Publicador:

Resumo:

The calculus of binary relations was introduced by De Morgan in 1860 and later greatly developed by Peirce and Schröder, as well as many others in the twentieth century. Using different formulations of relational structures, Tarski, Givant, Freyd, and Scedrov showed how relation algebras provide a variable-free way of formalizing first-order logic, higher-order logic, and set theory, among other formal systems. Building on those mathematical results, this thesis develops denotational and operational semantics for Constraint Logic Programming (CLP) using relation algebra as a foundation. The idea of executable semantics plays a fundamental role in this work, both as a philosophical and as a technical foundation: we call a semantics executable when program execution can be carried out using the standard reasoning and tools that define the semantic universe, in this case equational reasoning. Throughout this work, pure algebraic reasoning is the basis of the denotational and operational results, eliminating the classical non-equational meta-theory associated with traditional semantics for logic programming. All reasoning, including execution, is performed algebraically, to the point that the denotational semantics of a CLP program is directly executable. We show that distributive relation algebras with a fixpoint operator capture the standard theory and meta-theory of constraint logic programming, including the trees used in the search for proofs. Techniques such as program optimization, partial evaluation, and abstract interpretation find a natural place in our algebraic models, and properties such as correctness of the implementation or program transformations are easy to check, as they are carried out using instances of the general equational theory.
In the first part of the work, a constraint logic program is translated into a set of binary relational terms, in a modified version of the distributive relation algebras used by Tarski. The standard set-theoretic interpretation of these relations coincides with the standard semantics of CLP. Queries against the translated program are executed by a relation rewriting system. The first part concludes by proving the adequacy and operational equivalence of this new semantics, and by defining a unification algorithm based on relation rewriting.
In the second part of the work, the relation-algebraic approach is improved by using allegory theory, the categorical version of the algebra of relations developed by Freyd and Scedrov. The use of allegories lifts the semantics to typed relations, which capture in a declarative way the number of logical variables used by a predicate or program state. Two new notions are defined, Regular Lawvere Category and _-allegory; a logic program is interpreted in a _-allegory, which is in turn generated from a Regular Lawvere Category. As in the untyped case, program translation coincides with program interpretation. The main advantage of the categorical approach is a categorical abstract machine, developed directly from the semantics, that improves on the rewriting system of the first part. The machine is based on relation composition, with a pullback computation algorithm at its core; the algorithm is defined with the help of a notion of diagram rewriting, and thanks to the use of tabular relations the machine models efficient execution without leaving a strictly formal framework. In this operational interpretation, the domains of the tabulations carry information about memory allocation and free variables, shared state is captured faithfully by categorical projections, and the specification of the machine induces the formal derivation of an efficient instruction set. We finish the work by illustrating how the categorical semantics allows the incorporation into Prolog of constructs typical of functional programming, such as abstract data types and strict and lazy functions, while preserving the fully declarative character of the semantics.
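As a toy illustration of the calculus of binary relations the thesis builds on (not its rewriting machinery or categorical machine), relations can be represented as sets of pairs, with composition and converse defined set-theoretically; the family example is invented.

```python
def compose(r, s):
    # (a, c) is in r;s iff there is some b with (a, b) in r and (b, c) in s.
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

def converse(r):
    # Swap every pair: the converse relation.
    return {(b, a) for (a, b) in r}

# "Grandparent" is the composition of "parent" with itself.
parent = {("ann", "bob"), ("bob", "cid")}
grandparent = compose(parent, parent)
```

Equational laws of the algebra, such as associativity of composition or converse distributing over it, can be checked directly on this representation, which is the spirit of the executable semantics the thesis develops.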

Relevância:

60.00% 60.00%

Publicador:

Resumo:

We report on a detailed study of the application and effectiveness of program analysis based on abstract interpretation in automatic program parallelization. We study the case of parallelizing logic programs using the notion of strict independence. We first propose, and prove correct, a methodology for applying the information inferred by abstract interpretation to the parallelization task, using a parametric domain. The methodology is generic in that it allows the use of different analysis domains. A number of well-known approximation domains are then studied and their transformation into the parametric domain is defined. The transformation directly illustrates the relevance and applicability of each abstract domain to the application. Both local and global analyzers are then built using these domains and embedded in a complete parallelizing compiler, and the performance of the domains in this context is assessed through a number of experiments. A comparatively wide range of aspects is studied, from the resources needed by the analyzers in terms of time and memory to the actual benefits obtained from the inferred information. These benefits are evaluated both in terms of the characteristics of the parallelized code and of the actual speedups obtained from it. The results show that data-flow analysis plays an important role in achieving efficient parallelizations, and that the cost of such analysis can be reasonable even for quite sophisticated abstract domains. Furthermore, the results offer significant insight into the characteristics of the domains, the demands of the application, and the trade-offs involved.
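A deliberately simplistic sketch of the strict-independence criterion that licenses parallel execution: two goals may run in parallel when they share no variables. The real analysis works on abstract sharing and groundness information inferred at compile time, not on raw variable sets as below; goal names and variables are invented.

```python
def strictly_independent(goal_vars_a, goal_vars_b):
    """Two goals may run in parallel when their variable sets are disjoint."""
    return goal_vars_a.isdisjoint(goal_vars_b)

# p(X, Y) and q(Z) share no variables: parallelizable under this criterion.
# p(X, Y) and r(Y) share Y: binding Y in one goal could affect the other,
# so they must run sequentially.
independent = strictly_independent({"X", "Y"}, {"Z"})
dependent = strictly_independent({"X", "Y"}, {"Y"})
```

The point of the abstract domains studied in the paper is precisely to prove such disjointness (or groundness of the shared variables) without running the program.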

Relevância:

60.00% 60.00%

Publicador:

Resumo:

We propose a general framework for assertion-based debugging of constraint logic programs. Assertions are linguistic constructions which allow expressing properties of programs. We define assertion schemas which allow writing (partial) specifications for constraint logic programs using quite general properties, including user-defined programs. The framework is aimed at detecting deviations of the program behavior (symptoms) with respect to the given assertions, either at compile-time or run-time. We provide techniques for using information from global analysis both to detect at compile-time assertions which do not hold in at least one of the possible executions (i.e., static symptoms) and assertions which hold for all possible executions (i.e., statically proved assertions). We also provide program transformations which introduce tests in the program for checking at run-time those assertions whose status cannot be determined at compile-time. Both the static and the dynamic checking are provably safe in the sense that all errors flagged are definite violations of the specifications. Finally, we report on an implemented instance of the assertion language and framework.
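The run-time checking component can be pictured, very loosely, as wrapping a predicate with tests for the assertions that static analysis could not resolve. The Python decorator below is an invented, simplified analogue of that idea (the actual framework instruments constraint logic programs, not Python functions), with pre- and postconditions standing in for assertions.

```python
import functools

def check(pre=None, post=None):
    """Wrap a function with run-time tests for assertions left undecided statically."""
    def wrap(f):
        @functools.wraps(f)
        def g(*args):
            if pre is not None and not pre(*args):
                raise AssertionError(f"precondition of {f.__name__} violated")
            result = f(*args)
            if post is not None and not post(result):
                raise AssertionError(f"postcondition of {f.__name__} violated")
            return result
        return g
    return wrap

@check(pre=lambda n: n >= 0, post=lambda r: r >= 1)
def factorial(n):
    return 1 if n == 0 else n * factorial(n - 1)
```

As in the framework, every error flagged by such a wrapper is a definite violation of the stated specification; assertions already proved statically would simply not be instrumented.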