920 results for Web 2.0 Applications in Enterprises


Relevance: 100.00%

Publisher:

Abstract:

Recent commentaries have proposed the advantages of using open exchange of data and informatics resources to improve health-related policies and patient care in Africa. Yet in many African regions, both private medical and public health information systems remain unaffordable. Open exchange over the social Web 2.0 could encourage more altruistic support of medical initiatives. We carried out several experiments to demonstrate the feasibility of this approach for disseminating open data and informatics resources in Africa. Building on these experiments, we developed the AFRICA BUILD Portal, the first social network for African biomedical researchers. Through the AFRICA BUILD Portal, users can transparently access a variety of resources. Currently, over 600 researchers use distributed, open resources through this platform, which is adapted to low-bandwidth connections.


The concept of the IoT (Internet of Things) is widely accepted in academia and industry as the future direction of the Internet. It will enable people and things to be connected at any time and any place, with anything and anyone. IoT applications have been proposed in many areas, such as healthcare, transportation, logistics, and smart environments. This thesis, however, focuses on home healthcare, a promising model for addressing problems that the traditional healthcare model cannot, such as limited medical resources and the growing demands of elderly and chronically ill patients. A remarkable feature of the semantic-oriented vision of IoT is that vast numbers of sensors and devices are involved, which can generate enormous amounts of data. Methods to manage these data, including acquisition, interpretation, processing, and storage, must be implemented. Beyond this, other capabilities that IoT currently lacks are identified, namely interoperation, context awareness, and security and privacy. Context awareness is an emerging technology for managing and exploiting context so that any type of system can provide personalized services. The aim of this thesis is to explore ways to facilitate context awareness in IoT, and preliminary research is carried out to that end. The most basic premise of context awareness is to collect, model, understand, reason about, and make use of context. A complete literature review of existing context modelling and context reasoning techniques is conducted, concluding that ontology-based context modelling and ontology-based context reasoning are the most promising and efficient techniques for managing context. To integrate ontologies into IoT, a specific ontology-based context awareness framework is proposed for IoT applications.

In general, the framework comprises eight components: hardware, UI (User Interface), context modelling, context fusion, context reasoning, context repository, security unit, and context dissemination. Moreover, building on TOVE (Toronto Virtual Enterprise), a formal ontology development methodology is proposed and illustrated, consisting of four stages: Specification & Conceptualization, Competency Formulation, Implementation, and Validation & Documentation. In addition, a home healthcare scenario is elaborated by listing its well-defined functionalities. To represent this scenario, the proposed methodology is applied and the ontology-based model is developed in Protégé, a free and open-source ontology editor. Finally, the accuracy and completeness of the proposed ontology are validated to show that it can accurately represent the scenario of interest.
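The core idea of ontology-based context modelling and reasoning can be sketched in a few lines: context is stored as subject-predicate-object triples (the building block of RDF/OWL), and rules derive new context from them. All class names, properties, values, and the single rule below are hypothetical illustrations of the technique, not the thesis ontology:

```python
# Minimal sketch of ontology-style context modelling and rule-based
# reasoning for a home-healthcare scenario (all names are hypothetical).

# Context as subject-predicate-object triples.
triples = {
    ("patient1", "rdf:type", "Patient"),
    ("patient1", "hasHeartRate", 118),
    ("patient1", "locatedIn", "bedroom"),
    ("bedroom", "rdf:type", "Room"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching a pattern (None acts as a wildcard)."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

def infer_tachycardia():
    """Toy reasoning rule: a Patient with heart rate above 110 gets a
    'Tachycardia' context, which a context-aware service could use to
    trigger an alert."""
    inferred = set()
    for subj, _, _ in query(p="rdf:type", o="Patient"):
        for _, _, rate in query(s=subj, p="hasHeartRate"):
            if rate > 110:
                inferred.add((subj, "hasCondition", "Tachycardia"))
    return inferred

print(infer_tachycardia())
# → {('patient1', 'hasCondition', 'Tachycardia')}
```

A real framework would express such rules in SWRL or a reasoner over an OWL model rather than in plain Python, but the collect-model-reason cycle is the same.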


Today's web accommodates products developed both by professional developers and by end users with more limited knowledge. Despite the difference in quality that can be expected between the two, both kinds of solutions may be adopted and used in an application. In Web 2.0, this behavior is visible in the development of web components. This work aims to build a persistence model that, supported by a server side and a client side, collects quality metrics for components as users interact with them. From these metrics, the quality of the components can be improved. The metrics are collected through PicBit, an application developed so that users can freely interconnect different components; after interacting with them, users express their degree of satisfaction, which is recorded for the quality evaluation. A set of intrinsic metrics, independent of the user, is also defined and serves as a reference for the evaluation. Once both the intrinsic and the user-derived metrics are available, a correlation between them is computed, making it possible to analyze deviations between the two and determine the intrinsic quality of each component. The conclusion of the work is that when users can perform usability tests freely, without restrictions, favorable results are more likely, because those results show how an end user will actually use the application. This way of working is supported by the number of tools available today for monitoring user flow through a service.
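The correlation step described above can be sketched with a plain Pearson coefficient between the two metric series. The per-component values below are made-up illustrative data, not PicBit measurements:

```python
# Sketch: correlating intrinsic component metrics with user-reported
# satisfaction scores (illustrative data, not PicBit measurements).
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-component scores: an intrinsic metric (e.g. load time
# inverted to a 0-10 scale) and the mean user satisfaction (0-10).
intrinsic    = [8.5, 6.0, 9.0, 4.5, 7.0]
satisfaction = [8.0, 5.5, 9.5, 5.0, 6.5]

print(round(pearson(intrinsic, satisfaction), 3))
# → 0.956  (close to 1.0: intrinsic metrics track user opinion)
```

A strong correlation suggests the intrinsic metrics are good proxies for perceived quality; large deviations flag components worth inspecting.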


The web is in a process of constant change toward greater user interaction. The current wave of paradigms and technologies associated with Web 2.0 has produced a number of highly useful standards that cover the needs of today's web development. Among these are web components, user-defined HTML tags that fulfill a specific function within a page. There is a need to measure the quality of such developments in order to determine whether the web component concept represents a revolutionary change in Web 2.0 development. This calls for the exploitation of web components, understood as quality measurement based on metrics together with the definition of a component interconnection model. The PicBit platform arises as an answer to these questions: a social profile-building platform based on these elements. From the end user's perspective, it is a tool for creating profiles and social communities; from an academic perspective, it is a testing environment, or sandbox, for web components. To that end, the server side of the platform must be implemented, focused on this exploitation work, through the definition of a REST interface of operations and a system for collecting user events on the platform. With this platform it will be possible to determine which parameters positively influence the user experience of a component, and to gauge the future potential of this kind of development.
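The event-collection side can be sketched as an in-memory store that would sit behind a REST endpoint such as POST /events. The endpoint name and event fields are hypothetical, not the actual PicBit API:

```python
# Sketch of server-side user-event collection for component metrics.
# Endpoint name and event fields are hypothetical.
import json
from collections import Counter

class EventStore:
    def __init__(self):
        self.events = []

    def post_event(self, body: str):
        """Simulate handling POST /events with a JSON body; return an
        HTTP-style status code."""
        event = json.loads(body)
        # Minimal validation: every event names a component and an action.
        if "component" not in event or "action" not in event:
            return 400
        self.events.append(event)
        return 201

    def interactions_per_component(self):
        """Aggregate raw events into a per-component usage metric."""
        return Counter(e["component"] for e in self.events)

store = EventStore()
store.post_event('{"component": "calendar", "action": "click"}')
store.post_event('{"component": "calendar", "action": "drag"}')
store.post_event('{"component": "map", "action": "click"}')
print(store.interactions_per_component())
# → Counter({'calendar': 2, 'map': 1})
```

In a real deployment the store would persist events and expose further aggregations (dwell time, error rates) as additional REST resources.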


This thesis develops the theoretical foundations of, and designs, an open collection of C++ classes called VBF (Vector Boolean Functions) for analyzing vector Boolean functions (functions that map a Boolean vector to another Boolean vector) from a cryptographic perspective. The implementation builds on Victor Shoup's NTL library, adding new modules that complement the existing ones and make VBF better suited for cryptographic analysis. The fundamental class representing a vector Boolean function can be initialized flexibly from several alternative data structures, such as the truth table, the trace representation, and the algebraic normal form (ANF), among others. VBF thus allows the evaluation of the most relevant cryptographic criteria for block and stream ciphers as well as for hash functions: for instance, it provides the nonlinearity, the linearity distance, the algebraic degree, the linear structures, and the frequency distribution of the absolute values of the Walsh spectrum or the autocorrelation spectrum, among other criteria. In addition, VBF can perform operations on vector Boolean functions such as equality testing, composition, inversion, sum, direct sum, bricklayering (the parallel application of vector Boolean functions, as employed in the Rijndael cipher), and the addition of coordinate functions. The thesis also illustrates the use of VBF in two practical applications. On the one hand, the most relevant properties of existing block ciphers are analyzed. On the other hand, by combining VBF with optimization algorithms, Boolean functions have been designed whose cryptographic properties are the best known to date.
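One criterion the library computes, nonlinearity, follows directly from the Walsh spectrum. A minimal sketch for a single-output Boolean function (VBF itself is a C++ library handling vector-valued functions; this Python version only illustrates the definition):

```python
# Nonlinearity of a Boolean function via its Walsh spectrum.
# Illustrative reimplementation of the standard definitions; not VBF code.

def walsh_spectrum(truth_table):
    """W_f(a) = sum over x of (-1)^(f(x) XOR a.x), for each mask a."""
    n = len(truth_table).bit_length() - 1
    spectrum = []
    for a in range(2 ** n):
        total = 0
        for x, fx in enumerate(truth_table):
            dot = bin(a & x).count("1") % 2  # inner product a.x over GF(2)
            total += (-1) ** (fx ^ dot)
        spectrum.append(total)
    return spectrum

def nonlinearity(truth_table):
    """NL(f) = 2^(n-1) - max|W_f(a)|/2: Hamming distance to the nearest
    affine function."""
    n = len(truth_table).bit_length() - 1
    return 2 ** (n - 1) - max(abs(w) for w in walsh_spectrum(truth_table)) // 2

# f(x1, x2) = x1 AND x2, truth table in input order 00, 01, 10, 11.
and_table = [0, 0, 0, 1]
print(walsh_spectrum(and_table))  # → [2, 2, 2, -2]
print(nonlinearity(and_table))    # → 1
```

The linear function f(x1, x2) = x1 (truth table [0, 0, 1, 1]) gives nonlinearity 0, as expected: it is itself affine.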


This thesis examines one of the most important aspects of managing the information society: understanding how a person values any given situation. This matters both for the individual performing the assessment and for the environment with which he or she interacts. Assessment is the result of comparison: identical values are assigned to similar alternatives, and higher values to alternatives regarded more favorably in the comparison process. The patterns that guide an individual when making the comparison derive from his or her individual preferences (that is, opinions). The thesis presents several procedures for establishing a person's preference relations between alternatives. The assessment progresses until a numerical representation of those preferences is obtained. When the representation of preferences is homogeneous, it also allows one person's preferences to be contrasted with those of other evaluators, which supports policy evaluation, the transfer of information between individuals, and the design of the alternative that best fits the identified preferences. At the same time, this information can be used to build communities of people who share the same system of preferences on a specific issue. The thesis presents a case study applying this methodology: the optimization of labor policies in a real market. To support jobseekers (entering or re-entering the labor market, or changing occupation), it is necessary to know their preferences regarding the occupations they are willing to perform. Moreover, for labor mediation to be effective, the occupations sought must be offered by the labor market, and the applicant must meet the conditions for access to them.

The further development of these models leads to procedures for transforming multiple preferences into an aggregate decision, considering both the opinion of each individual participating in the decision and their social interactions, all aimed at generating a solution that fits the viewpoint of the whole population as closely as possible. Decisions with multiple participants mainly concern: widening the scope to include people who have traditionally been left out of decision making; aggregating the preferences of multiple participants in collective decisions (by voting, through applications developed for Web 2.0, and via interpersonal comparisons of utility); and, finally, self-organization, allowing participants in the assessment to interact with one another so that the final result is better than the mere aggregation of individual opinions. The thesis analyzes the e-democracy systems, and the tools for implementing them, that are currently most widely used or most advanced. They are closely related to Web 2.0, and their adoption is driving an evolution of present-day democracy. Collaborative decision-making (CDM) software applications, which help give sense and meaning to data, are also studied; they aim to coordinate the functions and features needed to reach timely collective decisions, allowing all stakeholders to participate in the process. The thesis concludes by presenting a new model, or paradigm, for decision making with multiple participants. The development rests on the calculation of empathic utility functions. It seeks collaboration between individuals to make decision making more effective, and it also aims to increase the number of people involved. It studies interactions and feedback among citizens, since the influence of citizens on one another is fundamental to collective decision-making and e-democracy processes. It also includes methods for detecting when the process has stalled and should be stopped. The model is applied to a consultation in which the citizens of a municipality are asked about the advisability of installing bicycle lanes and the characteristics these should have; the voting and the interaction among voters are simulated.
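The simplest form of the preference aggregation discussed above is a positional voting rule such as a Borda count. The voters and options below are made-up illustrative data, and the thesis's empathic-utility model is considerably richer than this sketch:

```python
# Borda-count aggregation of individual rankings into a collective one.
# Illustrative data; not the thesis's empathic-utility model.
from collections import defaultdict

def borda(rankings):
    """Each ranking lists options from most to least preferred.
    An option scores (n - 1 - position) points per voter."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for position, option in enumerate(ranking):
            scores[option] += n - 1 - position
    # Collective ranking: highest total score first.
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Three citizens rank three hypothetical bike-lane designs.
votes = [
    ["protected", "painted", "shared"],
    ["painted", "protected", "shared"],
    ["protected", "shared", "painted"],
]
print(borda(votes))
# → [('protected', 5), ('painted', 3), ('shared', 1)]
```

Plain aggregation like this ignores interaction among voters; the point of the thesis's model is precisely to add that feedback loop on top of rules of this kind.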


The 2.0-Å resolution x-ray crystal structure of a novel trimeric antibody fragment, a “triabody,” has been determined. The trimer is made up of polypeptides constructed in a manner identical to that previously described for some “diabodies”: a VL domain directly fused to the C terminus of a VH domain—i.e., without any linker sequence. The trimer has three Fv heads with the polypeptides arranged in a cyclic, head-to-tail fashion. For the particular structure reported here, the polypeptide was constructed with a VH domain from one antibody fused to the VL domain from an unrelated antibody giving rise to “combinatorial” Fvs upon formation of the trimer. The structure shows that the exchange of the VL domain from antibody B1-8, a Vλ domain, with the VL domain from antibody NQ11, a Vκ domain, leads to a dramatic conformational change in the VH CDR3 loop of antibody B1-8. The magnitude of this change is similar to the largest of the conformational changes observed in antibody fragments in response to antigen binding. Combinatorial pairing of VH and VL domains constitutes a major component of antibody diversity. Conformationally flexible antigen-binding sites capable of adapting to the specific CDR3 loop context created upon VH–VL pairing may be employed by the immune system to maximize the structural diversity of the immune response.


We tested whether severe congestive heart failure (CHF), a condition associated with excess free-water retention, is accompanied by altered regulation of the vasopressin-regulated water channel, aquaporin-2 (AQP2), in the renal collecting duct. CHF was induced by left coronary artery ligation. Compared with sham-operated animals, rats with CHF had severe heart failure with elevated left ventricular end-diastolic pressures (LVEDP): 26.9 ± 3.4 vs. 4.1 ± 0.3 mmHg, and reduced plasma sodium concentrations (142.2 ± 1.6 vs. 149.1 ± 1.1 mEq/liter). Quantitative immunoblotting of total kidney membrane fractions revealed a significant increase in AQP2 expression in animals with CHF (267 ± 53%, n = 12) relative to sham-operated controls (100 ± 13%, n = 14). In contrast, immunoblotting demonstrated a lack of an increase in expression of AQP1 and AQP3 water channel expression, indicating that the effect on AQP2 was selective. Furthermore, postinfarction animals without LVEDP elevation or plasma Na reduction showed no increase in AQP2 expression (121 ± 28% of sham levels, n = 6). Immunocytochemistry and immunoelectron microscopy demonstrated very abundant labeling of the apical plasma membrane and relatively little labeling of intracellular vesicles in collecting duct cells from rats with severe CHF, consistent with enhanced trafficking of AQP2 to the apical plasma membrane. The selective increase in AQP2 expression and enhanced plasma membrane targeting provide an explanation for the development of water retention and hyponatremia in severe CHF.


The levels in Sn-129 populated by the beta(-) decay of In-129 isomers were investigated at the ISOLDE facility at CERN using the newly commissioned ISOLDE Decay Station (IDS). The lowest 1/2(+) state and the 3/2(+) ground state in Sn-129 are expected to have configurations dominated by the neutron s(1/2) (l = 0) and d(3/2) (l = 2) single-particle states, respectively. Consequently, these states should be connected by a somewhat slow l-forbidden M1 transition. Using fast-timing spectroscopy, we have measured the half-life of the 1/2(+) 315.3-keV state, T-1/2 = 19(10) ps, which corresponds to a moderately fast M1 transition. Shell-model calculations using the CD-Bonn effective interaction, with standard effective charges and g factors, predict a 4-ns half-life for this level. We can reconcile the shell-model calculations with the measured T-1/2 value by renormalizing the M1 effective operator for neutron holes.


The overall objective of this project is the study, development, and experimental evaluation of different techniques and systems based on Human Language Technologies (HLT) for building the next generation of intelligent digital-information processing systems (modeling, retrieval, treatment, understanding, and discovery), addressing the current challenges of digital communication. In this new scenario, systems must incorporate reasoning capabilities that uncover the subjectivity of information in all its contexts (spatial, temporal, and emotional), analyzing the different dimensions of use (multilinguality, multimodality, and register).


The exponential growth of subjective information in the framework of Web 2.0 has created the need for Natural Language Processing tools able to analyze and process such data for multiple practical applications. These tools require training on specifically annotated corpora whose level of detail is fine enough to capture the phenomena involved. This paper presents EmotiBlog, a fine-grained annotation scheme for subjectivity. We show how it is built and demonstrate the benefits it brings to systems trained with it, through experiments on opinion mining and emotion detection. We employ corpora of different textual genres: a set of annotated reported speech extracted from news articles, the set of news titles annotated with polarity and emotion from SemEval 2007 (Task 14), and ISEAR, a corpus of real-life self-expressed emotion. We also show how the model built from the EmotiBlog annotations can be enhanced with external resources. The results demonstrate that EmotiBlog, through its structure and annotation paradigm, offers high-quality training data for systems dealing with both opinion mining and emotion detection.
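To make the idea of fine-grained subjectivity annotation concrete, here is a sketch of what annotated spans and a simple corpus statistic might look like. The field names and labels are hypothetical illustrations of fine-grained annotation in general, not the actual EmotiBlog schema:

```python
# Sketch of fine-grained subjectivity annotations over text spans.
# Field names and labels are hypothetical, not the EmotiBlog schema.
annotations = [
    {"span": (0, 14), "text": "I really loved", "subjective": True,
     "polarity": "positive", "emotion": "joy", "intensity": "high"},
    {"span": (15, 24), "text": "the movie", "subjective": False,
     "polarity": None, "emotion": None, "intensity": None},
]

def corpus_stats(annotated):
    """Simple aggregates a trained system could learn from."""
    subjective = [a for a in annotated if a["subjective"]]
    return {
        "subjective_ratio": len(subjective) / len(annotated),
        "emotions": sorted({a["emotion"] for a in subjective}),
    }

print(corpus_stats(annotations))
# → {'subjective_ratio': 0.5, 'emotions': ['joy']}
```

The value of the fine granularity is that polarity, emotion, and intensity are attached to sub-sentence spans rather than whole documents, which is what lets a classifier learn the phenomena involved.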


Recently, many efforts have been made in the academic world to adapt degree programs to the new European Higher Education Area (EHEA). New technologies have been the most important factor in carrying out this adaptation. In particular, 2.0 tools have spread quickly beyond the Web itself into all educational levels. Nevertheless, it is now necessary to evaluate whether all these efforts, and all the changes made in order to improve students' academic performance, have produced good results. The aim of this paper is therefore to study the impact of implementing information and communication technologies (ICTs) in a subject belonging to a Master's program at the University of Alicante in the 2010-2011 academic year: an elective course called "Advanced Visual Ergonomics" in the Master in Clinical Optometry and Vision. The methodology used to teach this course differs from the traditional one in many respects. For example, one of the resources used is a blog developed specifically to coordinate a series of virtual assignments whose purpose is to have students explore specific aspects of the current topic in depth. The student then takes an active role by writing a personal assessment on the blog. The course planning also includes attendance-based lessons, in which the teacher presents certain issues in a more traditional way, that is, a lecture supported by audiovisual materials such as PowerPoint slides. To evaluate the quality of the results achieved with this methodology, this work collects the personal assessments of the students who completed the course during this academic year. In particular, we want to know their opinion of the resources used and of the methodology followed. The tool used to collect this information was a questionnaire.

The questionnaire evaluates different aspects of the course: general opinion, quality of the information received, satisfaction with the methodology followed, and the students' critical awareness. The design of this questionnaire is very important for obtaining conclusive information about the methodology followed in the course. It must have an appropriate number of questions: if it has too many, it may bore students, who will then not pay enough attention. The questions should be well written, with a clear structure and message, to avoid confusion and ambiguity. They should be objective, without suggesting a desired answer. In addition, the questionnaire should be interesting, to encourage students' engagement. In conclusion, the questionnaire developed for this subject provided good information for evaluating whether the methodology was a useful tool for teaching "Advanced Visual Ergonomics". Furthermore, the students' opinions collected through this questionnaire may be very helpful for improving this didactic resource.


This paper analyzes the learning experiences and opinions obtained from a group of undergraduate students in their interaction with several online multimedia resources included in a free online course about computer networks. These new educational resources are based on the Web 2.0 approach, such as blogs, videos, and virtual labs, which have been added to a website for distance self-learning.


Context: Today’s project managers have a myriad of methods to choose from for the development of software applications. However, they lack empirical data about the character of these methods in terms of usefulness, ease of use or compatibility, all of these being relevant variables to assess the developer’s intention to use them. Objective: To compare three methods, each following a different paradigm (Model-Driven, Model-Based and Code-Centric) with respect to their adoption potential by junior software developers engaged in the development of the business layer of a Web 2.0 application. Method: We have conducted a quasi-experiment with 26 graduate students of the University of Alicante. The application developed was a Social Network, which was organized around a fixed set of modules. Three of them, similar in complexity, were used for the experiment. Subjects were asked to use a different method for each module, and then to answer a questionnaire that gathered their perceptions during such use. Results: The results show that the Model-Driven method is regarded as the most useful, although it is also considered the least compatible with previous developers’ experiences. They also show that junior software developers feel comfortable with the use of models, and that they are likely to use them if the models are accompanied by a Model-Driven development environment. Conclusions: Despite their relatively low level of compatibility, Model-Driven development methods seem to show a great potential for adoption. That said, however, further experimentation is needed to make it possible to generalize the results to a different population, different methods, other languages and tools, different domains or different application sizes.


To analyze the reception of children's and young-adult literature on the internet, blogs remain a fundamental space for critical opinion, academic reflection, and teaching practice, forming the central axis of the concept of LIJ 2.0 (children's and young-adult literature 2.0). After analyzing this concept, we first present the main LIJ blogs in Spanish, which include spaces run by writers and illustrators, reading-promotion proposals, repositories of works, and literary criticism, focusing more on dissemination and reading promotion than on literary creation. We then present the didactic itinerary developed over five years at the University of Alicante to exploit these tools at different educational levels.