947 results for application framework


Relevance: 100.00%

Abstract:

In the following document you will find, in a clear and accessible way and through the creation of a simple application, the mechanism for building a J2EE application based on the Jakarta Struts development framework. The document starts from scratch: from the initial capture of requirements, through the analysis and design stage, to the subsequent implementation.
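For context on what such a tutorial builds toward: in Struts 1.x, each request handler is an Action subclass wired up in struts-config.xml. A minimal sketch follows; the class name, forward name and attribute are illustrative assumptions, not taken from the document.

```java
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Hypothetical controller: Struts invokes execute() for requests mapped
// to this Action in struts-config.xml, and we return the next view.
public class ListItemsAction extends Action {
    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) throws Exception {
        // Expose model data to the JSP view layer.
        request.setAttribute("items", java.util.List.of("a", "b"));
        return mapping.findForward("success"); // forward declared in struts-config.xml
    }
}
```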

Relevance: 100.00%

Abstract:

This paper describes the open source framework MARVIN for rapid application development in the field of biomedical and clinical research. MARVIN applications consist of modules that can be plugged together in order to provide the functionality required for a specific experimental scenario. Application modules work on a common patient database that is used to store and organize medical data as well as derived data. MARVIN provides a flexible input/output system with support for many file formats including DICOM, various 2D image formats and surface mesh data. Furthermore, it implements an advanced visualization system and interfaces to a wide range of 3D tracking hardware. Since it uses only highly portable libraries, MARVIN applications run on Unix/Linux, Mac OS X and Microsoft Windows.
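The module mechanism described above can be pictured with a small sketch of a plug-in interface. This is a generic illustration of the pattern, not MARVIN's actual API, which the abstract does not give; all names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Generic plug-in pattern: each module registers against a shared
// patient database and contributes one piece of functionality.
interface ApplicationModule {
    String name();
    void initialize(PatientDatabase db); // shared store for medical and derived data
}

class PatientDatabase { /* stores and organizes medical data and derived data */ }

class Application {
    private final PatientDatabase db = new PatientDatabase();
    private final List<ApplicationModule> modules = new ArrayList<>();

    // Modules are "plugged together" per experimental scenario.
    void plug(ApplicationModule m) {
        m.initialize(db);
        modules.add(m);
    }
}
```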

Relevance: 80.00%

Abstract:

In the medical field, images obtained from high-definition cameras and other medical imaging systems are an integral part of medical diagnosis. The analysis of these images is usually performed by physicians, who sometimes need to spend long hours reviewing the images before they are able to come up with a diagnosis and decide on a course of action. In this dissertation we present a framework for computer-aided analysis of medical imagery via the use of an expert system. While this problem has been discussed before, we consider a system based on mobile devices. Since the release of the iPhone in 2007, the popularity of mobile devices has increased rapidly and our lives have become more reliant on them. This popularity, and the ease of developing mobile applications, has now made it possible to perform on these devices many of the image analyses that previously required a personal computer. All of this has opened the door to a whole new set of possibilities and freed physicians from their reliance on desktop machines. The approach proposed in this dissertation aims to capitalize on these newfound opportunities by providing a framework for the analysis of medical images that physicians can use from their mobile devices, thus removing their reliance on desktop computers. We also provide an expert system to aid in the analysis and to advise on the selection of a medical procedure. Finally, we enable other mobile applications to be developed by providing a generic mobile application development framework that allows other applications to be brought into the mobile domain. In this dissertation we outline our work leading towards the development of the proposed methodology and the remaining work needed to find a solution to the problem. To make this difficult problem tractable, we divide it into three parts: the development of a user interface modeling language and tooling, the creation of a game development modeling language and tooling, and the development of a generic mobile application framework. To make the problem more manageable, we narrow the initial scope to the hair transplant and glaucoma domains.

Relevance: 70.00%

Abstract:

In this project I designed and implemented a presentation framework for J2EE applications. Nowadays, to develop web applications we must ensure they exhibit a set of qualities that make them useful: efficiency, extensibility, reusability and portability. Otherwise, our software will have no place in the market. To achieve these qualities, the established rules must be followed and frameworks must be used.

Relevance: 70.00%

Abstract:

The purpose of this project is the development of an application based on Web technologies using Spring Framework, an open-source application framework for the Java platform. A theoretical study of the characteristics of Spring is performed first, followed by the implementation of an application that uses the technology as a practical example. The first part consists of an analysis of the most significant features of Spring, collecting information on all the components of the framework needed to develop a generic application. The goal is to discover and analyze how Spring helps develop a project with an MVC architecture and how it allows seamless integration of security, internationalization and other concerns. The second part, the development of the web application, serves as a practical demonstration of how to use the knowledge gathered about Spring. The application manages a cookbook generated by a community of users. It keeps a registry of users, who must authenticate to view their personal data and modify it if they wish. Depending on the user type, users have access to different areas of the application and a different range of available actions. The main actions are viewing recipes, creating recipes, modifying or deleting one's own recipes, and modifying or deleting other users' recipes. Each recipe consists of a name, a description, a photograph of the result, estimated times, an estimated difficulty, a list of ingredients with their quantities and, finally, a series of steps with demonstrative photographs if the author wishes to add them. Administrators, a specific type of user, can access a list of users to monitor them, modify them, or grant and revoke privileges.
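To make the MVC and security integration concrete, here is a minimal sketch of the kind of Spring MVC controller such a project would contain; the class, routes, view names and role are hypothetical, not taken from the thesis.

```java
import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;

// Hypothetical controller: Spring routes GET /recipes/{id} here and
// renders the "recipe" view; access control is declared by annotation.
@Controller
public class RecipeController {

    @GetMapping("/recipes/{id}")
    public String view(@PathVariable long id, Model model) {
        model.addAttribute("recipeId", id); // data handed to the view template
        return "recipe";
    }

    @PreAuthorize("hasRole('ADMIN')") // only administrators reach the user list
    @GetMapping("/admin/users")
    public String users(Model model) {
        return "userList";
    }
}
```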

Relevance: 60.00%

Abstract:

The goal of this project is the integration of a set of technologies (graphics, physical simulation, input) with the aim of assembling an application framework in Python. A set of key introductory concepts is presented first, followed by a study of the state of the art of 3D applications. Python is selected and justified as the programming language because of the features and advantages it offers over other languages. Finally, the design and implementation of the framework are presented in the last chapter, together with some client application examples.
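The coordination pattern such frameworks are typically built around is a main loop that ticks the input, simulation and rendering subsystems in order. The sketch below shows that generic pattern (in Java, to keep all code examples in this listing in one language, although the project itself uses Python); all names and the fixed timestep are illustrative.

```java
// Illustrative main-loop pattern for a framework integrating
// input, physics and graphics subsystems; all names are hypothetical.
interface Subsystem { void update(double dtSeconds); }

class InputSystem implements Subsystem { public void update(double dt) { /* poll devices */ } }
class PhysicsSystem implements Subsystem { public void update(double dt) { /* step simulation */ } }
class RenderSystem implements Subsystem { public void update(double dt) { /* draw frame */ } }

class FrameworkLoop {
    public static void main(String[] args) throws InterruptedException {
        Subsystem[] systems = { new InputSystem(), new PhysicsSystem(), new RenderSystem() };
        final double dt = 1.0 / 60.0;                  // fixed 60 Hz timestep
        while (!Thread.currentThread().isInterrupted()) {
            for (Subsystem s : systems) s.update(dt); // input -> physics -> graphics
            Thread.sleep((long) (dt * 1000));          // crude frame pacing
        }
    }
}
```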

Relevance: 60.00%

Abstract:

Software testing examines whether an application meets its requirements. The goal is to find defects in the application and thereby improve its quality. Application quality can be defined with several metrics, such as testability. This thesis examines the implementation of automated testing for a web application using a testing framework. Automated testing includes the design and implementation of test cases, resulting in re-runnable test cases. The testing focuses on the functionality of the application and leaves the verification of data updated in the database out of scope. Testing is performed without detailed knowledge of the application's internal workings. The application under test is a web application built in the Mobilding project for managing the elements of a building. The thesis compares testing frameworks suitable for testing the user interface of a web application and highlights their characteristics. The result of the work is a set of re-runnable test cases. In addition, programming practices that facilitate automated testing are discussed, based on problems observed during the implementation.
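The abstract does not name the framework that was chosen. As an illustration of what a re-runnable, black-box UI test case looks like, here is a minimal sketch using Selenium WebDriver, a common choice for web UI test automation; the URL and element ids are hypothetical.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

// A re-runnable black-box UI test: drive the browser through one
// user-visible function and assert on what the page shows.
public class LoginTest {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/app"); // hypothetical URL
            driver.findElement(By.id("username")).sendKeys("tester");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("login")).click();
            String heading = driver.findElement(By.tagName("h1")).getText();
            if (!heading.contains("Welcome")) {
                throw new AssertionError("unexpected heading: " + heading);
            }
        } finally {
            driver.quit(); // leave no browser behind so the test can be re-run
        }
    }
}
```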

Relevance: 60.00%

Abstract:

This thesis investigated good practices and established conventions for building a web application with a long service life. It was found that, over the application's life cycle, most of the costs come from maintenance. Since the goal was an application that will be used for a long time, the share of maintenance costs had to be made as small as possible. Defects detected as early as possible in the software development process reduce repair costs substantially compared with defects found in the finished product. For this reason, the web application built in this thesis invested in the early phases of the process: specification and design. Good software development practices have a substantial effect on the maintainability and clarity of a web application. Using an existing application framework and adding functionality through ready-made software components yields an application built according to good practice. The web application implemented in this thesis was built using an application framework and a component architecture, and the implementation turned out clean. The application was divided into logical units that handle the views, the database, and the mapping of data between them. Each unit functions independently, which helps in maintaining and testing the application.
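A minimal sketch of the kind of layering the abstract describes, with one independently testable unit per concern; the interfaces and names are illustrative, not from the thesis.

```java
// Illustrative layering: view, persistence, and the mapping between
// them are separate units that can be maintained and tested alone.
interface BuildingView { String render(Building b); }        // view layer
interface BuildingRepository { Building findById(long id); } // database layer

record Building(long id, String name) {}

// Mapping layer: connects the two without either knowing the other,
// so each unit can be replaced or unit-tested with a stub.
class BuildingService {
    private final BuildingRepository repo;
    private final BuildingView view;

    BuildingService(BuildingRepository repo, BuildingView view) {
        this.repo = repo;
        this.view = view;
    }

    String show(long id) { return view.render(repo.findById(id)); }
}
```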

Relevance: 60.00%

Abstract:

User interfaces for web applications are most often built with application frameworks. A web application framework can be chosen from numerous ready-made alternatives, but the choice is problematic: although there are many good options, each web application framework is at its best only in certain situations. The goal of this thesis was to find a cost-effective way to compare web application frameworks. The work drew on guidelines from the literature for selecting software components and tools. With the help of these guidelines, evaluation criteria suitable for comparing web application frameworks were selected. To define the evaluation criteria, the requirements of application frameworks and of web application frameworks were studied, and the guidelines were also used to carry out a comparison in practice. The framework requirements and evaluation criteria produced by this work can be used directly in comparisons regardless of platform, and the results reduce the amount of work needed to select a web application framework. The comparison performed in this work took a reasonable amount of time and was able to identify differences between the web application frameworks.
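One common, cost-effective way to turn such evaluation criteria into a comparison is a weighted scoring matrix. The sketch below shows that generic technique only; the criteria, weights, candidate names and scores are invented for illustration, not taken from the thesis.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Weighted scoring: each framework gets a score in [0, 5] per criterion;
// the weighted sum ranks the candidates. All numbers are illustrative.
public class FrameworkScoring {
    public static void main(String[] args) {
        Map<String, Double> weights = new LinkedHashMap<>();
        weights.put("documentation", 0.40);
        weights.put("testability", 0.35);
        weights.put("community", 0.25);

        Map<String, Map<String, Double>> candidates = Map.of(
            "FrameworkA", Map.of("documentation", 4.0, "testability", 3.0, "community", 5.0),
            "FrameworkB", Map.of("documentation", 3.0, "testability", 5.0, "community", 3.0));

        candidates.forEach((name, scores) -> {
            double total = weights.entrySet().stream()
                .mapToDouble(w -> w.getValue() * scores.get(w.getKey()))
                .sum();
            System.out.printf("%s: %.2f%n", name, total);
        });
    }
}
```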

Relevance: 60.00%

Abstract:

There is growing concern about reducing greenhouse gas emissions all over the world. In the Post Copenhagen Report on Climate Change, the U.K. recently set targets of a 34% reduction in emissions by 2020 and an 80% reduction by 2050, compared with 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry to help achieve this. However, there is a clear disconnect between costs and environmental impacts over the life cycle of a built asset when these two tools are used. Besides, changes in Information and Communication Technologies (ICTs) have changed the way information is represented; in particular, information is fed more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), with little consideration given to incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool that supports sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. An application framework is also proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
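For the cost side of that trade-off, life cycle cost is conventionally the discounted sum of all costs over the asset's service life. A standard textbook formulation (the notation is the usual one, not the paper's own):

```latex
\mathrm{LCC} = C_0 + \sum_{t=1}^{T} \frac{C_t}{(1+r)^{t}}
```

where C_0 is the initial construction cost, C_t the operation and maintenance cost in year t, r the discount rate and T the service life in years.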

Relevance: 60.00%

Abstract:

INTRODUCTION: Recent advances in medical imaging have brought post-mortem, minimally invasive, computed tomography (CT) guided percutaneous biopsy to public attention. AIMS: The goal of the following study was to facilitate and automate post-mortem biopsy, to eliminate the radiation exposure to the investigator that may occur when sampling tissue under CT guidance, and to minimize the number of needle insertion attempts to a single puncture per target. METHODS AND MATERIALS: Clinically approved and post-mortem tested ACN-III biopsy core needles (14 gauge x 160 mm) with an automatic pistol device (Bard Magnum, Medical Device Technologies, Denmark) were used for specimen sampling. The needles were navigated in a gelatine/peas phantom, in an ex vivo porcine model and subsequently in two human bodies using a navigation system (MEM centre/ISTB Medical Application Framework, Marvin, Bern, Switzerland) with a guidance frame and a CT scanner (Emotion 6, Siemens, Germany). RESULTS: Biopsy of all peas could be performed within a single attempt. The average distance between the inserted needle tip and the pea centre was 1.4 mm (n=10; SD 0.065 mm; range 0-2.3 mm). The targets in the porcine liver were also accurately punctured; the average distance between the needle tip and the target was 0.5 mm (range 0-1 mm). Biopsies of brain, heart, lung, liver, pancreas, spleen and kidney were performed on human corpses. For each target the biopsy needle was inserted only once. The examination of one body, with sampling of tissue probes at the above-mentioned locations, took approximately 45 min. CONCLUSIONS: Post-mortem navigated biopsy can reliably provide tissue samples from different body locations. Since the positional data of the body and the biopsy needle are continuously updated using optical tracking, no control CT images verifying the positional data are necessary, and no radiation exposure to the investigator needs to be taken into account. Furthermore, the number of needle insertions for each target can be minimized to a single one with adequate accuracy, as proven ex vivo, and, in contrast to conventional CT guided biopsy, the insertion angle may be oblique. Navigation for minimally invasive tissue sampling is a useful addition to post-mortem CT guided biopsy.

Relevance: 60.00%

Abstract:

Modern e-learning systems represent a special type of web information system. By definition, information systems are computerized systems used to perform data operations by multiple users simultaneously. Each active user consumes an amount of hardware resources, and a shortage of hardware resources can be caused by a growing number of simultaneous users. Such a situation can result in an overall malfunctioning or slowed-down system. To avoid this problem, the underlying hardware is usually upgraded continuously; these upgrades, typically accompanied by various software updates, result in a temporarily increased amount of available resources. This work deals with the problem in a different way, by proposing an implementation of a web e-learning system with a modified software architecture that reduces the resource usage of the server part to the bare minimum. In order to implement a full-scale e-learning system that could substitute for a conventional web e-learning system, a Rich Internet Application framework was used as the basis. The technology allowed the implementation of advanced interactivity features and provided an easy transfer of a substantial part of the application logic from the server to the clients. In combination with a special server application, the server part of the new system is able to run with reasonable performance on hardware with very limited computing resources.

Relevance: 60.00%

Abstract:

Since the memristor was first built in 2008 at HP Labs, no end of devices and models have been presented, and new applications appear frequently. However, integrating the device at the circuit level is not straightforward, because the available models are still immature and/or impose high computational loads, making their simulation long and cumbersome. This study assists circuit and systems designers in integrating memristors into their applications, while aiding model developers in validating their proposals. We introduce a memristor application framework to support the work of both the model developer and the circuit designer. First, the framework includes a library with the best-known memristor models that is easily extensible with upcoming models; systematic modifications have been applied to these models to provide better convergence and significant simulation speedups. Second, a quick device simulator allows the study of the response of the models under different scenarios, helping the designer with stimulus and operation-time selection. Third, fine tuning of the device, including parameter variations and threshold determination, is also supported. Finally, SPICE/Spectre subcircuit generation is provided to ease the integration of the devices into application circuits. The framework gives the designer total control over convergence, computational load, and the evolution of system variables, overcoming the usual problems in the integration of memristive devices.
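As background, the best-known of these models is the linear ion drift model published with the 2008 HP device. In its standard form (standard notation, not this study's code), the memristance interpolates between the on and off resistances as the normalized doped-region width x drifts with the current:

```latex
v(t) = \bigl( R_{\mathrm{ON}}\, x(t) + R_{\mathrm{OFF}}\, (1 - x(t)) \bigr)\, i(t),
\qquad
\frac{dx}{dt} = \frac{\mu_v R_{\mathrm{ON}}}{D^{2}}\, i(t),
\qquad x = \frac{w}{D} \in [0,1]
```

where w is the width of the doped region, D the device thickness and \mu_v the dopant mobility. Window functions are commonly added to the dx/dt equation to model boundary behaviour, and that nonlinearity is one source of the convergence issues such frameworks must deal with.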

Relevance: 60.00%

Abstract:

Data centers are easily found in every sector of the worldwide economy. They consist of tens of thousands of servers, serving millions of users globally, 24 hours a day and 365 days a year. In recent years, e-Science applications such as e-Health or Smart Cities have experienced significant development. The need to deal efficiently with the computational needs of next-generation applications, together with the increasing demand for higher resources in traditional applications, has facilitated the rapid proliferation and growth of data centers. A drawback of this capacity growth has been the rapid and dramatic increase in the energy consumption of these facilities. In 2010, data center electricity represented 1.3% of all the electricity used in the world. In 2012 alone, global data center power demand grew 63% to 38 GW, and a further rise of 17% to 43 GW was estimated for 2013. Moreover, data centers are responsible for more than 2% of total carbon dioxide emissions. This PhD thesis addresses the energy challenge by proposing proactive and reactive thermal- and energy-aware optimization techniques that contribute to placing data centers on a more scalable curve. The work develops energy models and uses knowledge about the energy demand of the workload to be executed and the computational and cooling resources available at the data center to optimize energy consumption. Moreover, data centers are considered a crucial element within their application framework, optimizing not only the energy consumption of the facility but the global energy consumption of the application. The main contributors to the energy consumption of a data center are the computing power drawn by IT equipment and the cooling power needed to keep the servers within a temperature range that ensures safe operation. Because of the cubic relation of fan power to fan speed, solutions based on over-provisioning cold air to the server usually lead to inefficiencies. On the other hand, higher chip temperatures lead to higher leakage power because of the exponential dependence of leakage on temperature. Moreover, workload characteristics as well as allocation policies have an important impact on the leakage-cooling tradeoffs. The first key contribution of this work is the development of power and temperature models that accurately describe the leakage-cooling tradeoffs at the server level, and the proposal of strategies to minimize server energy via joint cooling and workload management from a multivariate perspective. When scaling to the data center level, a similar behavior in terms of leakage-temperature tradeoffs can be observed: as room temperature rises, the efficiency of the data room cooling units improves; however, CPU temperature rises as well, and so does leakage power. Moreover, the thermal dynamics of a data room exhibit unbalanced patterns due to both the workload allocation and the heterogeneity of the computing equipment. The second main contribution is the proposal of thermal- and heterogeneity-aware workload management techniques that jointly optimize the allocation of computation and cooling to servers. These strategies need to be backed by flexible room-level models, able to work at runtime, that describe the system from a high-level perspective. Within the framework of next-generation applications, decisions taken at this scope can have a dramatic impact on the energy consumption of lower abstraction levels, i.e. the data center facility. It is important to consider the relationships between all the computational agents involved in the problem, so that they can cooperate to achieve the common goal of reducing energy in the overall system. The third main contribution is the energy optimization of the overall application by evaluating the energy costs of performing part of the processing in any of the different abstraction layers, from the node to the data center, via workload management and off-loading techniques. In summary, the work presented in this PhD thesis makes contributions to leakage- and cooling-aware server modeling and optimization, data center thermal modeling and heterogeneity-aware data center resource allocation, and develops mechanisms for the energy optimization of next-generation applications from a multi-layer perspective.
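The two physical relations this abstract leans on can be stated compactly. In their usual first-order forms (generic textbook relations, not the thesis's fitted models):

```latex
P_{\mathrm{fan}} \propto \omega^{3},
\qquad
P_{\mathrm{leak}}(T) \approx P_{0}\, e^{\beta (T - T_{0})}
```

where \omega is fan speed, T is chip temperature, and P_0, \beta, T_0 are device-dependent constants. Doubling fan speed thus costs roughly eight times the fan power, while letting chips run hotter multiplies leakage; that is the leakage-cooling tradeoff the thesis models and optimizes.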

Relevance: 60.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06