951 results for Electronic Model


Relevance:

30.00%

Publisher:

Abstract:

ZnO has proven to be a multifunctional material with important nanotechnological applications. ZnO nanostructures can be grown in various forms such as nanowires, nanorods, nanobelts, and nanocombs. In this work, ZnO nanostructures are grown in a double-quartz-tube thermal chemical vapor deposition (CVD) system. We focus on functionalizing ZnO nanostructures by controlling their structure and tuning their properties for various applications. The following topics have been investigated. 1. We fabricated various ZnO nanostructures using a thermal CVD technique; the growth parameters were optimized and studied for each nanostructure. 2. We studied the application of ZnO nanowires (NWs) in field-effect transistors (FETs). Unintentional n-type conductivity was observed in FETs based on as-grown ZnO NWs. We then showed, for the first time, that controlled incorporation of hydrogen into ZnO NWs can introduce p-type character to the nanowires. We further found that the n-type behavior remained, making hydrogen-incorporated ZnO NWs ambipolar. Importantly, the detected p- and n-type behaviors are stable for longer than two years when devices are kept in ambient conditions. All of these observations can be explained by an ab initio model of zinc vacancy-hydrogen complexes, which can act as donors, acceptors, or green-photoluminescence quenchers, depending on the number of hydrogen atoms involved. 3. Next, ZnO NWs were tested for electron field emission, with a focus on reducing the threshold field (Eth) of field emission from non-aligned ZnO NWs. Encouraged by our results on enhancing the conductivity of ZnO NWs by hydrogen annealing described in Chapter 3, we studied the effect of hydrogen annealing on the field emission behavior of our ZnO NWs and found that optimally annealed NWs offered a much lower threshold electric field and improved emission stability.
We also studied field emission from ZnO NWs at moderate vacuum levels and found that a minimum Eth exists as the threshold field is scaled with pressure; this behavior is explained by reference to Paschen's law. 4. We studied the application of ZnO nanostructures to solar energy harvesting. First, as-grown and (CdSe)ZnS quantum dot (QD)-decorated ZnO NBs and ZnO NWs were tested for photocurrent generation. All of these nanostructures responded quickly to solar radiation. Decoration with QDs decreases the stable current level produced by ZnO NWs but increases that generated by NBs; it is possible that NBs offer more stable surfaces for the attachment of QDs. In addition, our results suggest that the performance degradation of solar cells made by growing ZnO NWs on ITO is due to the increase in the resistance of ITO after the high-temperature growth process. Hydrogen annealing also improves the efficiency of these solar cells by decreasing the resistance of ITO. Because of the issues with ITO, we used Ni foil as the growth substrate. The performance of solar cells made by growing ZnO NWs on Ni foils degraded after hydrogen annealing at both low (300 °C) and high (600 °C) temperatures, since annealing passivates native defects in ZnO NWs and thus reduces the absorption of the visible spectrum from our solar simulator. Decoration with QDs improves the efficiency of such solar cells by increasing absorption of light in the visible region, and using a better electrolyte than phosphate buffer solution (PBS), such as KI, also improves the efficiency. 5. Finally, we attempted p-type doping of ZnO NWs using various growth precursors, including phosphorus pentoxide, sodium fluoride, and zinc fluoride. We also attempted to create p-type carriers by introducing interstitial fluorine through annealing ZnO nanostructures in diluted fluorine gas. In brief, we were unable to reproduce the growth of reported p-type ZnO nanostructures.
However, we have identified a window of temperature and duration for post-growth annealing of ZnO NWs in dilute fluorine gas that leads to suppression of native defects. This is the first experimental effort on post-growth annealing of ZnO NWs in dilute fluorine gas, although it has been suggested by a recent theory for creating p-type semiconductors. In our experiments, the defect-band peak due to native defects decreases upon annealing at 300 °C for 10–30 minutes. A major piece of future work will be to determine the type of charge carriers in our annealed ZnO NWs.

Relevance:

30.00%

Publisher:

Abstract:

This document demonstrates the methodology used to create an energy- and conductance-based model for power electronic converters. The work is intended as a replacement for voltage- and current-based models, which have limited applicability to the network nodal equations. Conductance-based modeling allows direct application of load differential equations to the bus admittance matrix (Y-bus) with a unified approach. When applied directly to the Y-bus, the system becomes much easier to simulate, since the state variables do not need to be transformed. The proposed transformation applies to loads, sources, and energy storage systems and is useful for DC microgrids. Transformed state models of a complete microgrid are compared to experimental results and show that the models accurately reflect the system's dynamic behavior.
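
As a rough illustration of the conductance-based approach described above, the sketch below solves the DC nodal equation G·v = i for a small hypothetical network; the matrix values and current injections are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical 3-bus DC network: G is the bus conductance (Y-bus) matrix in
# siemens, i is the vector of injected currents in amperes. Nodal equation:
# G @ v = i, solved directly for the bus voltages v.
G = np.array([
    [ 1.5, -0.5, -1.0],   # bus 1: self-conductance and line couplings
    [-0.5,  1.0, -0.5],   # bus 2
    [-1.0, -0.5,  2.5],   # bus 3 includes a 1.0 S shunt load to ground
])
i = np.array([10.0, 0.0, -2.0])  # injections (sources positive, loads negative)

v = np.linalg.solve(G, i)        # bus voltages in volts
print(v)
```

Because loads enter the Y-bus as conductances, adding or removing a load only edits matrix entries; no state transformation is needed before simulation.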

Relevance:

30.00%

Publisher:

Abstract:

Telecommunications have developed at an incredible speed over the last couple of decades. The decreasing size of our phones and the increasing number of ways in which we can communicate are hardly the only results of this (r)evolutionary development, which has multiple implications. The change of paradigm for telecommunications regulation, epitomised by the processes of liberalisation and reregulation, was not sufficient to answer all regulatory questions pertinent to communications. Today, after the transition from monopoly to competition, we face perhaps an even harder regulatory puzzle: how to regulate a sector that is as dynamic and unpredictable as electronic communications have proven to be, and as vital and fundamental to the economy and to society at large. The present book addresses the regulatory puzzle of contemporary electronic communications and suggests the outlines of a coherent model for their regulation. The search for such a model centres on the question "Can competition law do it all?", since generic competition rules are widely seen as the appropriate regulatory tool for the communications domain. This perception was the gist of the 2002 reform of the European Community (EC) telecommunications regime, which envisages a withdrawal of sectoral regulation as communications markets become effectively competitive, ultimately bestowing the regulation of the sector upon competition law alone. The book argues that the question of whether competition law is the appropriate tool needs to be examined not in the conventional contexts of sector-specific rules versus competition rules or deregulation versus regulation, but in a broader governance context. Consequently, the reader is provided with an insight into the workings and specific characteristics of the communications sector as network-bound, converging, dynamic, and endowed with a special societal role and function.
A thorough evaluation of the regulatory objectives in the communications environment further contributes to a comprehensive picture of the communications industry. On this carefully prepared basis, the book analyses the communications regulatory toolkit. It explores the interplay between sectoral communications regulation, competition rules (in particular Article 82 of the EC Treaty) and the rules of the World Trade Organization (WTO) relevant to telecommunications services. The in-depth analysis of the multilevel construct of EC communications law is up to date and takes into account important recent developments in EC competition law practice, in particular in the fields of refusal to supply and tying, the reform of the EC electronic communications framework, and new decisions of the WTO dispute settlement body, notably the Mexico-Telecommunications Services Panel Report. On these building blocks, an assessment of the regulatory potential of the EC competition rules is made. The conclusions drawn go beyond the current situation of EC electronic communications and the applicable law, and explore the possible contours of an optimal regulatory framework for modern communications. The book is of particular interest to communications and antitrust law experts, as well as policy makers, government agencies, consultancies and think-tanks active in the field. Experts on other network industries (such as electricity or postal communications) can also profit from the substantial experience gathered in the communications sector, the most advanced in terms of liberalisation and reregulation.


Relevance:

30.00%

Publisher:

Abstract:

The common mantra in telecommunications regulatory fora (be they national, regional or international) now goes along the lines of 'deregulation good, regulation bad', and competition is said to be the ultimate answer to basically every question. A generalised dictum like this is in itself suspicious, and even more so when it refers to a sector such as telecommunications, which has a history of heavy regulation and has been the very epitome of state intervention. In the contemporary environment of vibrant communications, subscribing to a purely 'black-or-white' approach may be, to put it mildly, unsafe. Before settling the question of the appropriate regulatory model for communications markets, it is essential to figure out what goals are to be pursued, in order to consider what kind of measures could bring about their attainment. In the words of Robert Bork, 'only when the issue of goals has been settled is it possible to frame a coherent body of substantive rules'. Against this backdrop, the present paper looks into the goals and objectives of telecommunications regulation, their complexity, and the inherent tension between commercial and public interests.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Tuberculosis (TB) is the leading cause of death in South Africa. The burden of disease varies by age, with peaks in TB notification rates in the HIV-negative population at ages 0-5, 20-24, and 45-49 years. There is little variation between age groups in the rates in the HIV-positive population. The drivers of this age pattern remain unknown. METHODS We developed an age-structured simulation model of Mycobacterium tuberculosis (Mtb) transmission in Cape Town, South Africa. We considered five states of TB progression: susceptible, infected (latent TB), active TB, treated TB, and treatment default. Latently infected individuals could be re-infected, and a previous Mtb infection slowed progression to active disease. We further considered three states of HIV progression: HIV negative, HIV positive, and HIV positive on antiretroviral therapy. To parameterize the model, we analysed treatment outcomes from the Cape Town electronic TB register and social mixing patterns from a Cape Town community, and used literature estimates for the other parameters. To investigate the main drivers behind the age patterns, we conducted sensitivity analyses on all parameters related to the age structure. RESULTS The model replicated the age patterns in HIV-negative TB notification rates in Cape Town in 2009. The simulated TB notification rate in HIV-negative patients was 1000/100,000 person-years (pyrs) in children aged <5 years and decreased to 51/100,000 pyrs in children aged 5-15 years. The peak in early adulthood occurred at 25-29 years (463/100,000 pyrs). After a subsequent decline, simulated TB notification rates gradually increased from the age of 30 years. Sensitivity analyses showed that the dip after the early adult peak was due to the protective effect of latent TB, and that retreatment TB was mainly responsible for the rise in TB notification rates from the age of 30 years.
CONCLUSION The protective effect of a first latent infection on subsequent infections, and the faster progression in previously treated patients, are the key determinants of the age structure of TB notification rates in Cape Town.
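
The latent-protection effect highlighted in the conclusion can be caricatured with a toy compartmental model; the states, rates, and `protection` factor below are hypothetical illustrations, not the parameters of the Cape Town model.

```python
# Toy Euler-integrated sketch of how a prior latent infection slows progression.
# States: S (susceptible), L (latent), I (active TB), as population shares.
# All rate constants are invented for illustration only.
beta, progression, protection, recovery = 0.3, 0.05, 0.5, 0.2
S, L, I = 0.95, 0.04, 0.01
dt = 0.1  # years per step
for _ in range(1000):
    foi = beta * I                                   # force of infection
    dS = -foi * S
    # re-infection of latents progresses at a reduced rate (1 - protection);
    # treated cases return to the latent pool for simplicity
    dL = foi * S - progression * L - (1 - protection) * foi * L + recovery * I
    dI = progression * L + (1 - protection) * foi * L - recovery * I
    S, L, I = S + dt * dS, L + dt * dL, I + dt * dI
print(S, L, I)  # the three shares still sum to 1
```

Setting `protection` closer to 1 suppresses re-infection disease, reproducing qualitatively the post-peak dip the sensitivity analyses attribute to latent protection.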

Relevance:

30.00%

Publisher:

Abstract:

The study of operations on representations of objects is well documented in the realm of spatial engineering. However, the mathematical structure and formal proof of these operational phenomena are not thoroughly explored. Other works have often focused on query-based models that seek to order classes and instances of objects in the form of semantic hierarchies or graphs. In some models, nodes of graphs represent objects and are connected by edges that represent different types of coarsening operators. This work, however, studies how the coarsening operator "simplification" can manipulate partitions of finite sets, independent of objects and their attributes. Partitions that are simplified first have a collection of elements filtered (removed), and then the remaining partition is amalgamated (some sub-collections are unified). Simplification has many interesting mathematical properties: a finite composition of simplifications can also be accomplished by some single simplification, and if one partition is a simplification of another, the simplified partition is defined to be less than the other under the "simp" relation. This relation is shown to be a partial order based on simplification. Collections of partitions can not only be proven to have a partial-order structure, but also form a complete lattice. In regard to a geographic information system (GIS), partitions related to subsets of attribute domains for objects are called views. Objects belong to different views based on whether or not their attribute values lie in the underlying view domain. Given a particular view, objects whose attribute n-tuple codings are contained in the view are part of the actualization set on views, and objects are labeled according to the particular subset of the view in which their coding lies.
Though the scope of the work does not focus primarily on queries related directly to geographic objects, it verifies the existence of particular views in a system with this underlying structure. Given a finite attribute domain, one can say with mathematical certainty that different views of objects are partially ordered by simplification, and that every collection of views has a greatest lower bound and a least upper bound, which validates exploring queries in this setting.
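
One informal reading of the filter-then-amalgamate operation can be sketched as follows; the predicate and the example partitions are illustrative, not the thesis's formal definition.

```python
from itertools import chain

# Illustrative reading of "simplification": P simp Q if P can be obtained from
# Q by filtering out some elements and then amalgamating (unifying) blocks.
# Partitions are modelled as sets of frozensets of elements.
def is_simplification(P, Q):
    ground_p = set(chain.from_iterable(P))
    # filtering step: restrict Q's blocks to the elements surviving in P
    restricted = [b & ground_p for b in Q if b & ground_p]
    # amalgamation step: each restricted Q-block must land inside one P-block
    return all(any(b <= p for p in P) for b in restricted)

Q = {frozenset({1}), frozenset({2}), frozenset({3, 4})}
P = {frozenset({1, 2}), frozenset({3})}  # drop element 4, merge {1} and {2}
print(is_simplification(P, Q))  # True: filtering then merging suffices
```

Under this reading, splitting a block is never allowed, which is what makes "simp" an order relation rather than an arbitrary transformation.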

Relevance:

30.00%

Publisher:

Abstract:

A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want and desire. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data, and those using GIS to meet specific needs, to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently creating datasets in common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through data clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open-access licenses, archival services and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.

Relevance:

30.00%

Publisher:

Abstract:

Information technology (IT) in the hospital organization is fast becoming a key asset, particularly in light of recent reform legislation in the United States calling for an expanded role of IT in our health care system. Future payment reductions to hospitals included in current health reform are based on expected improvements in hospital operating efficiency. Since over half of hospital expenses are for labor, improved efficiency in the use of labor resources can be critical in meeting this challenge. Policy makers have touted the value of IT investments to improve efficiency in response to payment reductions. This study was the first to directly examine the relationship between electronic health record (EHR) technology and staffing efficiency in hospitals. As the hospital has a myriad of outputs for inpatient and outpatient care, efficiency was measured using an industry-standard performance metric: full-time-equivalent employees per adjusted occupied bed (FTE/AOB). Three hypotheses were tested. To operationalize EHR technology adoption, we developed three constructs to model adoption, each tested by a separate hypothesis. The first hypothesis, that a larger number of EHR applications used by a hospital would be associated with greater staffing efficiency (that is, lower values of FTE/AOB), was not accepted. The second hypothesis, an association between staffing efficiency and specific EHR applications, was accepted, with some applications showing significant impacts on observed values of FTE/AOB. Finally, the hypothesis that longer use of an EHR application in a hospital would be associated with greater labor efficiency was not accepted, as the model showed few statistically significant relationships to FTE/AOB performance.
Overall, there does not appear to be a strong relationship between EHR usage and improved labor efficiency in hospitals. While returns on investment from EHR usage may not come from labor efficiencies, they may be better sought in measures of quality, contribution to an efficient and effective local health care system, and improved customer satisfaction through greater patient throughput.
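
For readers unfamiliar with the metric, one common formulation of FTE/AOB is computed below; all figures are hypothetical, and the revenue-based adjustment is a standard industry approximation rather than necessarily the study's exact method.

```python
# Hypothetical hospital figures. Adjusted occupied beds (AOB) scale the
# inpatient census by the ratio of total to inpatient gross revenue, a
# common way to fold outpatient workload into a bed-based denominator.
fte = 1200.0                   # full-time-equivalent employees
average_daily_census = 180.0   # average occupied inpatient beds
total_revenue = 50_000_000.0
inpatient_revenue = 30_000_000.0

aob = average_daily_census * (total_revenue / inpatient_revenue)
fte_per_aob = fte / aob        # lower values indicate higher staffing efficiency
print(round(fte_per_aob, 2))   # 4.0
```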

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the potential usefulness of an AGE model with a Melitz-type trade specification for assessing the economic effects of technical regulations, taking the EU ELV/RoHS directives as an example. Simulation experiments reveal that: (1) raising the fixed cost of exporting to the EU market causes exports of the targeted commodities (motor vehicles and parts for ELV, and electronic equipment for RoHS) to the EU from outside regions/countries to expand while intra-EU trade shrinks, when the importer's preference for variety (PfV) is not strong; (2) if the PfV is not strong, policy changes that reduce the number of firms enable the surviving high-productivity producers to expand into large-scale mass producers that fully enjoy economies of scale; and (3) as the strength of the importer's PfV is varied from zero to unity, there is a value at which the simulation results and their interpretation change entirely.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation, whose research was conducted in the Group of Electronic and Microelectronic Design (GDEM) within the framework of the project Power Consumption Control in Multimedia Terminals (PCCMUTE), focuses on the development of an energy estimation model for a battery-powered embedded processor board. The main objectives and contributions of the work are summarized as follows. A model is proposed to obtain accurate energy estimates based on the linear correlation between performance monitoring counters (PMCs) and energy consumption. Given the uniqueness of the appropriate PMCs for each system, the modeling methodology is improved to obtain stable accuracy with only slight variation across multiple scenarios and to be repeatable on other systems. It includes two steps: the first, the PMC filter, identifies the most appropriate set among the PMCs available on a system; the second, k-fold cross validation, avoids bias during the model training stage. The methodology is implemented on a commercial embedded board running the 2.6.34 Linux kernel and PAPI, a cross-platform interface for configuring and accessing PMCs. The results show that the methodology maintains good stability across different scenarios and provides robust estimates, with an average relative error of less than 5%.
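
A minimal sketch of the modeling idea, assuming synthetic counter data: energy is fitted as a linear combination of PMC readings and evaluated with k-fold cross validation. The counter values, weights, and noise level below are invented for illustration.

```python
import numpy as np

# Synthetic sketch: energy is assumed linear in the selected PMC readings,
# and k-fold cross validation guards against bias from any single split.
rng = np.random.default_rng(0)
n, k = 100, 5
pmcs = rng.uniform(0, 1e6, size=(n, 3))          # three selected PMCs
true_w = np.array([2e-6, 5e-7, 1e-6])            # invented per-event costs (J)
energy = pmcs @ true_w + rng.normal(0, 0.01, n)  # measured energy with noise

folds = np.array_split(np.arange(n), k)
errors = []
for fold in folds:
    train = np.setdiff1d(np.arange(n), fold)
    # least-squares fit of the linear model on the training folds
    w, *_ = np.linalg.lstsq(pmcs[train], energy[train], rcond=None)
    pred = pmcs[fold] @ w
    errors.append(np.mean(np.abs(pred - energy[fold]) / energy[fold]))
avg_rel_error = float(np.mean(errors))
print(avg_rel_error)  # average relative error on held-out folds
```

Averaging the error over held-out folds, rather than reporting the training fit, is what makes the reported accuracy repeatable across scenarios.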

Relevance:

30.00%

Publisher:

Abstract:

Nitrogen sputtering yields as high as 10^4 atoms/ion are obtained by irradiating N-rich Cu3N films (N concentration: 33 ± 2 at.%) with Cu ions at energies in the range 10–42 MeV. The kinetics of N sputtering as a function of ion fluence is determined at several energies (stopping powers) for films deposited on both glass and silicon substrates. The kinetic curves show that the amount of nitrogen released increases strongly with irradiation fluence until reaching a saturation level at a low remaining nitrogen fraction (5–10%), beyond which no further nitrogen reduction is observed. The sputtering rate for nitrogen depletion is found to be independent of the substrate and to increase linearly with electronic stopping power (Se). A stopping-power threshold (Sth) of ~3.5 keV/nm for nitrogen depletion has been estimated by extrapolation of the data. The experimental kinetic data have been analyzed within a bulk molecular recombination model, and the microscopic mechanisms of the nitrogen depletion process are discussed in terms of a non-radiative exciton decay model. In particular, the estimated threshold is related to the minimum exciton density required to achieve efficient sputtering rates.
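
The threshold extrapolation can be illustrated with a simple linear fit: with rate linear in Se, Sth is the x-intercept of the fitted line. The (Se, rate) pairs below are invented so that the intercept lands near the ~3.5 keV/nm value quoted above; they are not the paper's data.

```python
import numpy as np

# Hypothetical (Se, sputtering-rate) pairs illustrating the linear dependence
# of the nitrogen depletion rate on electronic stopping power.
se = np.array([5.0, 7.0, 9.0, 11.0])    # electronic stopping power, keV/nm
rate = np.array([1.5, 3.5, 5.5, 7.5])   # depletion rate, arbitrary units

slope, intercept = np.polyfit(se, rate, 1)
s_th = -intercept / slope               # rate extrapolates to zero here
print(round(s_th, 2))                   # 3.5 keV/nm for these invented points
```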

Relevance:

30.00%

Publisher:

Abstract:

Antecedentes Europa vive una situación insostenible. Desde el 2008 se han reducido los recursos de los gobiernos a raíz de la crisis económica. El continente Europeo envejece con ritmo constante al punto que se prevé que en 2050 habrá sólo dos trabajadores por jubilado [54]. A esta situación se le añade el aumento de la incidencia de las enfermedades crónicas, relacionadas con el envejecimiento, cuyo coste puede alcanzar el 7% del PIB de un país [51]. Es necesario un cambio de paradigma. Una nueva manera de cuidar de la salud de las personas: sustentable, eficaz y preventiva más que curativa. Algunos estudios abogan por el cuidado personalizado de la salud (pHealth). En este modelo las prácticas médicas son adaptadas e individualizadas al paciente, desde la detección de los factores de riesgo hasta la personalización de los tratamientos basada en la respuesta del individuo [81]. El cuidado personalizado de la salud está asociado a menudo al uso de las tecnologías de la información y comunicación (TICs) que, con su desarrollo exponencial, ofrecen oportunidades interesantes para la mejora de la salud. El cambio de paradigma hacia el pHealth está lentamente ocurriendo, tanto en el ámbito de la investigación como en la industria, pero todavía no de manera significativa. Existen todavía muchas barreras relacionadas a la economía, a la política y la cultura. También existen barreras puramente tecnológicas, como la falta de sistemas de información interoperables [199]. A pesar de que los aspectos de interoperabilidad están evolucionando, todavía hace falta un diseño de referencia especialmente direccionado a la implementación y el despliegue en gran escala de sistemas basados en pHealth. La presente Tesis representa un intento de organizar la disciplina de la aplicación de las TICs al cuidado personalizado de la salud en un modelo de referencia, que permita la creación de plataformas de desarrollo de software para simplificar tareas comunes de desarrollo en este dominio. 
Preguntas de investigación RQ1 >Es posible definir un modelo, basado en técnicas de ingeniería del software, que represente el dominio del cuidado personalizado de la salud de una forma abstracta y representativa? RQ2 >Es posible construir una plataforma de desarrollo basada en este modelo? RQ3 >Esta plataforma ayuda a los desarrolladores a crear sistemas pHealth complejos e integrados? Métodos Para la descripción del modelo se adoptó el estándar ISO/IEC/IEEE 42010por ser lo suficientemente general y abstracto para el amplio enfoque de esta tesis [25]. El modelo está definido en varias partes: un modelo conceptual, expresado a través de mapas conceptuales que representan las partes interesadas (stakeholders), los artefactos y la información compartida; y escenarios y casos de uso para la descripción de sus funcionalidades. El modelo fue desarrollado de acuerdo a la información obtenida del análisis de la literatura, incluyendo 7 informes industriales y científicos, 9 estándares, 10 artículos en conferencias, 37 artículos en revistas, 25 páginas web y 5 libros. Basándose en el modelo se definieron los requisitos para la creación de la plataforma de desarrollo, enriquecidos por otros requisitos recolectados a través de una encuesta realizada a 11 ingenieros con experiencia en la rama. Para el desarrollo de la plataforma, se adoptó la metodología de integración continua [74] que permitió ejecutar tests automáticos en un servidor y también desplegar aplicaciones en una página web. En cuanto a la metodología utilizada para la validación se adoptó un marco para la formulación de teorías en la ingeniería del software [181]. Esto requiere el desarrollo de modelos y proposiciones que han de ser validados dentro de un ámbito de investigación definido, y que sirvan para guiar al investigador en la búsqueda de la evidencia necesaria para justificarla. La validación del modelo fue desarrollada mediante una encuesta online en tres rondas con un número creciente de invitados. 
The questionnaire was sent to 134 contacts and distributed through public channels such as mailing lists and social networks. Its objective was to assess the readability of the model, its level of coverage of the domain and its potential usefulness in the design of derived systems. The questionnaire included quantitative Likert-scale questions and free-text fields for collecting comments. The development platform was validated in two stages. In the first stage, the platform was used in a small-scale experiment consisting of a 12-hour training session in which 4 developers had to implement a set of use cases and then meet in a focus group to discuss its use. The second stage took place during the tests of a large-scale project called HeartCycle [160]. In this project, a team of designers and programmers developed three applications in the field of cardiovascular diseases. One of these applications was tested in a clinical trial with real patients. At the end of the project, the development team met in a focus group to identify the advantages and drawbacks of the platform and its usefulness.
Results
Regarding the model that describes the pHealth domain, its conceptual part includes a description of the main roles and concerns of the stakeholders, a model of the ICT artefacts that are commonly used, and a model representing the typical data that need to be formalized and exchanged among pHealth-based systems.
The functional model includes a set of 18 scenarios, divided into the assisted person's view, the caregiver's view, the developer's view, the technology providers' view and the authorities' view; and a set of 52 use cases grouped into 6 categories: assisted person's activities, system reactions, caregiver's activities, user engagement, developer's activities and deployment activities. As a result of the model validation questionnaire, a total of 65 people reviewed the model, providing their level of agreement with the assessed dimensions and a total of 248 comments on how to improve it. Participants' backgrounds ranged from software engineering (70%) to medical specialities (15%), with declared interest in eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), personalised medicine (5%), pHealth-based systems (15%), medical informatics (10%) and biomedical engineering (8%), with an average of 7.25±4.99 years of experience in these areas. The survey results show that the contacted experts considered the model easy to read (average of 1.89±0.79, 1 being the most favourable value and 5 the worst), sufficiently abstract (1.99±0.88) and formal (2.13±0.77), with sufficient coverage of the domain (2.26±0.95), useful for describing the domain (2.02±0.70) and for generating more specific systems (2.00±0.75). The experts also reported a partial interest in using the model in their work (2.48±0.91). Thanks to their comments, the model was improved and enriched with missing concepts, although an improvement across the assessed dimensions could not be demonstrated, given the different composition of participants in the three evaluation rounds. From the model, a development platform called the pHealth Patient Platform (pHPP) was generated.
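The agreement scores reported above, such as 1.89±0.79, are mean ± standard deviation summaries of 1-to-5 Likert responses. As a minimal illustration of how such a summary is computed (the response values below are invented, not the survey's actual data):

```python
from statistics import mean, stdev

# Hypothetical Likert responses (1 = most favourable, 5 = worst);
# the survey's raw data are not given in this summary.
responses = [1, 2, 2, 1, 3, 2, 2, 1, 2, 3]

# Mean and sample standard deviation, reported as "mean±sd".
summary = (round(mean(responses), 2), round(stdev(responses), 2))
print(f"{summary[0]}±{summary[1]}")
```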
The platform includes libraries, programming and deployment tools, a tutorial and a sample application. Four main architecture modules were defined: the Data Collection Engine, which abstracts data sources such as sensors or external services, mapping the data to databases or ontologies and allowing event-based interaction; the GUI Engine, which abstracts the user interface into a message-based interaction model; the Workflow Engine, which allows the application's user-interaction flows to be programmed as graphical workflows; and the Rule Engine, which gives developers a simple means of programming the application's logic in the form of "if-then" rules. After the pHPP platform had been used for 5 years in the HeartCycle project, 5 developers met in a focus group to analyse and evaluate it. From these evaluations it is concluded that the platform was designed to fit the needs of the engineers working in the field, allowing the separation of concerns among the different specialities and simplifying some development tasks, such as data management and asynchronous interaction. Nevertheless, some shortcomings were found, caused by the immaturity of some of the technologies employed and by the absence of some domain-specific tools, e.g. for data processing or for some health-related communication protocols. Within the HeartCycle project, the platform was used to develop the "Guided Exercise" application, an ICT system for the rehabilitation of patients who have suffered a myocardial infarction. The system was tested in a randomized clinical trial in which 55 patients were given the system to use for 21 weeks. From the technical results of the trial it can be concluded that, despite some minor bugs promptly corrected during the study, the platform is stable and reliable.
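The event-based abstraction of data sources attributed to the Data Collection Engine can be sketched as a small publish/subscribe layer: sensor drivers and service adapters publish samples, and application code subscribes to named sources instead of polling devices. This is an illustrative sketch under assumed names; the actual pHPP API is not disclosed in this summary.

```python
from typing import Callable, Dict, List

class DataCollectionEngine:
    """Illustrative event-based abstraction over data sources."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, source: str, handler: Callable[[dict], None]) -> None:
        # Consumers register interest in a named source (sensor, service, ...).
        self._subscribers.setdefault(source, []).append(handler)

    def publish(self, source: str, sample: dict) -> None:
        # Called by a sensor driver or service adapter when new data arrives;
        # consumers never poll the device directly.
        for handler in self._subscribers.get(source, []):
            handler(sample)

engine = DataCollectionEngine()
readings: list = []
engine.subscribe("heart_rate", readings.append)
engine.publish("heart_rate", {"bpm": 72, "t": 0})
```

The same dispatch point could instead persist each sample to a database or ontology mapping, which is how the module is described above.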
Conclusions
The research carried out in this thesis and the results obtained provide the answers to the three research questions that motivated this work:
RQ1 A model was developed to represent the domain of personalised health systems. The evaluation by experts in the field concludes that the model represents the domain accurately, with an appropriate balance between abstraction and detail.
RQ2 A development platform based on the model was successfully developed.
RQ3 The platform has been shown to help developers create complex pHealth software. The advantages of the platform were demonstrated within the scope of a large-scale project, although the generic approach adopted indicates that the platform could offer benefits in other contexts as well.
The results of these evaluations suggest that both the model and the platform are good candidates for becoming a reference for future pHealth developments.
ABSTRACT
Background
Europe is living in an unsustainable situation. The economic crisis has been reducing governments' economic resources since 2008 and threatening social and health systems, while the proportion of older people in the European population continues to increase, so that it is foreseen that in 2050 there will be only two workers per retiree [54]. To this situation should be added the rise, strongly related to age, of chronic diseases, the burden of which has been estimated at up to 7% of a country's gross domestic product [51]. There is a need for a paradigm shift: a new way of caring for people's health, shifting the focus from curing conditions that have arisen to a sustainable and effective approach with the emphasis on prevention.
Some advocate the adoption of personalised health care (pHealth), a model where medical practices are tailored to the patient's unique life, from the detection of risk factors to the customization of treatments based on each individual's response [81]. Personalised health is often associated with the use of Information and Communications Technology (ICT), which, with its exponential development, offers interesting opportunities for improving healthcare. The shift towards pHealth is slowly taking place, both in research and in industry, but the change is not yet significant. Many barriers still exist related to economy, politics and culture, while others are purely technological, like the lack of interoperable information systems [199]. Though interoperability aspects are evolving, there is still the need for a reference design, especially one tackling the implementation and large-scale deployment of pHealth systems. This thesis contributes to organizing the subject of ICT systems for personalised health into a reference model that allows for the creation of software development platforms to ease common development issues in the domain.
Research questions
RQ1 Is it possible to define a model, based on software engineering techniques, for representing the personalised health domain in an abstract and representative way?
RQ2 Is it possible to build a development platform based on this model?
RQ3 Does the development platform help developers create complex integrated pHealth systems?
Methods
As the method for describing the model, the ISO/IEC/IEEE 42010 framework [25] is adopted for its generality and high level of abstraction. The model is specified in different parts: a conceptual model, which makes use of concept maps for representing stakeholders, artefacts and shared information, and scenarios and use cases for the representation of the functionalities of pHealth systems.
The model was derived from a literature analysis, including 7 industrial and scientific reports, 9 standards, 10 conference papers, 37 journal papers, 25 websites and 5 books. Based on the reference model, requirements were drawn up for building the development platform, enriched with a set of requirements gathered in a survey run among 11 experienced engineers. For developing the platform, the continuous integration methodology [74] was adopted, which made it possible to perform automatic tests on a server and to deploy packaged releases on a web site. As validation methodology, a theory-building framework for software engineering was adopted from [181]. The framework, chosen as a guide to finding evidence for justifying the research questions, imposed the creation of theories based on models and propositions to be validated within a scope. The validation of the model was conducted as an online survey in three validation rounds, encompassing a growing number of participants. The survey was submitted to 134 experts in the field and distributed on public channels such as relevant mailing lists and social networks. Its objective was to assess the model's readability, its level of coverage of the domain and its potential usefulness in the design of actual, derived systems. The questionnaires included quantitative Likert-scale questions and free-text inputs for comments. The development platform was validated in two settings. As a small-scale experiment, the platform was used in a 12-hour training session where 4 developers had to perform an exercise consisting of developing a set of typical pHealth use cases. At the end of the session, a focus group was held to identify benefits and drawbacks of the platform. The second validation was held as a test-case study in a large-scale research project called HeartCycle, the aim of which was to develop a closed-loop disease management system for heart failure and coronary heart disease patients [160].
During this project, three applications were developed by a team of programmers and designers. One of these applications was tested in a clinical trial with actual patients. At the end of the project, the team was interviewed in a focus group to assess the role the platform had played within the project.
Results
As regards the model that describes the pHealth domain, its conceptual part includes a description of the main roles and concerns of pHealth stakeholders, a model of the ICT artefacts that are commonly adopted and a model representing the typical data that need to be formalized and exchanged among pHealth systems. The functional model includes a set of 18 scenarios, divided into the assisted person's view, the caregiver's view, the developer's view, the technology and services providers' view and the authority's view, and a set of 52 use cases grouped into 6 categories: assisted person's activities, system reactions, caregiver's activities, user engagement, developer's activities and deployer's activities. As regards the validation of the model, a total of 65 people participated in the online survey, providing their level of agreement on all the assessed dimensions and a total of 248 comments on how to improve and complete the model. Participants' backgrounds spanned from engineering and software development (70%) to medical specialities (15%), with declared interest in the fields of eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), Personalized Medicine (5%), Personal Health Systems (15%), Medical Informatics (10%) and Biomedical Engineering (8%), with an average of 7.25±4.99 years of experience in these fields.
From the analysis of the answers it is possible to observe that the contacted experts considered the model easily readable (average of 1.89±0.79, 1 being the most favourable score and 5 the worst), sufficiently abstract (1.99±0.88) and formal (2.13±0.77) for its purpose, with sufficient coverage of the domain (2.26±0.95), useful for describing the domain (2.02±0.70) and for generating more specific systems (2.00±0.75), and they reported a partial interest in using the model in their job (2.48±0.91). Thanks to their comments, the model was improved and enriched with concepts that were missing at the beginning; nonetheless, it was not possible to prove an improvement across the iterations, due to the diversity of the participants in the three rounds. From the model, a development platform for the pHealth domain, called the pHealth Patient Platform (pHPP), was generated. The platform includes a set of libraries, programming and deployment tools, a tutorial and a sample application. The four main modules of the architecture are: the Data Collection Engine, which allows abstracting sources of information such as sensors or external services, mapping data to databases and ontologies, and allowing event-based interaction and filtering; the GUI Engine, which abstracts the user interface into a message-like interaction model; the Workflow Engine, which allows programming the application's user-interaction flows with graphical workflows; and the Rule Engine, which gives developers a simple means of programming the application's logic in the form of "if-then" rules. After the 5-year experience of HeartCycle, partially programmed with pHPP, 5 developers met in a focus group to discuss the advantages and drawbacks of the platform.
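The "if-then" rule style described for the Rule Engine can be sketched as condition/action pairs evaluated against the application state. All names and structures here are assumptions for illustration; the actual pHPP interfaces are not disclosed in this summary.

```python
class RuleEngine:
    """Illustrative "if-then" rule engine sketch."""

    def __init__(self):
        self.rules = []

    def add_rule(self, condition, action):
        # A rule is a (condition, action) pair over the application state.
        self.rules.append((condition, action))

    def evaluate(self, state):
        # Fire every rule whose condition holds for the current state,
        # returning how many rules fired.
        fired = 0
        for condition, action in self.rules:
            if condition(state):
                action(state)
                fired += 1
        return fired

alerts = []
engine = RuleEngine()
# "If the measured heart rate exceeds 100 bpm, then raise an alert."
engine.add_rule(lambda s: s["bpm"] > 100,
                lambda s: alerts.append("tachycardia alert"))
fired = engine.evaluate({"bpm": 120})
```

Expressing application logic this way is what lets non-specialist developers contribute rules without touching the data-collection or GUI layers, the separation of concerns credited to the platform below.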
The view that emerged from the training course and the focus group was that the platform is well suited to the needs of the engineers working in the field: it allowed the separation of concerns among the different specialities and it simplified some common development tasks, like data management and asynchronous interaction. Nevertheless, some deficiencies were pointed out, in terms of a lack of maturity of some technological choices and the absence of some domain-specific tools, e.g. for data processing or for health-related communication protocols. Within HeartCycle, the platform was used to develop part of the Guided Exercise system, a composition of ICT tools for the physical rehabilitation of patients who have suffered a myocardial infarction. The system developed using the platform was tested in a randomized controlled clinical trial, in which 55 patients used the system for 21 weeks. The technical results of this trial showed that the system was stable and reliable. Some minor bugs were detected, but these were promptly corrected using the platform. This shows that the platform, as well as facilitating the development task, can be successfully used to produce reliable software.
Conclusions
The research work carried out in developing this thesis provides responses to the three research questions that motivated the work.
RQ1 A model was developed representing the domain of personalised health systems, and the assessment of experts in the field was that it represents the domain accurately, with an appropriate balance between abstraction and detail.
RQ2 A development platform based on the model was successfully developed.
RQ3 The platform has been shown to assist developers in creating complex pHealth software. This was demonstrated within the scope of one large-scale project, but the generic approach adopted provides indications that it would offer benefits more widely.
The results of these evaluations provide indications that both the model and the platform are good candidates for being a reference for future pHealth developments.

Resumo:

On the basis of optical characterization experiments and an eight-band k·p model, we have studied the effect of Sb incorporation on the electronic structure of InAs quantum dots (QDs). We have found that Sb incorporation in InAs QDs shifts the hole wave function to the center of the QD from the edges, where it is otherwise pinned by the effects of shear stress. The observed changes in the ground-state energy cannot be explained merely by a composition change upon Sb exposure, but can be accounted for when the change in lateral size is taken into consideration. The Sb distribution inside the QDs produces distinctive changes in the density of states, particularly in the separation between excitation shells. We find a 50% increase in the thermal escape activation energy compared with reference InAs quantum dots, as well as an increase in the fundamental-transition decay time with Sb incorporation. Furthermore, we find that Sb incorporation into the quantum dots is strongly nonlinear with coverage, saturating at low doses. This suggests the existence of a solubility limit for Sb incorporation into the quantum dots during growth.
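The thermal escape activation energy mentioned above is conventionally extracted from the temperature dependence of the photoluminescence intensity through an Arrhenius-type fit. The standard single-channel quenching relation (not stated in the abstract, given here only for context) is

```latex
I(T) = \frac{I_0}{1 + A\,\exp\!\left(-\dfrac{E_a}{k_B T}\right)}
```

where $I_0$ is the low-temperature intensity, $A$ a prefactor describing the strength of the quenching process, $E_a$ the activation energy and $k_B$ the Boltzmann constant.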

Resumo:

This paper describes a knowledge acquisition tool for the construction and maintenance of the knowledge model of an intelligent system for emergency management in the field of hydrology. The tool has been developed following an innovative approach aimed at end users who are not familiar with computer-oriented terminology. According to this approach, the tool is conceived as a document processor specialized in a particular domain (hydrology), in such a way that the whole knowledge model is viewed by the user as an electronic document. The paper first describes the characteristics of the knowledge model of the intelligent system and summarizes the problems that we found during the development and maintenance of such a model. It then describes the KATS tool, a software application that we designed to help with this task, intended for users who are not experts in computer programming. Finally, the paper presents a comparison between KATS and other approaches to knowledge acquisition.