937 results for Best Possible Medication History (BPMH)
Abstract:
Background. Cancer cachexia is a common syndrome complex in cancer, occurring in nearly 80% of patients with advanced cancer and responsible for at least 20% of all cancer deaths. Cachexia is due to increased resting energy expenditure, increased production of inflammatory mediators, and changes in lipid and protein metabolism. Non-steroidal anti-inflammatory drugs (NSAIDs), by virtue of their anti-inflammatory properties, are possibly protective against cancer-related cachexia. Since cachexia is also associated with increased hospitalizations, this outcome may also show improvement with NSAID exposure. Design. In this retrospective study, computerized records from 700 patients with non-small cell lung cancer (NSCLC) were reviewed, and 487 (69.57%) were included in the final analyses. Exclusion criteria were severe chronic obstructive pulmonary disease, significant peripheral edema, class III or IV congestive heart failure, liver failure, other reasons for weight loss, or use of research or anabolic medications. Information on medication history, body weight and hospitalizations was collected from one year pre-diagnosis until three years post-diagnosis. Patients were considered NSAID-exposed if they had a history of being treated with NSAIDs for at least 50% of any given year in the observation period. We used t-tests and chi-square tests for the statistical analyses. Results. Neither the proportion of patients with cachexia (p=0.27) nor the number of hospitalizations (p=0.74) differed between those with a history of NSAID use (n=92) and those without (n=395). Conclusions. In this study, NSAID exposure was not significantly associated with weight loss or hospital admissions in patients with NSCLC. Further studies may be needed to confirm these observations.
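The group comparison described above is a standard two-sample test of proportions; the sketch below illustrates such a chi-square test using the reported group sizes (92 NSAID-exposed, 395 unexposed) but with purely hypothetical cachexia counts, since the abstract does not report the underlying contingency table.

```python
# Minimal sketch of the chi-square comparison described above.
# Group sizes (92 NSAID-exposed, 395 unexposed) come from the abstract;
# the cachexia counts are hypothetical placeholders, NOT the study's data.
from scipy.stats import chi2_contingency

exposed_cachexia, exposed_no_cachexia = 60, 32        # hypothetical
unexposed_cachexia, unexposed_no_cachexia = 270, 125  # hypothetical

table = [[exposed_cachexia, exposed_no_cachexia],
         [unexposed_cachexia, unexposed_no_cachexia]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.2f}")
```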
Abstract:
The reason for a choice: the importance of the periodical O Portuguez and of its editor João Bernardo da Rocha Loureiro in triggering the Portuguese Liberal Revolution of 1820, which began the process that put an end to absolute monarchy and set off the dismantling of the Ancien Régime in Portugal. In this panorama, and with the enjoyment of liberty as its ultimate horizon, it can be said that, as a system of collective meanings, the imaginary of the editor Rocha Loureiro seems to act as a reaction to the dissolving power of the intellect, as a regulator that satisfies the passionate quest of a group or society pursuing its identity, of which liberty is a constituent, fundamental and founding element. Liberty is defined by the absence of limitation on the guarantees necessary for the harmonious development of the individual. Above all, liberty is, in its essence, an absence of limitation, while at the same time being a positive and spontaneous determination of the will to liberty, in which each person seeks to attain the rational end by which he is possessed. The individual desires liberty in order to achieve the best possible realisation of his being. In essence, the will to liberty of each individual merges with the collective will, which finds its expression in political power.
Abstract:
Shell chemistry of planktic foraminifera and the alkenone unsaturation index were studied in 69 surface sediment samples from the tropical eastern Indian Ocean off West and South Indonesia. Results were compared to modern hydrographic data in order to assess how modern environmental conditions are preserved in the sedimentary record, and to determine the best possible proxies to reconstruct seasonality, thermal gradient and upper water column characteristics in this part of the world ocean. Our results imply that alkenone-derived temperatures record annual mean temperatures in the study area. However, this finding might be an artifact due to the temperature limitation of this proxy above 28°C. A combined study of shell stable oxygen isotopes and Mg/Ca ratios of planktic foraminifera suggests that Globigerinoides ruber sensu stricto (s.s.), G. ruber sensu lato (s.l.), and G. sacculifer calcify within the mixed layer between 20 m and 50 m, whereas Globigerina bulloides records mixed-layer conditions at ~50 m depth during boreal summer. Mean calcification depths of Pulleniatina obliquiloculata, Neogloboquadrina dutertrei, and Globorotalia tumida lie at the top of the thermocline during boreal summer, at ~75 m, 75-100 m, and 100 m, respectively. Shell Mg/Ca ratios of all species show a significant correlation with temperature at their apparent calcification depths and validate the application of previously published temperature calibrations, except for G. tumida, which requires a regional Mg/Ca-temperature calibration (Mg/Ca = 0.41 exp(0.068*T)). We show that the difference in Mg/Ca-temperatures of the mixed-layer species and the thermocline species, particularly between G. ruber s.s. (or s.l.) and P. obliquiloculata, can be applied to track changes in upper water column stratification. Our results provide critical tools for reconstructing past changes in the hydrography of the study area and their relation to the monsoon, El Niño-Southern Oscillation, and the Indian Ocean Dipole Mode.
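The regional calibration quoted above for G. tumida, Mg/Ca = 0.41 exp(0.068*T), can be inverted to estimate calcification temperature from a measured shell Mg/Ca ratio. The following is a minimal sketch of that inversion; the example Mg/Ca value is illustrative only, and the constants for other species would come from their published calibrations.

```python
import math

def mgca_to_temperature(mg_ca, a=0.41, b=0.068):
    """Invert Mg/Ca = a * exp(b * T) to get calcification temperature T (deg C).

    Defaults are the G. tumida calibration quoted in the abstract;
    other species would use their own published constants.
    """
    return math.log(mg_ca / a) / b

# Illustrative value only: a shell Mg/Ca of 2.0 mmol/mol
print(round(mgca_to_temperature(2.0), 1))  # ~23.3 deg C
```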
Abstract:
In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's OrbView-2 mission, MODIS on NASA's Aqua mission and MERIS on ESA's ENVISAT mission. In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, act as a channel to the broader end-user community, and provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon cycle modelling community's requirements and to the modellers themselves, who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community. The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006. Error statistics and inter-sensor biases were quantified by comparison with in-situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 Tb of input (level 2) data have been ingested and 14 Tb of intermediate and output products created, with 4 Tb of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in-situ data collection or interesting oceanographic phenomena. The Full Product Set (FPS) covers global daily merged ocean colour products for the period 1997-2006 and is freely available for use by the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html. The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555, 620, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products.
These error estimates are a key component of GlobColour as they are invaluable to the users, particularly the modellers, who need them in order to assimilate the ocean colour data into ocean simulations. An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help in further refining the techniques used. Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); presentations are available on the GlobColour WWW site. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
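The abstract notes that per-sensor error statistics feed into the merging and propagate into per-pixel error estimates for the merged products. One common way to achieve this, shown below purely as an illustrative sketch and not as the actual GlobColour merging algorithm, is inverse-variance weighted averaging of co-located sensor estimates.

```python
import numpy as np

def merge_inverse_variance(values, errors):
    """Merge co-located sensor estimates weighted by 1/error^2.

    values, errors: per-sensor estimates and their 1-sigma errors
    (NaN where a sensor has no valid pixel). Returns merged value and error.
    """
    values = np.asarray(values, dtype=float)
    errors = np.asarray(errors, dtype=float)
    ok = ~np.isnan(values) & ~np.isnan(errors)
    w = 1.0 / errors[ok] ** 2
    merged = np.sum(w * values[ok]) / np.sum(w)
    merged_error = np.sqrt(1.0 / np.sum(w))
    return merged, merged_error

# Illustrative chlorophyll-a estimates (mg/m^3) from three sensors
print(merge_inverse_variance([0.30, 0.35, np.nan], [0.05, 0.08, 0.06]))
```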
Abstract:
We construct an empirically informed computational model of fiscal federalism, testing whether horizontal or vertical equalization can solve the fiscal externality problem in an environment in which heterogeneous agents can move and vote. The model extends the literature by considering the case of progressive local taxation. Although the consequences of progressive taxation under fiscal federalism are well understood, they have not been studied in a context with tax equalization, despite its widespread implementation. The model also extends the literature by comparing the standard median voter model with a realistic alternative voting mechanism. We find that fiscal federalism with progressive taxation naturally leads to segregation as well as inefficient and inequitable public goods provision, while the alternative voting mechanism generates more efficient, though less equitable, public goods provision. Equalization policy, under both types of voting, is largely undermined by micro-actors' choices. For this reason, the model also does not find the anticipated effects of vertical equalization discouraging public goods spending among wealthy jurisdictions and of horizontal equalization encouraging it among poor jurisdictions. Finally, we identify two optimal scenarios, superior to both complete centralization and complete devolution. These scenarios are not only Pareto optimal but also conform to a Rawlsian view of justice, offering the best possible outcome for the worst-off. Despite offering the best possible outcomes, both scenarios still entail significant economic segregation and inequitable public goods provision. Under the optimal scenarios, agents shift the bulk of revenue collection to the federal government, with few jurisdictions maintaining a small local tax.
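As a point of reference for the voting mechanisms compared above, the standard median voter mechanism adopts the tax rate preferred by the median resident of a jurisdiction. The sketch below is a minimal, hypothetical illustration of that mechanism only; the paper's actual agent-based model is far richer.

```python
import statistics

def median_voter_tax_rate(preferred_rates):
    """Return the local tax rate chosen under the median voter mechanism:
    the median of the residents' preferred rates."""
    return statistics.median(preferred_rates)

# Hypothetical preferred rates of five residents of one jurisdiction
print(median_voter_tax_rate([0.05, 0.10, 0.12, 0.20, 0.35]))  # 0.12
```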
Abstract:
On the orbiter of the Rosetta spacecraft, the Cometary Secondary Ion Mass Analyser (COSIMA) will provide new in situ insights into the chemical composition of cometary grains throughout 67P/Churyumov–Gerasimenko's (67P/CG) journey, nominally until the end of December 2015. The aim of this paper is to present the pre-calibration that has already been performed, as well as the different methods developed to facilitate the interpretation of the COSIMA mass spectra, and more especially of their organic content. The first step was to establish a mass spectra library, in positive and negative ion mode, of targeted molecules and to determine the specific features of each compound and chemical family analyzed. As the exact nature of the refractory cometary organic matter is currently unknown, this library is obviously not exhaustive. It has therefore also been the starting point for the search for indicators that highlight the presence of compounds containing a specific atom or structure. These indicators correspond to intensity ratios of specific peaks in the mass spectrum. They have allowed us to identify samples containing nitrogen atoms, aliphatic chains, or polyaromatic hydrocarbons. From these indicators, a preliminary calibration line, from which the N/C ratio can be derived, has also been established. The search for specific mass differences can also help to identify peaks related to quasi-molecular ions in an unknown mass spectrum. The Bayesian Positive Source Separation (BPSS) technique will also be very helpful for data analysis. This work is the starting point for the analysis of the cometary refractory organic matter. Nevertheless, calibration work will continue in order to reach the best possible interpretation of the COSIMA observations.
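The indicators described above are intensity ratios of specific peaks in a mass spectrum. The sketch below shows how such a ratio can be computed from a peak list; the m/z values used are placeholders, not the actual COSIMA indicator peaks.

```python
def peak_intensity_ratio(spectrum, mz_numerator, mz_denominator, tol=0.05):
    """Compute an indicator as the ratio of summed intensities of two peaks.

    spectrum: dict mapping m/z -> intensity.
    mz_numerator, mz_denominator: target m/z values (placeholders here).
    tol: m/z matching tolerance.
    """
    def intensity_at(mz_target):
        return sum(i for mz, i in spectrum.items() if abs(mz - mz_target) <= tol)

    denom = intensity_at(mz_denominator)
    return intensity_at(mz_numerator) / denom if denom else float("nan")

# Toy spectrum: m/z -> counts (illustrative only)
toy = {27.02: 1200.0, 43.05: 800.0, 91.05: 300.0}
print(peak_intensity_ratio(toy, 27.02, 43.05))  # 1.5
```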
Abstract:
The main objective of ventilation systems in case of fire is to reduce the possible consequences by achieving the best possible conditions for the evacuation of the users and the intervention of the emergency services. In recent years, the required quick response of the ventilation system, from normal to emergency mode, has been improved by the use of automatic and semi-automatic control systems, which reduce response times by supporting the operators' decision-making and by using pre-defined strategies. A further step consists of the use of closed-loop algorithms, which take into account not only the initial conditions but also their evolution (air velocity, traffic situation, etc.), optimizing the quality of the smoke control process.
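A closed-loop strategy of the kind mentioned above continuously corrects the ventilation command from measured conditions rather than relying only on pre-defined strategies. The sketch below assumes a simple proportional controller on longitudinal air velocity; it is an illustration of the closed-loop idea, not the control law of any particular tunnel system.

```python
def proportional_fan_command(target_velocity, measured_velocity,
                             current_command, gain=0.5,
                             min_cmd=0.0, max_cmd=1.0):
    """One step of a proportional closed-loop controller.

    Adjusts the normalised fan command (0..1) so that the measured
    longitudinal air velocity approaches the target used for smoke control.
    """
    error = target_velocity - measured_velocity
    new_command = current_command + gain * error
    return max(min_cmd, min(max_cmd, new_command))

# Example: target 3.0 m/s, currently measuring 2.2 m/s at 40% fan command
print(proportional_fan_command(3.0, 2.2, 0.4))  # 0.8
```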
Abstract:
In 1999 Spain approved the Building Act (Ley de Ordenación de la Edificación, LOE) to regulate the construction sector, which until then suffered from some legal shortcomings. The LOE has now been in force for 12 years and has changed the Spanish construction world, partly under the influence of internationalization. The LOE regulates the different agents involved in building construction: the project designer, the Construction Director, the developer, the builder, the Director of Execution of the Works (an agent that exists only in Spain, roughly comparable to a construction engineer abroad), the control entities and the users. It lacks, however, the figure of the Project Manager, who acts by delegation of the developer and helps to organize, direct and manage the process. This figure currently operates only through the market and contracts, without legal regulation in Spain, so its role should be defined and regulated in the LOE (the Spanish construction law). The Spanish translation of the term "Project Manager" is owed to Professor Rafael de Heredia, who in his book Integrated Project Management described this agent as acting on behalf of the organization and the developer and assuming control of the project, i.e. Integrated Project Management. Spain already has AEDIP (the Spanish association for integrated construction project management), which brings together the main Project Management companies in Spain, and MeDIP (Master in Integrated Construction Project), the largest and most advanced programme of its kind in "Construction Project Management", taught at the Polytechnic University of Madrid and also in Argentina. Integrated Project Management ("Project Management") applied to the construction process is a methodological technique that helps to organize, control and manage the developer's resources throughout the building process. When resources are limited, which is the case in most situations, managing them efficiently becomes very important; in that situation, comprehensive control and monitoring of those resources becomes not merely important but crucial. Intervening directly, with a specialised team, to ensure that scarce resources are used in the best possible way requires a specific methodology (a DIP manual, an EDR responsibility matrix, an EDP project breakdown structure, risk management and control, design management, etc.); this is the methodology used by Project Managers to ensure that the initial objectives of the developers or investors are met and that all actors in the process, from the designer to the construction company, keep the aim of the project in mind and do not let their own interests prevail over the interests of the project. Among the agents involved in the building process, the Project Manager, or DIPE (Integrated Director of the Building Process, a name proposed for possible incorporation into the LOE), is currently not listed as such in the LOE: it is the one agent in the building process that is not regulated from the legal point of view and has no statutory obligations, unlike the legally required project, builder, construction management, and so on.
At present the DIPE is hired only by clients who choose to do so because they already know its services from previous experience; since there is no legal obligation, as mentioned above, the market is passing its own judgement on this new figure: if it were not really needed, it would simply not be hired and would eventually disappear from the building process. The aim of this article is therefore to regulate this role and to introduce the figure of the DIPE into the Spanish Building Act (LOE).
Abstract:
Performance studies of actual parallel systems usually tend to concentrate on the effectiveness of a given implementation. This is often done in the absolute, without quantitative reference to the potential parallelism contained in the programs from the point of view of the execution paradigm. We feel that studying the parallelism inherent to the programs is interesting, as it gives information about the best possible behavior of any implementation and thus allows contrasting the results obtained. We propose a method for obtaining ideal speedups for programs through a combination of sequential or parallel execution and simulation, and the algorithms that allow implementing the method. Our approach is novel and, we argue, more accurate than previously proposed methods, in that a crucial part of the data - the execution times of tasks - is obtained from actual executions, while speedup is computed by simulation. This allows obtaining speedup (and other) data under controlled and ideal assumptions regarding issues such as the number of processors, the scheduling algorithm, overheads, etc. The results obtained can be used, for example, to evaluate the ideal parallelism that a program contains for a given model of execution and to compare such "perfect" parallelism to that obtained by a given implementation of that model. We also present a tool, IDRA, which implements the proposed method, and results obtained with IDRA for benchmark programs, which are then compared with those obtained in actual executions on real parallel systems.
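The ideal speedup discussed above can be obtained by simulating the measured tasks under idealised assumptions. The sketch below assumes a task dependency graph, an unbounded number of processors and zero overheads, and computes ideal speedup as total sequential time divided by critical-path length; it is a simplified illustration of the approach, not the IDRA implementation.

```python
from functools import lru_cache

def ideal_speedup(task_times, deps):
    """task_times: dict task -> measured execution time.
    deps: dict task -> list of tasks it depends on.
    With unlimited processors and no overheads, ideal speedup is
    total sequential time / critical-path length.
    """
    @lru_cache(maxsize=None)
    def finish(task):
        return task_times[task] + max((finish(d) for d in deps.get(task, [])),
                                      default=0.0)

    sequential = sum(task_times.values())
    critical_path = max(finish(t) for t in task_times)
    return sequential / critical_path

# Toy example: b and c depend on a, d depends on both
times = {"a": 2.0, "b": 3.0, "c": 1.0, "d": 2.0}
print(ideal_speedup(times, {"b": ["a"], "c": ["a"], "d": ["b", "c"]}))  # ~1.14
```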
Abstract:
The main objective of ventilation systems in case of fire is to reduce the possible consequences by achieving the best possible conditions for the evacuation of the users and the intervention of the emergency services. The required immediate transition of the ventilation equipment from normal to emergency operation is being strengthened by the use of automatic and semi-automatic control systems, which reduce response times by assisting the operators and by using pre-defined strategies. A further step consists of the use of closed-loop algorithms, which take into account not only the initial conditions but also their evolution (air velocity, traffic situation, etc.), optimizing the smoke control capacity.
Abstract:
Europe needs to restructure its energy system. The European Commission has now set the goal of reducing reliance on fossil fuels in favour of a higher share of renewable energy. In order to achieve this goal, there is great interest in Norway becoming "The Green Battery of Europe". In pursuit of this goal, a GIS tool was created to investigate the pumped storage potential in Norway. The tool searches for possible connections between existing reservoirs and dams according to criteria selected by the user. The aim of this thesis was to test the tool and see whether the suggested results were plausible, to develop a cost calculation method for the PSH lines, and to make suggestions for further development of the tool. During the process, the tool presented many non-feasible pumped storage hydropower (PSH) connections. The area of Telemark was chosen for the more detailed study. The results were discussed and some improvements were suggested for further development of the tool. A sensitivity test was also done to see which of the parameters set by the user are the most relevant for the PSH connection suggestions. From a range of the most promising PSH plants suggested by the tool, the connection between Songavatn and Totak, where a power plant already exists between the two reservoirs, was chosen for a case study. A new pumped storage plant was designed with a power production of 1200 MW. There are still many topics open to discussion, such as how to deal with environmental restrictions, or how to handle the inflows and outflows of the reservoirs from the existing power plants. Consequently, the GIS tool can be very useful for establishing the best possible connections between existing reservoirs and dams, but it still requires deeper study and the creation of new parameters for the user.
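A first-order screening quantity for a candidate PSH connection of the kind the GIS tool suggests is the potential energy that a reservoir pair can store, E = ρ·g·h·V·η. The sketch below shows this calculation with illustrative numbers; they are not the Songavatn-Totak figures.

```python
def psh_energy_capacity_gwh(head_m, usable_volume_m3, efficiency=0.8,
                            rho=1000.0, g=9.81):
    """Gravitational energy storable between an upper and a lower reservoir.

    E = rho * g * head * volume * efficiency, converted from joules to GWh.
    """
    energy_joules = rho * g * head_m * usable_volume_m3 * efficiency
    return energy_joules / 3.6e12  # 1 GWh = 3.6e12 J

# Illustrative only: 300 m head, 50 million m^3 usable volume
print(round(psh_energy_capacity_gwh(300.0, 50e6), 1))  # ~32.7 GWh
```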
Abstract:
This final-year project presents a theoretical study of the broadcast of a Real Madrid - F.C. Barcelona match. It aims to give the reader an idea of everything that a football match of these dimensions involves from the audiovisual point of view, and to enable them to understand the steps needed to carry it out. When we watch the broadcast of a sporting event, and in particular of a football match of this magnitude, it is almost impossible to imagine the deployment behind it. For that reason, this project tries to explain in a simple and brief way how to produce an event of this size, which could serve as an example for other large-scale sporting events. Throughout the project, a complete study is made of the main steps to be taken to bring the broadcast to the viewers. The report is organised into 7 chapters. The first chapter gives a brief introduction to match broadcasting, so that the reader can get an idea of what will be done later and of what is explained in the remaining chapters. The second chapter deals with the first step in broadcasting a football match, which can also be applied to other sporting events. This section focuses on the survey of the venue and explains the first steps to be taken during the first days of rigging. These are essential for the match to be broadcast later in the best possible way; mistakes at the beginning can lead to higher costs and long delays later on. The third chapter focuses on the rigging and production of the event. The first part explains how to position the microphones and cameras within the football ground and describes each of them, introducing basic concepts and parameters of the main microphones and cameras used at the event. The second part of the chapter explains the different signals used, how they are transmitted, and the internal and external communications. The fourth chapter covers the equipment needed for the event: microphones, cameras, EVS (recording units), CCUs (Camera Control Units), the vision mixer and the types of cabling, indicating the most relevant models and brands currently in use, together with figures of the equipment. The fifth chapter describes the main roles of the crew. It begins with an extensive explanation of the direction of the event and goes on to explain the functions of the operators of the equipment presented in the previous chapter. The sixth chapter gives an approximate budget for the production of the event, so that its viability can be studied. Finally, the seventh chapter presents a series of conclusions by way of summary, which should make a number of basic concepts about the project completely clear. ABSTRACT. This thesis presents a theoretical study of the broadcast of a Real Madrid - FC Barcelona match. It tries to give the reader an idea of everything that a football match of these dimensions involves from the audiovisual point of view, and to enable them to understand the steps to take.
When we watch the broadcast of a sporting event, and specifically a football match of this magnitude, it is almost impossible to imagine the deployment behind it. Therefore, we have tried to explain in a simple and concise way how to hold an event of this size, and it could serve as an example for other large-scale sporting events. Throughout this project, a comprehensive study is made of the main steps to be taken to make the broadcast reach the spectators. The report of this project is organised into seven chapters. In the first chapter, a brief introduction to match broadcasting is given, so that the reader can get an idea of what is explained in the following chapters. The second chapter covers the first step in broadcasting a football match, which can also be applied to other sporting events. This section focuses on the survey of the venue and explains the first days of installation. This is important so that the match can later be produced in the best possible way; mistakes at the beginning can lead to higher costs and long delays later on. The third chapter focuses on the assembly and production of the event. The first part explains how to place the microphones and cameras within the football field and describes each of them. It also introduces basic concepts and parameters of the main microphones and cameras that will be used at the event. In the second part, the chapter focuses on explaining the different signals used, their transmission, and the internal and external communications. The fourth chapter covers the equipment needed for the event: microphones, cameras, EVS, CCUs, the vision mixer and the cabling, indicating the most relevant models and brands in use today, together with figures of the equipment used. The fifth chapter describes the main functions performed by the crew. It begins with a thorough explanation of the direction of the event and goes on to explain the various functions of the operators of the equipment seen in the previous chapter. The sixth chapter provides an estimate of the cost of the creation and production of the event, in order to study its feasibility. Finally, the seventh chapter presents a number of conclusions by way of summary, which should make a number of basic concepts about the project thoroughly clear.
Abstract:
Background: This research is framed primarily within the replication, and secondarily within the synthesis, of experiments in Software Engineering (SE). In order to replicate, all the details of the original experiment must be available. However, experiment descriptions are usually incomplete owing to tacit knowledge and to other problems such as the lack of a standard reporting format, the absence of tools supporting the generation of experimental reports, and so on. As a result, the original experiment cannot be faithfully reproduced. These issues considerably limit experimenters' ability to carry out replications and, consequently, syntheses of experiments. Aim: The aim of this research is to formalize the experimental process in SE so as to facilitate the communication of information among experimenters. Context: This PhD research was carried out within the Empirical Software Engineering Research Group (GrISE) of the Escuela Técnica Superior de Ingenieros Informáticos (ETSIINF) of the Universidad Politécnica de Madrid (UPM), as part of project TIN2011-23216, "Technologies for Software Engineering Experiment Replication and Synthesis", funded by the Spanish Government. The GrISE group fully meets the necessary requirements (an established family of experiments, with at least three experimental lines and extensive replication experience (16 replications up to 2011 in the software testing techniques line)) and offers the conditions for the research to be carried out in the best possible way, for instance full access to its information. Research Method: To achieve this aim, Action Research (AR) was chosen as the research method best suited to the characteristics of the investigation, obtaining results through successive approximations that address specific communication problems among experimenters. Results: The conceptual model of the experimental cycle was formalized from the perspective of the three main roles that experimenters play in the experimental process: Research Manager, Experiment Manager and Senior Experimenter. The model of the experimental cycle was also formalized by means of a workflow of the cycle and a process diagram. In parallel with the formalization of the SE experimental process, ISRE (Infrastructure for Sharing and Replicating Experiments), a proof of concept of an SE experimentation support environment, was developed. Finally, guidelines for developing SE experimentation support environments were proposed, based on a study of the main features shared by the models of experimentation support tools in different experimental disciplines. Conclusions: The main contribution of this research is the formalization of the experimental process in SE. The models representing the formalization of the experimental cycle, as well as the ISRE tool built to evaluate the models, were found satisfactory by the GrISE experimenters.
To consolidate the validity of the formalization, we consider that this study should be replicated in other research groups representative of the experimental SE community. Future Research Lines: The achievement of the objectives, together with the findings obtained, has opened up new research lines, namely: (1) to consider building a mechanism to help experimenters make their own tacit knowledge explicit collaboratively, based on debate and consensus; (2) to continue the empirical research in the same research group until the experimental cycle is fully covered (e.g. new experiments, synthesis of results, etc.); (3) to replicate the research process in other ESE research groups; and (4) to renew the technology of the proof of concept so that it meets the constraints and needs of a real research environment. ABSTRACT Background: This research addresses first and foremost the replication and also the synthesis of software engineering (SE) experiments. Replication is impossible without access to all the details of the original experiment. But the description of experiments is usually incomplete because knowledge is tacit, there is no standard reporting format or there are hardly any tools to support the generation of experimental reports, etc. This means that the original experiment cannot be reproduced exactly. These issues place considerable constraints on experimenters' options for carrying out replications and ultimately synthesizing experiments. Aim: The aim of the research is to formalize the SE experimental process in order to facilitate information communication among experimenters. Context: This PhD research was developed within the empirical software engineering research group (GrISE) at the Universidad Politécnica de Madrid (UPM)'s School of Computer Engineering (ETSIINF) as part of project TIN2011-23216 entitled "Technologies for Software Engineering Experiment Replication and Synthesis", which was funded by the Spanish Government. The GrISE research group fulfils all the requirements (established family of experiments with at least three experimental lines and lengthy replication experience (16 replications prior to 2011 in the software testing techniques line)) and provides favourable conditions for the research to be conducted in the best possible way, like, for example, full access to information. Research Method: We opted for action research (AR) as the research method best suited to the characteristics of the investigation. Results were generated through successive rounds of AR addressing specific communication problems among experimenters. Results: The conceptual model of the experimental cycle was formalized from the viewpoint of three key roles representing experimenters in the experimental process. They were: research manager, experiment manager and senior experimenter. The model of the experimental cycle was formalized by means of a workflow and a process diagram. In tandem with the formalization of the SE experimental process, infrastructure for sharing and replicating experiments (ISRE) was developed. ISRE is a proof of concept of a SE experimentation support environment.
Finally, guidelines for developing SE experimentation support environments were designed based on the study of the key features that the models of experimentation support tools for different experimental disciplines had in common. Conclusions: The key contribution of this research is the formalization of the SE experimental process. GrISE experimenters were satisfied with both the models representing the formalization of the experimental cycle and the ISRE tool built in order to evaluate the models. In order to further validate the formalization, this study should be replicated at other research groups representative of the experimental SE community. Future Research Lines: The achievement of the aims and the resulting findings have led to new research lines, which are as follows: (1) assess the feasibility of building a mechanism to help experimenters collaboratively specify tacit knowledge based on debate and consensus, (2) continue empirical research at the same research group in order to cover the remainder of the experimental cycle (for example, new experiments, results synthesis, etc.), (3) replicate the research process at other ESE research groups, and (4) update the tools of the proof of concept in order to meet the constraints and needs of a real research environment.
Abstract:
This project consists of creating a set of three small video games, bundled into a single application for Android mobile platforms, that allow patients with phonation problems to train the aesthetics of their voice anywhere. Depending on the aspects of the voice to be worked on (voiced and unvoiced sounds, pitch and intensity), one exercise or another is assigned. First, the concept of voice rehabilitation is introduced, together with the cases in which it is needed. Next, a survey identifies the game development platforms compatible with Android systems, as well as the options for audio capture and the signal processing libraries. The tools offering the best capabilities are then selected for the work: the Andengine game engine for the graphics, Android's own Java environment for capturing audio samples, and the JTransforms library, which computes Fourier transforms and makes it possible to process the audio for pitch detection. When developing and assembling the different blocks, priority is given to the real-time operation of the application. Lines of improvement and conclusions are discussed in the last chapter of the work, together with a user manual for better understanding. ABSTRACT. The main aim of this project is to create an application for mobile devices which includes three small speech therapy videogames for the Android OS. These videogames allow patients to train certain voice parameters (such as voiced and unvoiced sounds, pitch and intensity) wherever they want and need to. First, an overview of the concept of voice rehabilitation and its uses for patients with speech disorders is given. Secondly, a study was made to identify the most suitable video game engine for the Android OS, the best possible way to capture audio from the device, and the audio processing library to combine with them. The chosen tools are then presented: Andengine as the game engine, Android's Java framework for audio capture, and the fast Fourier transform library JTransforms for pitch detection. Real-time processing is vital for the proper functioning of the application. Lines of improvement and other conclusions are discussed in the last part of this dissertation.
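The pitch-detection step described above (an FFT of the captured audio, performed in the application with JTransforms) can be sketched as picking the dominant spectral peak within the voice range. The Python/NumPy version below only illustrates the idea on a synthetic tone; the application itself uses Android's Java audio capture and JTransforms.

```python
import numpy as np

def estimate_pitch_hz(samples, sample_rate, fmin=50.0, fmax=500.0):
    """Naive pitch estimate: frequency of the strongest FFT bin
    within the expected voice range (fmin..fmax)."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(spectrum[band])]

# Synthetic 220 Hz tone as a stand-in for captured microphone audio
sr = 16000
t = np.arange(0, 0.1, 1.0 / sr)
print(round(estimate_pitch_hz(np.sin(2 * np.pi * 220.0 * t), sr)))  # ~220
```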
Abstract:
The current scheme for standardization and the design of new video coding standards is finding it increasingly difficult to keep up with the evolution and dynamism of the video coding community. The problem lies mainly in being able to exploit all the features and similarities shared by the different codecs and coding standards, which has forced the redesign of parts common to several coding standards. This problem gave rise to a new standardization initiative within the ISO/IEC MPEG committee, called Reconfigurable Video Coding (RVC). Its main idea was to develop a video coding standard that would progressively update and extend a library of components, providing flexibility and the ability to have reconfigurable code through the use of a new actor/dataflow-oriented language called CAL. This language is used to specify the standard library and to instantiate the decoder model. Later, a new video coding standard called High Efficiency Video Coding (HEVC) was developed, currently in a continuous process of updating and development, to improve the efficiency and compression of video coding. Naturally, a version of HEVC has also been produced using the RVC methodology. In this final-year project (PFC), different standard implementations built with RVC are used, for example the MPEG-4 Part 2 SP and MPEG-4 Part 10 CBP and PHP decoders, as well as the new HEVC coding standard, highlighting the features and usefulness of each of them. In RVC, algorithms are described by a class of actors that exchange data streams (tokens) to perform different actions. The objective of this project is to develop a program that, starting from the aforementioned decoders, a set of video sequences in different compression formats and a standard distribution of the actors (for each of the decoders), is able to generate different distributions of the decoder actors over one or more processors of the system on which it runs, in order to achieve the greatest video coding efficiency. The purpose of the program developed in this project is to make it easier to produce the distributions of the actors over the cores of the system and to obtain the best possible configurations automatically and efficiently. ABSTRACT. The current scheme in the field of standardization and the design of new video coding standards is finding it difficult to keep up with the evolving and dynamic video coding community. The problem centred mainly on exploiting all the features and similarities between different codecs and coding standards, which has forced the redesign of some parts common to several coding standards. This problem led to the emergence of a new standardization initiative within the ISO/IEC MPEG committee, called Reconfigurable Video Coding (RVC). Its main idea was to develop a video coding standard that progressively updates and extends a library of components, providing flexibility and the ability to have reconfigurable code using a new actor/dataflow-oriented language called CAL. This language is used to specify the standard library and to instantiate the decoder model.
Later, a new video coding standard called High Efficiency Video Coding (HEVC) was developed, which is currently in a continuous process of updating and development and which improves compression efficiency in video coding. Naturally, a version of HEVC has also been developed using the RVC methodology. In this PFC, different implementations built with RVC are used, for example the MPEG-4 Part 2 SP and MPEG-4 Part 10 CBP and PHP decoders and the new HEVC coding standard, highlighting the features and usefulness of each. In RVC, the algorithms are described by a class of actors that exchange streams of data (tokens) to perform different actions. The objective of this project is to develop a program that, based on the aforementioned decoders, a series of video streams in different compression formats and a standard distribution of actors (for each of the decoders), is capable of generating different distributions of the decoder actors over one or more processors of the system on which it runs, in order to achieve greater efficiency in video coding. The purpose of the program developed in this project is to facilitate the creation of the distributions of the actors over the cores of the system and to obtain the best possible configurations automatically and efficiently.
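The kind of actor-to-core mapping explored by the program described above can be illustrated with a simple greedy load-balancing heuristic: assign each actor, heaviest first, to the currently least-loaded core. This is only a sketch of one possible strategy with hypothetical per-actor loads, not the project's actual mapping algorithm.

```python
import heapq

def map_actors_to_cores(actor_load, n_cores):
    """Greedy longest-processing-time mapping of RVC actors to cores.

    actor_load: dict actor name -> estimated computational load
    (hypothetical figures; real loads would come from profiling).
    Returns dict core index -> list of assigned actors.
    """
    heap = [(0.0, core) for core in range(n_cores)]  # (accumulated load, core)
    heapq.heapify(heap)
    mapping = {core: [] for core in range(n_cores)}
    for actor, load in sorted(actor_load.items(), key=lambda kv: -kv[1]):
        core_load, core = heapq.heappop(heap)
        mapping[core].append(actor)
        heapq.heappush(heap, (core_load + load, core))
    return mapping

# Hypothetical per-actor loads for a decoder network
print(map_actors_to_cores({"parser": 1.0, "idct": 3.0, "mc": 4.0,
                           "dbf": 2.0}, 2))
```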