963 results for Self-adaptive software
Abstract:
This paper reports on a new façade system that uses passive solutions in pursuit of energy efficiency. The system's distinguishing features, and its main advantages, are the versatility and flexibility of its modules. The analysis focused on the thermal performance of Trombe walls and glazings and on the daylighting performance of the glazings. Computational simulations of the thermal performance of different module arrangements were performed with the DesignBuilder software. The daylighting performance of the glazings was studied with the Ecotect and Desktop Radiance programs and compared with the transmittance curves of the glazings. For both studies, the occupancy profile and internal gains were fixed according to Portuguese conditions. The main characteristics considered in this research were the use of two double glazings, four different climates in Portugal, and one or two Trombe walls in the façade. The results show a significant reduction in energy consumption when Trombe walls and double self-cleaning glazing are used in the façade; the latter also presented better daylighting performance.
Abstract:
Real-time data acquisition is fundamental to providing appropriate services and supporting health professionals' decision-making. This paper presents a pervasive, adaptive architecture for acquiring data from medical devices (e.g. vital-signs monitors, ventilators and sensors). The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network. The ventilation acquisition process is presented as an example. The clients are installed on a machine near the patient's bed and connected to the ventilators, and the monitored data are sent to a multithreaded server that records them in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant mechanism that ensures data are stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and it is able to adapt to any service using streaming data.
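A minimal sketch of the fault-tolerant acquisition idea described above: a multithreaded server accepts readings from bedside clients and writes them to a database, falling back to a local on-disk buffer when the main store fails. All names (port, paths, message format, table layout) are hypothetical; real HL7 framing and the actual INTCare schema are not shown.

```python
# Hypothetical sketch of a multithreaded acquisition server with a local
# fallback buffer. Port, schema and message format are illustrative only;
# a real deployment would speak HL7 and use the INTCare database.
import socketserver
import sqlite3
import threading

DB_PATH = "intcare_demo.db"         # stand-in for the central database
BUFFER_PATH = "fallback_buffer.db"  # local store used when the DB fails
_lock = threading.Lock()

def _store(path, patient_id, variable, value):
    # One short-lived connection per write keeps this thread-safe.
    with _lock:
        con = sqlite3.connect(path)
        con.execute("CREATE TABLE IF NOT EXISTS readings"
                    "(patient_id TEXT, variable TEXT, value REAL)")
        con.execute("INSERT INTO readings VALUES (?, ?, ?)",
                    (patient_id, variable, value))
        con.commit()
        con.close()

class ReadingHandler(socketserver.StreamRequestHandler):
    """Each bedside client connection is served in its own thread."""
    def handle(self):
        for line in self.rfile:  # one 'patient|variable|value' per line
            try:
                patient_id, variable, value = line.decode().strip().split("|")
                try:
                    _store(DB_PATH, patient_id, variable, float(value))
                except sqlite3.Error:
                    # Fault tolerance: if the main store fails, keep the
                    # reading locally so no data are lost.
                    _store(BUFFER_PATH, patient_id, variable, float(value))
            except ValueError:
                continue  # skip malformed messages

if __name__ == "__main__":
    with socketserver.ThreadingTCPServer(("0.0.0.0", 9100), ReadingHandler) as srv:
        srv.serve_forever()
```

A client could then send e.g. `PAT01|spo2|97` followed by a newline over TCP to port 9100; a separate agent (not shown) would periodically drain the fallback buffer back into the main database once connectivity returns.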
Abstract:
OBJECTIVE: To evaluate the level of satisfaction with body weight and the self-perception of the weight/height ratio, and to verify the influence of the frequency of present and past physical activity on these variables. METHODS: Using questionnaires or interviews, we obtained height, reported and desired weight, self-perception of the weight/height ratio, and the frequency of current physical activity in 844 adults (489 women). For 193 of these individuals, we also evaluated the frequency of physical activity during high school and measured their height and weight. RESULTS: Fewer than 2/3 of the individuals had a body mass index between 20 and 24.9 kg/m². There was a tendency to overestimate height by less than 1 cm and to underestimate weight by less than 1 kg. Desired weight was lower than reported weight (p<0.001), and only 20% were satisfied with their current weight. Only 42% of men and 25% of women exercised regularly. No association was found between the frequency of physical activity and height, weight, body mass index, or the level of satisfaction with current weight. CONCLUSION: Reported height and weight seem to be valid for epidemiological studies; great dissatisfaction with body weight and a distorted self-perception of the weight/height ratio exist, especially in women, regardless of the frequency of physical activity.
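The weight/height ratio used above is the body mass index, BMI = weight / height² (kg/m²). A small worked example with illustrative values, not data from the study:

```python
# Body mass index: weight in kilograms divided by height in metres squared.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

# Illustrative values only: a reported weight of 70 kg and height of 1.75 m
# give a BMI inside the 20-24.9 kg/m^2 band mentioned in the abstract.
print(round(bmi(70.0, 1.75), 1))  # 22.9
```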
Abstract:
Identification and characterization of the problem. One of the most important problems associated with building software is its correctness. In the search for guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Due to their nature, applying formal methods requires considerable experience and knowledge, above all of mathematics and logic, so their application is costly in practice. As a consequence, their main application has been limited to critical systems, i.e. systems whose malfunction can cause large-scale damage, even though the benefits these techniques provide are relevant to every kind of software. Carrying the benefits of formal methods over to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is an element of great importance. Examples of this are several powerful analysis tools based on formal methods that target source code directly. In the vast majority of these tools, the gap between the notions developers are used to and those required to apply these formal analysis tools remains too wide. Many tools use assertion languages that lie outside developers' usual knowledge and habits. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the elements to be analysed grow (scalability). This limitation is widely known and is considered critical for the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, we seek, on the one hand, to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools usable by developers familiar with the application context but not necessarily knowledgeable about the underlying methods or techniques. Materials and methods. The materials to be employed will be bibliography relevant to the area and computing equipment. Methods: those of discrete mathematics, logic and software engineering. Expected results. One of the expected results of the project is the identification of specific domains for the application of formal analysis methods.
The project is also expected to produce analysis tools whose usability is adequate for developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by building on well-defined notations founded on solid mathematical grounds. This makes formal methods well suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Consequently, their acceptance by software engineers is rather limited, and formal methods applications have been confined to critical systems, even though the advantages that formal methods provide apply to any kind of software system. It is accepted that appropriate tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared with the general case.
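As an illustration of the kind of SMT-based analysis the project refers to, the sketch below uses the Z3 SMT solver (via its `z3-solver` Python bindings) to verify a simple assertion about an absolute-value expression: the property is proven when its negation is unsatisfiable. The example is ours, not a tool produced by the project.

```python
# Minimal SMT-based verification sketch using Z3 (pip install z3-solver).
# We check that abs(x), encoded as an if-then-else term, is never negative:
# the property holds iff its negation is unsatisfiable.
from z3 import Int, If, Not, Solver, unsat

x = Int("x")
abs_x = If(x >= 0, x, -x)    # symbolic encoding of abs(x)

solver = Solver()
solver.add(Not(abs_x >= 0))  # ask the solver for a counterexample

if solver.check() == unsat:
    print("verified: abs(x) >= 0 for all integers x")
else:
    print("counterexample:", solver.model())
```

The same negate-and-check pattern underlies many source-code analysers: program paths and assertions are translated into solver terms, and an `unsat` answer means the assertion cannot be violated.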
Abstract:
Transmission of Cherenkov light through the atmosphere is strongly influenced by the optical clarity of the atmosphere and the prevailing weather conditions, so the performance of telescopes measuring this light depends on atmospheric effects. This thesis presents software and hardware developed to implement a prototype sky-monitoring system for the proposed next-generation gamma-ray telescope array, VERITAS. The system, consisting of a CCD camera and a far-infrared pyrometer, was successfully installed and tested on the ten-metre atmospheric Cherenkov imaging telescope operated by the VERITAS Collaboration at the F.L. Whipple Observatory in Arizona. The thesis also presents the results of observations of the BL Lacertae object 1ES1959+650 made with the Whipple ten-metre telescope. The observations provide evidence for TeV gamma-ray emission from this object at a level of more than 15 standard deviations above background. This represents the first unequivocal detection of the object at TeV energies, making it only the third extragalactic source seen at such levels of significance in this energy range. The flux variability of the source on a number of timescales is also investigated.
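For context on the quoted detection level: significances in TeV gamma-ray astronomy are conventionally computed with Eq. 17 of Li & Ma (1983) from on-source and off-source event counts. The thesis does not state which estimator it used, and the counts below are made up for illustration.

```python
# Li & Ma (1983), Eq. 17: detection significance from N_on on-source counts,
# N_off off-source counts, and alpha = t_on / t_off, the exposure ratio.
from math import log, sqrt

def li_ma_significance(n_on: float, n_off: float, alpha: float) -> float:
    term_on = n_on * log(((1 + alpha) / alpha) * (n_on / (n_on + n_off)))
    term_off = n_off * log((1 + alpha) * (n_off / (n_on + n_off)))
    return sqrt(2.0 * (term_on + term_off))

# Made-up counts, not the thesis data: 450 on-source events against an
# expected background of alpha * 900 = 300 yields roughly a 6.8 sigma excess.
print(round(li_ma_significance(n_on=450, n_off=900, alpha=1/3), 1))  # 6.8
```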