859 results for Point-of-care systems


Relevance:

100.00%

Publisher:

Abstract:

This systematic review aimed to evaluate whether the internal connection is more efficient than the external connection, and its associated influencing factors. A specific question was formulated according to the Population, Intervention, Control, and Outcome (PICO) framework: Is the internal connection more efficient than the external connection from the mechanical, biological, and esthetic points of view? An electronic search of the MEDLINE and Web of Knowledge databases was performed by two independent reviewers for relevant studies published in English up to November 2013. The keywords used in the search were combinations of "dental implant" with "internal connection", "Morse connection", or "external connection". Selected studies were randomized clinical trials, prospective or retrospective studies, and in vitro studies with the clear aim of investigating the use of the internal and/or external implant connection. From an initial screening yield of 674 articles, 64 potentially relevant articles were selected after evaluation of their titles and abstracts. Full texts of these articles were obtained, with 29 articles fulfilling the inclusion criteria. The Morse taper connection showed the best sealing ability. Concerning crestal bone loss, internal connections presented better results than external connections. The limitation of the present study was the absence of randomized clinical trials investigating whether the internal connection is more efficient than the external connection. The external and internal connections have different mechanical, biological, and esthetic characteristics. Although all systems show adequate success rates and effectiveness, crestal bone levels are better maintained around internal connections than around external connections. The Morse taper connection seems to be more efficient concerning biological aspects, allowing lower bacterial leakage and bone loss in single implants, including in esthetic regions. Additionally, this connection type can be successfully indicated for fixed partial prostheses and overdenture planning, since it exhibits high mechanical stability.

Relevance:

100.00%

Publisher:

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and systems of interacting agents as fundamental abstractions for designing, developing, and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies has become a central point of the scientific activity in this field. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage, still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating the methodologies: a meta-model specifies the concepts, rules, and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the ideas underpinning the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to model all the aspects of multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is at least clear that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent systems community, so the environment should be explicitly accounted for, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent its spatial structure, whether logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems.
The research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
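The meta-model concepts discussed above lend themselves to a compact illustration. The sketch below is purely illustrative (it is not the actual SODA meta-model, and all names are hypothetical): it renders agents, environment abstractions, topology abstractions, and a multi-layered system description as plain data structures.

```python
# Illustrative sketch only: a toy rendering of the meta-model ideas above
# (agents, environment abstractions, topology abstractions, layering).
# All names are hypothetical; this is not the actual SODA meta-model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Agent:
    name: str                      # autonomous, goal-oriented entity

@dataclass
class EnvironmentAbstraction:
    name: str
    functions: List[str] = field(default_factory=list)  # encapsulated services

@dataclass
class TopologyAbstraction:
    name: str                      # logical or physical spatial structure
    places: List[str] = field(default_factory=list)

@dataclass
class Layer:
    level: int                     # position in the multi-layered description
    agents: List[Agent] = field(default_factory=list)
    env: List[EnvironmentAbstraction] = field(default_factory=list)
    topology: List[TopologyAbstraction] = field(default_factory=list)

# A multi-agent system described at two abstraction levels:
mas = [
    Layer(0, agents=[Agent("warehouse")],
          env=[EnvironmentAbstraction("stock-db", ["query", "update"])]),
    Layer(1, agents=[Agent("picker"), Agent("carrier")],
          topology=[TopologyAbstraction("floor-plan", ["dock", "aisle-1"])]),
]
```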

Relevance:

100.00%

Publisher:

Abstract:

The subject of this Ph.D. research thesis is the development and application of multiplexed analytical methods based on bioluminescent whole-cell biosensors. One of the main goals of analytical chemistry is multianalyte testing, in which two or more analytes are measured simultaneously in a single assay. The advantages of multianalyte testing are work simplification, high throughput, and a reduction in the overall cost per test. The availability of multiplexed portable analytical systems is of particular interest for on-field analysis of clinical, environmental, or food samples, as well as for the drug discovery process. To allow highly sensitive and selective analysis, these devices should combine biospecific molecular recognition with ultrasensitive detection systems. To address the current need for rapid, highly sensitive, and inexpensive devices for obtaining more data from each sample, genetically engineered whole-cell biosensors as biospecific recognition elements were combined with ultrasensitive bioluminescence detection techniques. Genetically engineered cell-based sensing systems were obtained by introducing into bacterial, yeast, or mammalian cells a vector expressing a reporter protein whose expression is controlled by regulatory proteins and promoter sequences. The regulatory protein is able to recognize the presence of the analyte (e.g., compounds with hormone-like activity, heavy metals) and consequently to activate the expression of the reporter protein, which can be readily measured and directly related to the bioavailable concentration of the analyte in the sample. Bioluminescence represents the ideal detection principle for miniaturized analytical devices and multiplexed assays thanks to its high detectability in small sample volumes, allowing accurate signal localization and quantification. The first chapter of this dissertation discusses the development of improved bioluminescent proteins emitting at different wavelengths, with increased thermostability, improved emission decay kinetics, and better spectral resolution. The second chapter focuses mainly on the use of these proteins in the development of whole-cell-based assays with improved analytical performance. In particular, since the main drawback of whole-cell biosensors is the high variability of their analyte-specific response, mainly caused by variations in cell viability due to nonspecific effects of the sample matrix, an additional bioluminescent reporter was introduced to correct the analytical response, thus increasing the robustness of the bioassays. The feasibility of using a combination of two or more bioluminescent proteins to obtain biosensors with internal signal correction, or for the simultaneous detection of multiple analytes, was demonstrated by developing a dual-reporter yeast-based biosensor for measuring androgenic activity and a triple-reporter mammalian cell-based biosensor for simultaneously monitoring the activation of two CYP450 enzymes involved in cholesterol degradation, using two spectrally resolved intracellular luciferases and a secreted luciferase as a control for cell viability. The third chapter presents the development of a portable multianalyte detection system.
In order to develop a portable system that can be used outside the laboratory environment, even by non-skilled personnel, cells were immobilized in a new biocompatible and transparent polymeric matrix within a modified clear-bottom black 384-well microtiter plate to obtain a bioluminescent cell array. The cell array was placed in contact with a portable charge-coupled device (CCD) light sensor able to localize and quantify the luminescent signal produced by the different bioluminescent whole-cell biosensors. This multiplexed biosensing platform containing whole-cell biosensors was successfully used to measure the overall toxicity of a given sample, as well as to obtain dose-response curves for heavy metals and to detect hormonal activity in clinical samples (PCT/IB2010/050625: "Portable device based on immobilized cells for the detection of analytes." Michelini E, Roda A, Dolci LS, Mezzanotte L, Cevenini L, 2010). At the end of the dissertation, future development steps are discussed towards a point-of-care testing (POCT) device that combines portability, minimal sample pre-treatment, and highly sensitive multiplexed assays within a short assay time. In this POCT perspective, field-flow fractionation (FFF) techniques, in particular the gravitational variant (GrFFF), which exploits the Earth's gravitational field to structure the separation, were investigated for cell fractionation, characterization, and isolation. Thanks to the simplicity of its equipment, amenable to miniaturization, the GrFFF technique appears particularly suited for implementation in POCT devices and may be used as an integrated pre-analytical module, applied directly to raw samples to drive target analytes to the modules where biospecific recognition reactions based on ultrasensitive bioluminescence detection occur, providing an increase in overall analytical output.
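The internal signal correction described above amounts to normalizing the analyte-driven reporter signal by the constitutive viability reporter, so that matrix effects acting on cell viability cancel out. A minimal sketch of that idea, with hypothetical names and numbers:

```python
# Minimal sketch of the internal signal correction idea described above:
# an analyte-driven luciferase signal is normalized by a constitutive
# viability reporter, so matrix effects on cell viability cancel out.
# All names and values are hypothetical illustrations.

def corrected_response(analyte_rlu: float, viability_rlu: float,
                       viability_rlu_control: float) -> float:
    """Normalize the analyte-specific signal by relative cell viability."""
    viability_factor = viability_rlu / viability_rlu_control
    return analyte_rlu / viability_factor

# Sample whose matrix halves cell viability: the raw analyte signal is
# depressed, but the corrected value recovers the true response.
corrected = corrected_response(analyte_rlu=5_000, viability_rlu=10_000,
                               viability_rlu_control=20_000)
print(corrected)  # 10000.0 -> matrix-corrected analyte response
```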

Relevance:

100.00%

Publisher:

Abstract:

Nowadays microfluidics is becoming an important technology in many chemical and biological processing and analysis applications. The potential to replace large-scale conventional laboratory instrumentation with miniaturized, self-contained systems, called lab-on-a-chip (LOC) or point-of-care testing (POCT) devices, offers a variety of advantages such as low reagent consumption, faster analysis, and the capability of operating on a massively parallel scale in order to achieve high throughput. Micro-electro-mechanical systems (MEMS) technologies enable both the fabrication of miniaturized systems and the development of compact and portable devices. The work described in this dissertation is directed towards the development of micromachined separation devices for both high-speed gas chromatography (HSGC) and gravitational field-flow fractionation (GrFFF) using MEMS technologies. Concerning HSGC, a complete platform of the three MEMS-based GC core components (injector, separation column, and detector) is designed, fabricated, and characterized. The microinjector consists of a set of pneumatically driven microvalves based on a polymeric actuating membrane. Experimental results demonstrate that the microinjector guarantees low dead volumes, fast actuation times, a wide operating temperature range, and high chemical inertness. The separation column is an all-silicon microcolumn with a nearly circular channel cross-section; extensive characterization has shown separation performance very close to the theoretical ideal. A thermal conductivity detector (TCD) is chosen as the most suitable detector for miniaturization, since reducing the volume of the detector chamber increases mass sensitivity and reduces dead volumes. The micro-TCD shows good sensitivity and a very wide dynamic range. Finally, a feasibility study for miniaturizing a channel suited to GrFFF is performed. The proposed GrFFF microchannel is at an early stage of development, but represents a first step towards the realization of a highly portable and potentially low-cost POCT device for biomedical applications.
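As a point of reference for the claim that the microcolumn performs close to the theoretical ideal: the usual yardstick for an open-tubular column of circular cross-section is the Golay equation for plate height versus carrier-gas velocity. The sketch below evaluates it; all numerical parameters are hypothetical illustrations, not the values used in this thesis.

```python
# Illustrative sketch: the Golay equation for an open-tubular GC column of
# circular cross-section, against which "nearly ideal" performance of a
# microcolumn is typically judged. All numbers are hypothetical.
import numpy as np

def golay_plate_height(u, d_c, d_f, k, D_m, D_s):
    """Plate height H(u) [m] vs. carrier-gas linear velocity u [m/s]."""
    b = 2.0 * D_m / u                                   # longitudinal diffusion
    c_m = (1 + 6*k + 11*k**2) / (96 * (1 + k)**2) \
          * d_c**2 * u / D_m                            # mobile-phase mass transfer
    c_s = 2*k / (3 * (1 + k)**2) * d_f**2 * u / D_s     # stationary-phase term
    return b + c_m + c_s

u = np.linspace(0.05, 1.0, 200)                 # m/s
H = golay_plate_height(u, d_c=100e-6, d_f=0.2e-6, k=2.0,
                       D_m=1e-5, D_s=1e-9)      # hypothetical column/gas values
print(f"H_min = {H.min()*1e6:.1f} um at u = {u[H.argmin()]:.2f} m/s")
```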

Relevance:

100.00%

Publisher:

Abstract:

Infectious complications associated with implants account for a large share of all hospital-acquired infections and drive healthcare costs up significantly. Bacterial colonization of implant surfaces entails serious medical consequences that can, under some circumstances, be fatal. Despite extensive research activity in the field of antibacterial surface coatings, the spectrum of effective substances is limited, owing to the adaptability of various microorganisms and their development of resistance. The investigation and development of new antibacterial materials is therefore of fundamental importance. In the present work, several systems based on polymer nanoparticles and inorganic/polymer composite materials were developed as alternatives to existing antibacterial surface coatings. Polymer particles find application in many different areas, since their size, composition, and morphology can be tailored in versatile ways. The miniemulsion technique allows, among other things, the preparation of functional polymer nanoparticles in the size range of 50-500 nm. It was applied in the first system to synthesize PEGylated polystyrene nanoparticles, whose anti-adhesive potential against P. aeruginosa was evaluated. In the second system, so-called contact-active colloidal dispersions were developed, which showed bacteriostatic properties against S. aureus. In analogy to the first system, polystyrene nanoparticles were functionalized with quaternary ammonium groups by copolymerization in miniemulsion, with the previously quaternized, surface-active monomer (2-dimethylamino)ethyl methacrylate (qDMAEMA) serving as costabilizer. The antibacterial properties were optimized in the subsequent system: the surface-active monomer qDMAEMA was polymerized into a surface-active polyelectrolyte, which was converted into the corresponding polyelectrolyte nanoparticles using a combined miniemulsion and solvent-evaporation technique. Owing to its surface-active properties, the polyelectrolyte formed stable particle dispersions without the addition of further surfactants. The selective toxicity of the polyelectrolyte nanoparticles towards S. aureus, as opposed to body cells, underscores their promising potential as a bactericidal, contact-active reagent. Because of their antibacterial properties, ZnO nanoparticles were selected and integrated into various release systems. Highly defined, faceted ZnO nanocrystals with a mean diameter of 23 nm were synthesized by thermal decomposition of the precursor material. Their subsequent encapsulation in poly(L-lactide) latex particles yielded new antibacterial, UV-responsive hybrid nanoparticles. Photocatalytic activation of the ZnO by UV irradiation shortened the degradation of the ZnO/PLLA hybrid nanoparticles significantly, from several months to several weeks, opening up the possibility of a controlled release of ZnO. In the following system, thin composite films of poly(N-isopropylacrylamide) hydrogel layers with embedded ZnO nanoparticles were produced and employed as bactericidal surface coatings against E. coli.
With a minimal ZnO content, the films showed antibacterial activity comparable to silver-based coatings, and the ZnO content can be adjusted relatively easily via the film thickness. Furthermore, films with bactericidal ZnO concentrations proved non-cytotoxic towards body cells. In summary, several promising antibacterial prototypes were developed that can be further tailored and optimized as potential implant coatings for their respective applications.

Relevance:

100.00%

Publisher:

Abstract:

Cardiac troponin I (cTnI) and T (cTnT) have a high sequence homology across phyla and are sensitive and specific markers of myocardial damage. The purpose of this study was to evaluate the Cardiac Reader, a human point-of-care system for the determination of cTnT and myoglobin, and the Abbott Axsym System for the determination of cTnI and creatine kinase isoenzyme MB (CK-MB) in healthy dogs and in dogs at risk for acute myocardial damage because of gastric dilatation-volvulus (GDV) and blunt chest trauma (BCT). In healthy dogs (n = 56), cTnI was below detection limits (<0.1 microg/L) in 35 of 56 dogs (reference range 0-0.7 microg/L), and cTnT was not measurable (<0.05 ng/mL) in all but 1 dog. At presentation, cTnI, CK-MB, myoglobin, and lactic acid were all significantly higher in dogs with GDV (n = 28) and BCT (n = 8) than in control dogs (P < .001), but cTnT was significantly higher only in dogs with BCT (P = .033). Increased cTnI or cTnT values were found in 26 of 28 (highest values 1.1-369 microg/L) and 16 of 28 dogs (0.1-1.7 ng/mL) with GDV, and in 6 of 8 (2.3-82.4 microg/L) and 3 of 8 dogs (0.1-0.29 ng/mL) with BCT, respectively. In dogs suffering from GDV, cTnI and cTnT increased further within the first 48 hours (P < .001). Increased cardiac troponins suggestive of myocardial damage occurred in 93% of dogs with GDV and 75% with BCT. cTnI appeared more sensitive, but cTnT may be a negative prognostic indicator in GDV. Both systems tested seemed applicable for the measurement of canine cardiac troponins, with the Cardiac Reader particularly suitable for use in emergency settings.

Relevance:

100.00%

Publisher:

Abstract:

For countless communities around the world, acquiring access to safe drinking water is a daily challenge which many organizations endeavor to meet. The villages in the interior of Suriname have been the focus of many improved drinking water projects, as most communities are without year-round access. Unfortunately, as many as 75% of the systems in Suriname fail within several years of implementation. These communities, scattered along the rivers and throughout the jungle, lack many of the resources required to sustain a centralized water treatment system. However, the centralized system in the village of Bendekonde on the Upper Suriname River has been operational for over 10 years and is often touted by other communities. The Bendekonde system is praised even though its technology does not differ significantly from that of other, failed systems. Many of the water systems in the interior fail because the community lacks the resources to maintain them, and typically, as a system becomes more complex, its demand for resources grows. Alternatives to centralized systems include technologies such as point-of-use water filters, which can greatly reduce the need for outside resources. In particular, ceramic point-of-use water filters offer a technology that can be reasonably managed in a low-resource setting such as the interior of Suriname. This report investigates the appropriateness and effectiveness of ceramic filters constructed with local Suriname clay and compares their treatment effectiveness to that of the Bendekonde system. Results of this study showed that functional filters could be produced from Surinamese clay and that, in a controlled laboratory setting, they were more effective at removing total coliform than the field performance of the Bendekonde system; however, the Bendekonde system was more successful at removing E. coli. In a life-cycle assessment, ceramic water filters manufactured in Suriname and used in homes over a lifespan of 2 years were shown to have a lower cumulative energy demand, as well as a lower global warming potential, than a centralized system similar to that used in Bendekonde.

Relevance:

100.00%

Publisher:

Abstract:

Female inmates make up the fastest growing segment of our criminal justice system today. The rapidly increasing number of female prisoners calls for enhanced efforts to strategically plan correctional facilities that address the needs of this growing population, and to work with communities to prevent crime among women. Incarcerated women in the U.S. have an estimated 145,000 minor children, who are predisposed to unique psychosocial problems as a result of parental incarceration. This study examined the patterns of care and outcomes for pregnant inmates and their infants in Texas state prisons between 1994 and 1996. The study population consists of 202 pregnant inmates who delivered in a 2-year period, and a randomly sampled comparison cohort of 804 women from the general Texas population, matched on race and educational level. Both quantitative and qualitative data were used to elucidate the inmates' risk-factor profile, delivery and birth outcomes, and patterns of care during pregnancy. Continuity-of-care issues for this population were also explored. Epidemiologic data were derived from multiple record systems to establish the comparison between the two cohorts. A significantly greater proportion of the inmates had prior lifestyle risk factors (smoking, alcohol, and illicit drug abuse), poorer health status, and worse medical histories. However, most of these pre-existing risk factors showed little manifestation in the current pregnancy. On the basis of maternal labor/delivery outcomes and a number of neonatal indicators, this study found some evidence of better pregnancy outcomes in the inmate cohort than in the comparison group; some possible explanations for this paradox are discussed. Seventeen percent of inmates gave birth to infants with suspected congenital syphilis. The placement patterns for the infants' care immediately after birth were elucidated. In addition to the quantitative data, an ethnographic approach was used to collect qualitative data from a subset of the inmate cohort (n = 20) and 12 care providers. The qualitative data were analyzed for content and themes, giving rise to a detailed description of the inmates' pregnancy experience. Eleven themes emerged from the study's thematic analysis, providing the context for interpreting the epidemiologic data. Meaningful findings in this study were presented in a three-dimensional matrix to shed light on the apparent relationships between outcome indicators and their potential determinants. The suspected "linkages" between the outcomes and their determinants can be used to generate hypotheses for future studies.

Relevance:

100.00%

Publisher:

Abstract:

Although research and clinical interventions for patients with dual disorders have been described since as early as the 1980s, the day-to-day treatment of these patients remains problematic and challenging in many countries. Throughout this book, many approaches and possible pathways have been outlined. Based upon these experiences, some key points can be extracted in order to guide to future developments. (1) New diagnostic approaches are warranted when dealing with patients who have multiple problems, given the limitations of the current categorical systems. (2) Greater emphasis should be placed on secondary prevention and early intervention for children and adolescents at an increased risk of later-life dual disorders. (3) Mental, addiction, and somatic care systems can be integrated, adopting a patient-focused approach to care delivery. (4) Recovery should be taken into consideration when defining treatment intervention and outcome goals. (5) It is important to reduce societal risk factors, such as poverty and early childhood adversity. (6) More resources are needed to provide adequate mental health care in the various countries. The development of European guidance initiatives would provide benefits in many of these areas, making it possible to ensure a more harmonized standard of care for patients with dual disorders.

Relevance:

100.00%

Publisher:

Abstract:

Health Information Exchange (HIE) will play a key part in our nation's effort to improve healthcare. Evidence of HIEs' transformational role in healthcare delivery systems is quite limited, which led us to explore what exists in the healthcare industry that may provide evidence of the effectiveness and efficiency of HIEs. The objective of the study was to find out how many fully functional HIEs use any measurements or metrics to gauge the impact of HIE on quality improvement (QI) and on return on investment (ROI). A web-based survey was used to determine the number of operational HIEs using metrics for QI and ROI. Our study highlights the fact that only 50 percent of the HIEs who responded use or plan to use metrics. However, 95 percent of the respondents believed HIEs improve quality of care, while only 56 percent believed HIE showed a positive ROI. Although operational HIEs present numerous opportunities to demonstrate the business model for improving healthcare quality, evidence documenting the impact of HIEs is lacking.

Relevance:

100.00%

Publisher:

Abstract:

An accurate characterization of near-region propagation of radio waves inside tunnels is of practical importance for the design and planning of advanced communication systems. However, there is not yet a consensus on the propagation mechanism in this region: some authors claim that it follows the free-space model, while others interpret it with the multi-mode waveguide model. This paper clarifies the situation in the near-region of arched tunnels by analytically modeling the division point between the two propagation mechanisms. The procedure is based on combining propagation theory with three-dimensional solid geometry. Three groups of measurements are employed to verify the model in different tunnels at different frequencies. Furthermore, simplified models of the division point in five specific application situations are derived to facilitate the use of the model. The results in this paper help deepen the insight into the propagation mechanism within tunnel environments.
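The paper's analytical model for the division point is not reproduced in the abstract. As a hedged first-order sketch, one criterion commonly used in the tunnel-propagation literature places the division point roughly where the first Fresnel zone grows to fill the tunnel cross-section; the code below works under that assumption only.

```python
# Hedged sketch: first-order estimate of the division point between
# free-space and waveguide propagation in a tunnel, using the common rule
# of thumb that the near (free-space) region ends roughly where the first
# Fresnel zone fills the tunnel cross-section. This is an assumption, not
# the analytical model derived in the paper.
C = 299_792_458.0  # speed of light, m/s

def division_point(width_m: float, height_m: float, freq_hz: float) -> float:
    """Distance [m] at which the maximum first-Fresnel-zone radius,
    sqrt(lam * d) / 2, reaches half the smaller transverse dimension."""
    lam = C / freq_hz
    half_aperture = min(width_m, height_m) / 2.0
    # sqrt(lam * d) / 2 = half_aperture  =>  d = (2 * half_aperture)**2 / lam
    return (2.0 * half_aperture) ** 2 / lam

# Example: a hypothetical 6 m x 5 m tunnel at two frequencies.
for f in (900e6, 2.4e9):
    print(f"{f/1e9:.1f} GHz -> division point ~ {division_point(6, 5, f):.0f} m")
```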

Relevance:

100.00%

Publisher:

Abstract:

The hypothesis of this thesis is: "The optimization of the window, considering energy aspects and indoor environmental quality aspects (hygrothermal, lighting, and acoustic comfort) simultaneously, is compatible, provided that the synergies between them are known and considered from the earliest design stages." The implications of many of the decisions made about the window are currently unknown; for the window to be efficient with respect to all of the aspects mentioned, a tool is needed that provides more information during the design process than is currently available, thus enabling integral optimization according to the specific circumstances of each project. The initial phase of this research offers a first approach to the subject through the state of the art of the window, analyzing the existing regulations, components, performance, experimental elements, and research. It is observed that high energy-efficiency requirements can sometimes reduce the system's performance in relation to indoor environmental quality, which motivates integrating aspects of indoor environmental quality, such as lighting and acoustic performance and air renewal, into the energy analysis. At this point, the need for a comprehensive study incorporating the different aspects is identified, in order to evaluate the synergies between the various functions that the window fulfils. Moreover, the analysis of innovative and experimental solutions reveals the difficulty of determining to what extent such solutions are efficient, since they are complex solutions, not yet characterized and not included in calculation methodologies or in the databases of simulation programs. A second need therefore arises: to generate an experimental methodology for characterizing and analyzing the efficiency of innovative systems. To address this double need, the optimization is approached through an evaluation of the glazed element that integrates energy efficiency and indoor environmental quality, combining theoretical and experimental research. In the theoretical part, simulations, calculations, and information gathering are carried out for different window-opening typologies, in relation to each performance aspect independently (acoustics, lighting, ventilation); despite starting from an integrative approach, this integration proves difficult, and a lack of available tools is detected. In the experimental part, a methodology is developed for evaluating performance and environmental aspects, applicable to innovative elements that are difficult to assess with the theoretical methodology. This evaluation consists of an experimental comparative analysis between the innovative element and a standard element.
To carry out this analysis, two identical spaces, called experimentation modules, were designed and fitted with the two systems. These spaces were monitored, yielding data on energy consumption, temperature, illuminance, and relative humidity. Measurements were taken over a period of nine months, and the results were analyzed and compared, thus obtaining the real behavior of the systems. After the theoretical and experimental analyses, and as a consequence of the need to integrate existing knowledge, a comprehensive evaluation tool for the glazed element is proposed. This tool is developed on the basis of the indoor environmental quality (IEQ) diagnostic procedure of standard UNE 171330, "Indoor environmental quality", incorporating energy efficiency as an additional factor. The first part of the process, the theoretical work and the state of the art, yields the determining parameters and their reference values. On the basis of these parameters, the tool takes the form of a product indicator for windows that integrates all the factors analyzed, developed in accordance with ISO 21929-1:2011, "Sustainability in building construction. Sustainability indicators. Part 1: Framework for the development of indicators and a core set of indicators for buildings" (UNE 21929).
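The thesis does not spell out the indicator's formula in this abstract. As a hypothetical sketch of what a composite product indicator of this kind can look like, the code below normalizes each factor against a reference value and aggregates the scores with weights; every factor, weight, and reference value is an illustrative assumption, not the thesis's actual indicator.

```python
# Hypothetical sketch of a composite window product indicator: each factor
# (energy, acoustic, lighting, air renewal) is normalized against a
# reference value and aggregated with weights. All factors, weights and
# references below are illustrative assumptions.

REFERENCE = {          # reference performance per factor (hypothetical units)
    "energy": 1.2,     # U-value, W/m2K (lower is better)
    "acoustic": 33.0,  # sound reduction index Rw, dB (higher is better)
    "lighting": 0.60,  # visible transmittance (higher is better)
    "air": 0.50,       # air renewal capacity, 1/h (higher is better)
}
LOWER_IS_BETTER = {"energy"}
WEIGHTS = {"energy": 0.4, "acoustic": 0.2, "lighting": 0.2, "air": 0.2}

def window_indicator(measured: dict) -> float:
    """Weighted aggregate of factor scores; 1.0 = meets every reference."""
    score = 0.0
    for factor, ref in REFERENCE.items():
        value = measured[factor]
        ratio = ref / value if factor in LOWER_IS_BETTER else value / ref
        score += WEIGHTS[factor] * min(ratio, 1.5)   # cap outlier factors
    return score

print(window_indicator({"energy": 1.0, "acoustic": 35,
                        "lighting": 0.65, "air": 0.55}))  # ~1.13
```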

Relevance:

100.00%

Publisher:

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power, or throughput are heavily constrained. In order to produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. The problem of finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs constantly grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization methodologies. In this Ph.D. thesis we explore different aspects of the quantization problem and present new methodologies for each of them. Techniques based on extensions of intervals have made it possible to obtain accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art statistical Modified Affine Arithmetic (MAA) based methodology in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that exercise them, and extracts the statistical moments of the system from the partial results.
We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% from the simulation-based reference values. A known drawback of techniques based on extensions of intervals is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals in the system, introduces the noise terms in each group independently, and then combines the results at the end. In this way, the number of noise sources in the system at any given time is controlled and the combinatorial explosion is therefore minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results due to the loss of correlation between noise terms, in order to keep the results as accurate as possible. This thesis also covers the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. We do so by presenting two novel techniques that reduce execution time by approaching the problem from two different angles: first, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization effort; second, the incremental method revolves around the fact that, although we strictly need to guarantee a given confidence level in the simulations for the final results of the optimization process, we can use more relaxed levels, which in turn imply a considerably smaller number of samples, in the initial stages of the process, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be accelerated by factors of up to ×240 for small and medium-sized problems. Finally, this work introduces HOPLITE, an automated, flexible, and modular framework for quantization that includes the implementation of the previous techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new techniques for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API, and give a step-by-step demonstration of its execution. We also show, through a simple example, how new extensions should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
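The incremental method described above can be illustrated with a toy greedy word-length search whose Monte-Carlo error estimate starts with few samples (a relaxed confidence level) and tightens as the search converges. This is a hypothetical sketch of the idea only, not HOPLITE's actual API or algorithms; the error model, cost model, and thresholds are stand-ins.

```python
# Toy sketch of incremental Monte-Carlo word-length optimization: a greedy
# search over fractional bit-widths whose error estimate uses few samples
# far from the solution and more samples as the search converges.
# Datapath, thresholds and schedule are hypothetical stand-ins.
import random

random.seed(0)

def quantize(x: float, frac_bits: int) -> float:
    step = 2.0 ** -frac_bits
    return round(x / step) * step

def mc_noise(frac_bits: list, n_samples: int) -> float:
    """Monte-Carlo estimate of worst output error for a toy 3-signal datapath."""
    err = 0.0
    for _ in range(n_samples):
        a, b, c = (random.uniform(-1, 1) for _ in range(3))
        exact = a * b + c
        quant = quantize(quantize(a, frac_bits[0]) * quantize(b, frac_bits[1])
                         + quantize(c, frac_bits[2]), min(frac_bits))
        err = max(err, abs(exact - quant))
    return err

def greedy_search(max_err: float) -> list:
    bits = [16, 16, 16]                     # generous starting assignment
    samples = 100                           # relaxed confidence early on
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):          # try shrinking each signal
            bits[i] -= 1
            if mc_noise(bits, samples) <= max_err:
                improved = True             # cheaper assignment still accurate
            else:
                bits[i] += 1                # revert the shrink
        samples = min(samples * 4, 10_000)  # tighten confidence as we converge
    return bits

print(greedy_search(max_err=1e-3))
```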

Relevance:

100.00%

Publisher:

Abstract:

In some countries, photovoltaic (PV) technology is at a stage of development at which it can compete with conventional electricity sources in terms of electricity generation costs, i.e., grid parity. A case in point is Germany, where the PV market has reached a mature stage, policy support has been scaled down, and the diffusion rate of PV systems has declined. This development raises a fundamental question: what are the motives for adopting PV systems at grid parity? The point of departure for the relevant literature has been the impact of policy support, of adopters and, more recently, of local solar companies; less attention has been paid to the motivations for adoption at grid parity. This paper presents an in-depth analysis of the diffusion of PV systems, explaining the impact of policy measures, adopters, and system suppliers. Anchored in an extensive, exploratory case study in Germany, we provide a context-specific explanation of the motivations to adopt PV systems at grid parity.

Relevance:

100.00%

Publisher:

Abstract:

By using a simplified model of small open liquid-like clusters with surface effects, in the gas phase, it is shown how the statistical thermodynamics of small systems can be extended to include metastable supersaturated gaseous states not too far from the gas–liquid equilibrium transition point. To accomplish this, one has to distinguish between mathematical divergence and physical convergence of the open-system partition function.
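The abstract does not spell out the model. As a hedged sketch of the kind of open-system partition function involved, a liquid-like cluster of N molecules with a bulk term and a surface correction might be written as follows; the specific form is an assumption in the spirit of classical cluster models, not the paper's exact expressions.

```latex
% Hedged sketch (not the paper's exact model): grand-canonical-style
% partition function of a small open cluster with a surface term.
% \mu: chemical potential, q(T): single-molecule partition function,
% a(T) N^{2/3}: surface free-energy correction for ~N^{2/3} surface molecules.
\Upsilon(\mu, T) \;=\; \sum_{N=0}^{\infty} e^{\beta \mu N}\, Q_N(T),
\qquad
Q_N(T) \;\approx\; q(T)^{N}\, e^{-\beta\, a(T)\, N^{2/3}} .
% Mathematically, the sum diverges whenever e^{\beta\mu} q(T) > 1
% (supersaturation); physically, restricting it to the relevant small
% cluster sizes yields convergent averages for metastable states, which is
% the distinction between mathematical divergence and physical convergence
% drawn above.
```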