954 results for Context-aware applications
Abstract:
Lava flow modeling can be a powerful tool in hazard assessments; however, the ability to produce accurate models is usually limited by a lack of high resolution, up-to-date Digital Elevation Models (DEMs). This is especially obvious in places such as Kilauea Volcano (Hawaii), where active lava flows frequently alter the terrain. In this study, we use a new technique to create high resolution DEMs on Kilauea using synthetic aperture radar (SAR) data from the TanDEM-X (TDX) satellite. We convert raw TDX SAR data into a geocoded DEM using GAMMA software [Werner et al., 2000]. This process can be completed in several hours and permits creation of updated DEMs as soon as new TDX data are available. To test the DEMs, we use the Harris and Rowland [2001] FLOWGO lava flow model combined with the Favalli et al. [2005] DOWNFLOW model to simulate the 3-15 August 2011 eruption on Kilauea's East Rift Zone. Results were compared with simulations using the older, lower resolution 2000 SRTM DEM of Hawaii. Effusion rates used in the model are derived from MODIS thermal infrared satellite imagery. FLOWGO simulations using the TDX DEM produced a single flow line that matched the August 2011 flow almost perfectly, but could not recreate the entire flow field due to the relatively high DEM noise level. The issues with short model flow lengths can be resolved by filtering noise from the DEM. Model simulations using the outdated SRTM DEM produced a flow field that followed a different trajectory to that observed. Numerous lava flows have been emplaced at Kilauea since the creation of the SRTM DEM, leading the model to project flow lines in areas that have since been covered by fresh lava flows. These results show that DEMs can quickly become outdated on active volcanoes, but our new technique offers the potential to produce accurate, updated DEMs for modeling lava flow hazards.
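The DOWNFLOW component of this modelling chain is essentially a stochastically perturbed steepest-descent path calculation over the DEM, which is where DEM resolution and noise matter most. A minimal illustrative sketch of that idea (not the published DOWNFLOW or FLOWGO code), assuming the DEM is available as a NumPy elevation grid and the vent location is given in grid coordinates:

```python
import numpy as np

def steepest_descent_path(dem, start, noise_amp=0.0, max_steps=10000, rng=None):
    """Trace a flow line downhill on a DEM (row, col grid).

    noise_amp > 0 adds random perturbations to neighbour elevations, which is
    the basic idea behind DOWNFLOW-style stochastic path ensembles.
    """
    rng = rng or np.random.default_rng()
    rows, cols = dem.shape
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        best = None
        # Examine the perturbed elevations of the 3x3 neighbourhood
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    z = dem[nr, nc] + noise_amp * rng.standard_normal()
                    if best is None or z < best[0]:
                        best = (z, nr, nc)
        # Stop at a local minimum (pit) or at the grid edge
        if best is None or best[0] >= dem[r, c]:
            break
        r, c = best[1], best[2]
        path.append((r, c))
    return path

# An ensemble of perturbed paths approximates the probable flow area:
# paths = [steepest_descent_path(dem, vent, noise_amp=2.0) for _ in range(100)]
```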
Abstract:
Among the optical structures investigated for optical sensing purposes, a significant amount of research has been conducted on photonic crystal based sensors. A particular advantage of photonic crystal based sensors is that they show superior sensitivity for ultra-small volume sensing. In this study we investigate polarization changes in response to changes in the cover index of magneto-optically active photonic band gap structures. One-dimensional photonic band gap structures fabricated on iron garnet materials yield large polarization rotations at the band gap edges. The enhanced polarization effects serve as an excellent tool for chemical sensing, showing a high degree of sensitivity to changes in the photonic crystal cover refractive index. The one-dimensional waveguide photonic crystals are fabricated on single-layer bismuth-substituted rare earth iron garnet films ((Bi, Y, Lu)3(Fe, Ga)5O12) grown by liquid phase epitaxy on gadolinium gallium garnet substrates. Band gaps have been observed where Bragg scattering conditions link forward-going fundamental waveguide modes to backscattered high-order waveguide modes. Large near-band-edge polarization rotations, which increase progressively with backscattered-mode order, have been experimentally demonstrated for multiple samples with different compositions, film thicknesses and fabrication parameters. Experimental findings are supported by theoretical analysis of Bloch mode polarization states, showing that large near stop-band edge rotations are induced by the magneto-photonic crystal. Theoretical and experimental analysis of the polarization rotation sensitivity to changes in the waveguide photonic crystal cover refractive index shows a monotonic enhancement of the rotation with cover index. The sensor is further developed for selective chemical sensing by employing polypyrrole as the photonic crystal cover layer. Polypyrrole is one of the most extensively studied conducting polymers for selective analyte detection. Successful detection of aqueous ammonia and methanol has been achieved with the polypyrrole-deposited magneto-photonic crystals.
Abstract:
Sensor networks have been an active research area in the past decade due to the variety of their applications. Many studies have been conducted to solve the problems underlying the middleware services of sensor networks, such as self-deployment, self-localization, and synchronization. With these middleware services in place, sensor networks have grown into a mature technology used as a detection and surveillance paradigm for many real-world applications. The individual sensors are small in size, so they can be deployed in areas with limited space to make unobstructed measurements in locations that traditional centralized systems would have trouble reaching. However, there are a few physical limitations to sensor networks that can prevent sensors from performing at their maximum potential. Individual sensors have a limited power supply, and the wireless band can become very cluttered when multiple sensors try to transmit at the same time. Furthermore, individual sensors have a limited communication range, so the network may not have a 1-hop communication topology, and routing can be a problem in many cases. Carefully designed algorithms can alleviate the physical limitations of sensor networks and allow them to be utilized to their full potential. Graphical models are an intuitive choice for designing sensor network algorithms. This thesis focuses on a classic application of sensor networks: detecting and tracking targets. It develops feasible inference techniques for sensor networks using statistical graphical-model inference, binary sensor detection, event isolation and dynamic clustering. The main strategy is to use only binary data for rough global inferences, and then dynamically form small-scale clusters around the target for detailed computations. This framework is then extended to network topology manipulation, so that it can be applied to tracking in different network topology settings. Finally, the system was tested in both simulation and real-world environments. The simulations were performed on various network topologies, from regularly distributed to randomly distributed networks. The results show that the algorithm performs well in randomly distributed networks and hence requires minimal deployment effort. The experiments were carried out in both corridor and open-space settings. An in-home fall-detection system was simulated with real-world settings: it was set up with 30 Bumblebee radars and 30 ultrasonic sensors driven by TI EZ430-RF2500 boards, scanning a typical 800 sq ft apartment. The Bumblebee radars were calibrated to detect a falling human body, and the two-tier tracking algorithm was used on the ultrasonic sensors to track the location of the elderly people.
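The two-tier strategy summarized above (coarse global inference from binary detections only, followed by a small dynamic cluster around the target for detailed computation) can be illustrated with a rough sketch; the centroid estimate, the cluster size k, and the inverse-distance weighting are illustrative assumptions rather than the thesis's actual algorithms.

```python
import numpy as np

def coarse_estimate(positions, detections):
    """Tier 1: rough target location from network-wide binary reports only
    (centroid of the sensors currently reporting a detection)."""
    firing = positions[detections.astype(bool)]
    return firing.mean(axis=0) if len(firing) else None

def dynamic_cluster(positions, center, k=4):
    """Tier 2: form a small cluster of the k sensors nearest the coarse
    estimate; only these nodes perform (and exchange) detailed computations."""
    order = np.argsort(np.linalg.norm(positions - center, axis=1))[:k]
    weights = 1.0 / (np.linalg.norm(positions[order] - center, axis=1) + 1e-9)
    refined = (weights[:, None] * positions[order]).sum(axis=0) / weights.sum()
    return refined, order

positions = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0], [8.0, 4.0]])
detections = np.array([0, 1, 0, 1, 1])
center = coarse_estimate(positions, detections)
refined, members = dynamic_cluster(positions, center)
# In the real system the cluster members would fuse richer measurements
# (e.g. ultrasonic ranges) instead of just their own coordinates.
```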
Abstract:
Colloidal Nano-apatite Particles with Active Luminescent and Magnetic Properties for Biotechnology Applications. The synthesis of functional nano-materials is a burgeoning field that has produced remarkable and consistent breakthroughs over the last two decades. Individual particles have become smaller and shown potential for well-defined functionality. However, there are still unresolved problems, a primary one being the loss of functionality and novelty due to uncontrolled aggregation driven by surface energy considerations. As such, the first design criterion for harnessing the true potential of nanoparticles is to prevent unwanted agglomeration by (1) improving and, if possible, (2) controlling aggregation behavior. This requires specific knowledge of the chemistry of the immediate locale of the intended application, especially for biologically relevant applications. The latter criterion is also application-driven but should be considered, generally, to diversify the range of functional properties that can be achieved. We now have reason to believe that such a novel system with multifunctional capabilities can be synthesized rather conveniently and could have a far-reaching impact on biotechnology and other applications in the near future. We are presently experimenting with the syntheses of spheroidal, metal-doped, colloidal apatite nano-particles (~10 nm) for several potential biomedical applications.
Abstract:
Graphene, a two-dimensional carbon material, exhibits unique properties that promise potential applications in photovoltaic devices. The dye-sensitized solar cell (DSSC) is a representative third-generation photovoltaic device. It is therefore important to synthesize graphene with special structures that possess excellent properties for dye-sensitized solar cells. This dissertation research focused on (1) the effect of oxygen content on the structure of graphite oxide, (2) the stability of graphene oxide solution, (3) the application of graphene precipitated from graphene oxide solution as a counter electrode for DSSCs, (4) the development of a novel synthesis method for three-dimensional graphene with a honeycomb-like structure, and (5) the exploration of honeycomb-structured graphene (HSG) as a counter electrode for DSSCs. Graphite oxide is a crucial precursor for synthesizing graphene sheets via the chemical exfoliation method. The relationship between the oxygen content and the structure of graphite oxide had not yet been explored. In this research, the oxygen content of graphite oxide was tuned by changing the oxidation time, and the effect of oxygen content on the structure of graphite oxide was evaluated. It was found that the saturated ratio of oxygen to carbon is 0.47. The types of functional groups in graphite oxides, which are epoxy, hydroxyl, and carboxyl groups, are independent of oxygen content. However, the interplanar spacing and BET surface area of graphite oxide increase linearly with increasing O/C ratio. Graphene oxide (GO) dissolves easily in water to form a stable homogeneous solution, which can be used to fabricate graphene films and graphene-based composites. This work is the first to evaluate the stability of graphene oxide solution. It was found that introducing strong electrolytes (HCl, LiOH, LiCl) into GO solution causes GO precipitation, indicating that electrostatic repulsion plays a critical role in stabilizing aqueous GO solution. Furthermore, HCl-induced GO precipitation is a feasible approach for depositing GO sheets on a substrate as a Pt-free counter electrode for a DSSC, which exhibited a power conversion efficiency of 1.65%. To enable broad and practical applications, large-scale synthesis with controllable integration of individual graphene sheets is essential. A novel strategy for the synthesis of graphene sheets with a three-dimensional (3D) honeycomb-like structure was invented in this project, based on a simple and novel chemical reaction (Li2O and CO to graphene and Li2CO3). The simultaneous formation of Li2CO3 with graphene not only isolates graphene sheets from each other to prevent graphite formation during the process, but also determines the locally curved shape of the graphene sheets. After removing the Li2CO3, 3D graphene sheets with a honeycomb-like structure were obtained. This is believed to be the first approach to synthesize 3D graphene sheets with a controllable shape. Furthermore, it has been demonstrated that the 3D honeycomb-structured graphene (HSG) possesses excellent electrical conductivity and high catalytic activity. As a result, DSSCs with HSG counter electrodes exhibit an energy conversion efficiency as high as 7.8%, which is comparable to that of an expensive noble-metal Pt electrode.
Abstract:
Enterprise applications are complex software systems that manipulate large amounts of persistent data and interact with the user through a vast and complex user interface. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed using various technologies such as Enterprise Java Beans (EJB) or Java Server Pages (JSP), which in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough. Because those techniques were created to measure quality or provide information about only one aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics to measure quality in J2EE applications considering all their aspects, and to aid their evolution. Using software visualization, we also intend to inspect the structure of J2EE applications and all other aspects that can be investigated through this technique. To do that, we also need to create a unified meta-model including all elements composing a J2EE application.
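The unified meta-model mentioned in the last sentence would have to represent Java classes, EJBs, JSP pages and the XML/SQL artefacts they reference as first-class, cross-linked entities so that metrics and visualizations can span technology boundaries. A minimal sketch of what such a meta-model might look like (the entity names and relations are illustrative assumptions, not the authors' actual design):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entity:
    """Base element of the unified meta-model."""
    name: str

@dataclass
class JavaClass(Entity):
    methods: List[str] = field(default_factory=list)

@dataclass
class EnterpriseBean(Entity):
    implementation: JavaClass = None          # EJB backed by a Java class
    deployment_descriptor: str = ""           # XML artefact

@dataclass
class JspPage(Entity):
    invoked_beans: List[EnterpriseBean] = field(default_factory=list)

@dataclass
class SqlTable(Entity):
    accessed_by: List[JavaClass] = field(default_factory=list)

@dataclass
class J2EEApplication(Entity):
    beans: List[EnterpriseBean] = field(default_factory=list)
    pages: List[JspPage] = field(default_factory=list)
    tables: List[SqlTable] = field(default_factory=list)

# Metrics and visualizations can then be computed over the whole model,
# e.g. how many JSP pages ultimately depend on a given SQL table.
```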
Abstract:
Context-dependent behavior is becoming increasingly important for a wide range of application domains, from pervasive computing to common business applications. Unfortunately, mainstream programming languages do not provide mechanisms that enable software entities to adapt their behavior dynamically to the current execution context. This leads developers to adopt convoluted designs to achieve the necessary runtime flexibility. We propose a new programming technique called Context-oriented Programming (COP) which addresses this problem. COP treats context explicitly, and provides mechanisms to dynamically adapt behavior in reaction to changes in context, even after system deployment at runtime. In this paper we lay the foundations of COP, show how dynamic layer activation enables multi-dimensional dispatch, illustrate the application of COP by examples in several language extensions, and demonstrate that COP is largely independent of other commitments to programming style.
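The central mechanism of COP, behaviour grouped into layers that can be activated and deactivated with dynamic extent, can be approximated even in a language without dedicated COP support. The following Python sketch of dynamic layer activation is only an illustration of the idea (COP language extensions such as ContextL provide this as a first-class construct with proper multi-dimensional dispatch):

```python
from contextlib import contextmanager

_active_layers = []          # dynamically scoped stack of active layers
_layered_methods = {}        # (base function name, layer) -> partial method

@contextmanager
def with_layer(layer):
    """Activate a layer for the dynamic extent of a block."""
    _active_layers.append(layer)
    try:
        yield
    finally:
        _active_layers.remove(layer)

def layered(func):
    """Dispatch to the most recently activated layer that refines func."""
    def dispatch(*args, **kwargs):
        for layer in reversed(_active_layers):
            partial = _layered_methods.get((func.__name__, layer))
            if partial:
                return partial(*args, **kwargs)
        return func(*args, **kwargs)
    return dispatch

def refine(func_name, layer):
    """Register a layer-specific partial definition of a base function."""
    def register(partial):
        _layered_methods[(func_name, layer)] = partial
        return partial
    return register

@layered
def greeting(person):
    return person

@refine("greeting", "formal")
def greeting_formal(person):
    return f"Dear {person}"

print(greeting("Ada"))                  # "Ada"
with with_layer("formal"):
    print(greeting("Ada"))              # "Dear Ada" (context-dependent behaviour)
```

Leaving the `with` block deactivates the layer and restores the base behaviour, which is the kind of runtime adaptability the paper argues mainstream languages do not offer directly.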
Abstract:
We describe the use of log file analysis to investigate whether the use of CSCL applications corresponds to their didactical purposes. As an example, we examine the use of the web-based system CommSy as software support for project-oriented university courses. We present two findings: (1) We suggest measures to shape the context of CSCL applications and support their initial and continuous use. (2) We show how log files can be used to analyze how, when and by whom a CSCL system is used, and thus help to validate further empirical findings. However, log file analyses can only be interpreted reasonably when additional data concerning the context of use are available.
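Analysing how, when and by whom a CSCL system is used boils down to aggregating time-stamped log entries along those three dimensions. A minimal sketch, assuming a hypothetical tab-separated log with one line per event (ISO timestamp, user, action); CommSy's actual log format is not reproduced here:

```python
import csv
from collections import Counter
from datetime import datetime

def analyse_log(path):
    """Aggregate a CSCL usage log: events per user, per action, per weekday."""
    by_user, by_action, by_weekday = Counter(), Counter(), Counter()
    with open(path, newline="") as f:
        for ts, user, action in csv.reader(f, delimiter="\t"):
            when = datetime.fromisoformat(ts)
            by_user[user] += 1
            by_action[action] += 1
            by_weekday[when.strftime("%A")] += 1
    return by_user, by_action, by_weekday

# users, actions, weekdays = analyse_log("commsy_usage.log")
# Such counts only become interpretable alongside contextual data
# (course schedule, group composition), as the abstract notes.
```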
Abstract:
A growing number of articles in popular magazines and journals is bringing the direct manufacture of parts and figures ever more into the awareness of the broader public. Unfortunately, these articles rarely convey a reasonably complete picture of how, and in which areas of life, these techniques will change our everyday lives. This is partly because most articles are very technically oriented and rely only on selective examples. This contribution starts from human needs, as structured, for example, in Maslow's hierarchy of needs, and thereby underlines that 3D printing (or additive manufacturing, also known as rapid prototyping) has already reached all areas of life and is in the process of revolutionizing many of them.
Abstract:
Mixed Reality (MR) aims to link virtual entities with the real world and has many applications, for example in the military and medical domains [JBL+00, NFB07]. In many MR systems, and more precisely in augmented scenes, the application needs to render the virtual part accurately at the right time. To achieve this, such systems acquire data related to the real world from a set of sensors before rendering the virtual entities. A suitable system architecture should minimize delays to keep the overall system delay (also called end-to-end latency) within the requirements for real-time performance. In this context, we propose a compositional modeling framework for MR software architectures in order to formally specify, simulate and validate the time constraints of such systems. Our approach is first based on a functional decomposition of such systems into generic components. The resulting elements, as well as their typical interactions, give rise to generic representations in terms of timed automata. A whole system is then obtained as a composition of the components so defined. To write specifications, a textual language named MIRELA (MIxed REality LAnguage) is proposed, along with the corresponding compilation tools. The generated output contains timed automata in UPPAAL format for simulation and verification of time constraints. These automata may also be used to generate source code skeletons for an implementation on an MR platform. The approach is illustrated first on a small example. A realistic case study is also developed; it is modeled by several timed automata synchronizing through channels and including a large number of time constraints. Both systems have been simulated in UPPAAL and checked against the required behavioral properties.
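The end-to-end latency requirement that the framework verifies formally can be illustrated, in a much cruder way, by composing worst-case component delays along the sensing-to-rendering pipeline; the component names and bounds below are hypothetical placeholders, and the actual approach relies on timed automata model-checked in UPPAAL rather than this simple arithmetic:

```python
# Delay bounds (ms) for a hypothetical MR pipeline, in the spirit of the
# sensor -> processing -> rendering functional decomposition.
components = {
    "tracker_acquisition": (2, 8),     # (best-case, worst-case) in ms
    "sensor_fusion":       (1, 5),
    "scene_registration":  (3, 10),
    "rendering":           (6, 16),
}

budget_ms = 40  # illustrative end-to-end latency requirement

best = sum(lo for lo, _ in components.values())
worst = sum(hi for _, hi in components.values())

print(f"end-to-end latency: {best}-{worst} ms (budget {budget_ms} ms)")
assert worst <= budget_ms, "worst-case pipeline latency exceeds the requirement"
```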
Abstract:
Wireless Multimedia Sensor Networks (WMSNs) promise a wide scope of emerging potential applications in both civilian and military areas, which require visual and audio information to enhance the level of collected information. The transmission of multimedia content requires a minimal video quality level from the user's perspective. However, links in WMSN communications are typically unreliable, as they often experience fluctuations in quality and weak connectivity; thus, the routing protocol must evaluate routes using end-to-end link quality information to increase the packet delivery ratio. Moreover, the use of multiple paths together with key video metrics can enhance the video quality level. In this paper, we propose a video-aware multiple-path hierarchical routing protocol for efficient multimedia transmission over WMSNs, called video-aware MMtransmission. This protocol finds node-disjoint multiple paths and implements an end-to-end link quality estimation with minimal overhead to score the paths. Thus, our protocol assures multimedia transmission with Quality of Experience (QoE) and energy-efficiency support. The simulation results show the benefits of video-aware MMtransmission for disseminating video content by means of energy-efficiency and QoE analysis.
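Scoring node-disjoint paths by end-to-end link quality can be sketched as the product of per-hop delivery probabilities traded off against hop count; the weighting below is an illustrative assumption, not the protocol's published scoring function:

```python
from math import prod

def path_score(per_hop_delivery, alpha=0.8):
    """Score a candidate path from its per-hop delivery probabilities.

    End-to-end delivery is the product of per-hop probabilities; alpha trades
    it off against hop count (longer paths cost more energy and delay).
    """
    e2e = prod(per_hop_delivery)
    return alpha * e2e + (1 - alpha) / len(per_hop_delivery)

paths = {
    "A": [0.98, 0.95, 0.97],        # short, clean path
    "B": [0.99, 0.99, 0.90, 0.99],  # longer path with one weak link
}
ranked = sorted(paths, key=lambda p: path_score(paths[p]), reverse=True)
# A video-aware multipath scheme would then spread frames over the top
# paths, e.g. protecting the most important frames on the best-scored path.
```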
Abstract:
This paper is a summary of the main contributions of the PhD thesis published in [1]. The main research contributions of the thesis are driven by the research question of how to design simple, yet efficient and robust run-time adaptive resource allocation schemes within the communication stack of Wireless Sensor Network (WSN) nodes. The thesis addresses several problem domains with contributions on different layers of the WSN communication stack. The main contributions can be summarized as follows. First, a novel run-time adaptive MAC protocol is introduced, which stepwise allocates the power-hungry radio interface in an on-demand manner when the encountered traffic load requires it. Second, the thesis outlines a methodology for robust, reliable and accurate software-based energy estimation, calculated at network run-time on the sensor node itself. Third, the thesis evaluates several Forward Error Correction (FEC) strategies to adaptively allocate the correctional power of Error Correcting Codes (ECCs) to cope with temporally and spatially variable bit error rates. Fourth, in the context of TCP-based communications in WSNs, the thesis evaluates distributed caching and local retransmission strategies to overcome the performance-degrading effects of packet corruption and transmission failures when transmitting data over multiple hops. The performance of all developed protocols is evaluated on a self-developed real-world WSN testbed; the protocols achieve superior performance over selected existing approaches, especially where traffic load and channel conditions are subject to rapid variations over time.
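Software-based energy estimation of the kind mentioned above usually amounts to on-node bookkeeping of how long each component spends in each power state, multiplied by a per-state current draw. A minimal sketch with placeholder current values (real figures would come from datasheets and on-node calibration, not from this example):

```python
# Illustrative current draw per component state (mA); placeholders only.
CURRENT_MA = {
    ("radio", "rx"): 19.7,
    ("radio", "tx"): 17.4,
    ("radio", "sleep"): 0.02,
    ("mcu", "active"): 1.8,
    ("mcu", "sleep"): 0.005,
}
VOLTAGE = 3.0  # supply voltage in volts

class EnergyEstimator:
    """On-node bookkeeping of per-state durations -> energy estimate."""
    def __init__(self):
        self.time_s = {key: 0.0 for key in CURRENT_MA}

    def account(self, component, state, duration_s):
        self.time_s[(component, state)] += duration_s

    def energy_mj(self):
        # mA * V * s = mJ
        return sum(CURRENT_MA[k] * VOLTAGE * t for k, t in self.time_s.items())

est = EnergyEstimator()
est.account("radio", "rx", 0.120)    # radio listened for 120 ms
est.account("radio", "tx", 0.004)    # one packet transmission
est.account("mcu", "active", 0.200)
print(f"estimated energy: {est.energy_mj():.2f} mJ")
```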
Abstract:
The Internet of Things (IoT) is attracting considerable attention from universities, industry, citizens and governments for applications such as healthcare, environmental monitoring and smart buildings. IoT enables network connectivity between smart devices at all times, everywhere, and about everything. In this context, Wireless Sensor Networks (WSNs) play an important role in increasing the ubiquity of networks with smart devices that are low-cost and easy to deploy. However, sensor nodes are restricted in terms of energy, processing and memory. Additionally, low-power radios are very sensitive to noise, interference and multipath distortion. Against this background, this article proposes a routing protocol, Routing by Energy and Link quality (REL), for IoT applications. To increase reliability and energy-efficiency, REL selects routes on the basis of a proposed end-to-end link quality estimator mechanism, residual energy and hop count. Furthermore, REL proposes an event-driven mechanism to provide load balancing and avoid the premature energy depletion of nodes/networks. Performance evaluations were carried out using simulation and testbed experiments to show the impact and benefits of REL in small- and large-scale networks. The results show that REL increases network lifetime and service availability, as well as the quality of service of IoT applications. It also provides an even distribution of scarce network resources and reduces the packet loss rate, compared with well-known protocols.
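A route-selection rule driven by end-to-end link quality, residual energy and hop count, as REL's is described, can be sketched as a weighted score over candidate routes; the weights and the min-based aggregation are illustrative assumptions rather than REL's actual formula:

```python
def route_score(route, w_lqi=0.5, w_energy=0.3, w_hops=0.2, max_hops=10):
    """Score a candidate route for an IoT/WSN routing decision.

    route: list of (link_quality in [0,1], residual_energy in [0,1]) per hop.
    The weakest link/node dominates, hence the use of min() here.
    """
    link_quality = min(lq for lq, _ in route)
    residual_energy = min(e for _, e in route)
    hop_penalty = 1.0 - len(route) / max_hops
    return w_lqi * link_quality + w_energy * residual_energy + w_hops * hop_penalty

candidates = {
    "via_node_7": [(0.90, 0.8), (0.85, 0.6)],
    "via_node_3": [(0.95, 0.4), (0.90, 0.9), (0.92, 0.7)],
}
best = max(candidates, key=lambda r: route_score(candidates[r]))
# An event-driven variant would re-score routes when a node's residual
# energy drops below a threshold, to balance load across the network.
```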
Abstract:
Wireless mobile sensor networks are enlarging the Internet of Things (IoT) portfolio with a huge number of multimedia services for smart cities. Safety and environmental monitoring multimedia applications will be part of Smart IoT systems, which aim to reduce emergency response time while also predicting hazardous events. In these mobile and dynamic (possibly disaster) scenarios, a predefined end-to-end path is not a reliable solution; opportunistic routing instead allows routing decisions to be made in a completely distributed manner, using hop-by-hop route decisions based on protocol-specific characteristics. This enables the transmission of video flows of a monitored area/object with Quality of Experience (QoE) support to users, headquarters or IoT platforms. However, existing approaches rely on a single metric for the candidate selection rule, such as link quality or geographic information, which causes a high packet loss rate and reduces the perceived video quality from the human standpoint. This article proposes a cross-layer Link quality and Geographical-aware Opportunistic routing protocol (LinGO), designed for video dissemination in mobile multimedia IoT environments. LinGO improves routing decisions by using multiple metrics, including link quality, geographic location, and energy. The simulation results show the benefits of LinGO compared with well-known routing solutions for video transmission with QoE support in mobile scenarios.
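Opportunistic, hop-by-hop forwarding of the kind LinGO performs ranks the neighbours that overheard a packet using several cross-layer metrics at once. The sketch below combines link quality, geographic progress towards the destination and residual energy with illustrative weights (not LinGO's actual candidate-selection formula):

```python
from math import dist

def candidate_rank(node, dest, sender, w=(0.4, 0.4, 0.2)):
    """Rank a neighbour as an opportunistic forwarder.

    node: dict with 'pos', 'link_quality' (0..1) and 'energy' (0..1).
    Geographic progress = how much closer the candidate is to the destination.
    """
    progress = dist(sender, dest) - dist(node["pos"], dest)
    progress = max(progress, 0.0) / dist(sender, dest)   # normalise to [0, 1]
    w_lq, w_geo, w_en = w
    return w_lq * node["link_quality"] + w_geo * progress + w_en * node["energy"]

sender, dest = (0.0, 0.0), (100.0, 0.0)
neighbours = [
    {"pos": (30.0, 5.0),  "link_quality": 0.9, "energy": 0.7},
    {"pos": (55.0, -3.0), "link_quality": 0.6, "energy": 0.9},
]
forwarders = sorted(neighbours, key=lambda n: candidate_rank(n, dest, sender),
                    reverse=True)
# The best-ranked neighbour that actually received the packet forwards it,
# which is the defining trait of opportunistic routing.
```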
Abstract:
In addition to self-reports and questionnaires, biomarkers are relevant in the diagnosis of and therapy for alcohol use disorders. Traditional biomarkers such as gamma-glutamyl transpeptidase or mean corpuscular volume are indirect biomarkers and are subject to the influence of, among others, age, gender and non-alcohol-related diseases. Ethyl glucuronide (EtG), ethyl sulphate (EtS) and phosphatidylethanol (PEth) are direct metabolites of ethanol that become positive after the intake of ethyl alcohol. They represent useful diagnostic tools for identifying alcohol use even more accurately than traditional biomarkers. Each of these drinking indicators remains positive in serum and urine for a characteristic time window after the cessation of ethanol intake - EtG and EtS in urine for up to 7 days, and EtG in hair for months after ethanol has left the body. Applications include routine clinical use, emergency room settings, proof of abstinence in alcohol rehabilitation programmes, driving-under-the-influence offenders, workplace testing, and assessment of alcohol intake in the context of liver transplantation and foetal alcohol syndrome. Due to their properties, they open up new perspectives for prevention, interdisciplinary cooperation, and the diagnosis of and therapy for alcohol-related problems.