27 results for sensor-Cloud system
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
A rain-on-snow flood occurred in the Bernese Alps, Switzerland, on 10 October 2011, and caused significant damage. As the flood peak was unpredicted by the flood forecast system, questions were raised concerning the causes and the predictability of the event. Here, we aimed to reconstruct the anatomy of this rain-on-snow flood in the Lötschen Valley (160 km2) by analyzing meteorological data from the synoptic to the local scale and by reproducing the flood peak with the hydrological model WaSiM-ETH (Water Flow and Balance Simulation Model), in order to gain process understanding and to evaluate the predictability. The atmospheric drivers of this rain-on-snow flood were (i) sustained snowfall followed by (ii) the passage of an atmospheric river bringing warm and moist air towards the Alps. As a result, intensive rainfall (average of 100 mm day-1) was accompanied by a temperature increase that shifted the 0 °C line from 1500 to 3200 m a.s.l. (meters above sea level) in 24 h, with a maximum increase of 9 K in 9 h. The south-facing slope of the valley received significantly more precipitation than the north-facing slope, leading to flooding only in tributaries along the south-facing slope. We hypothesized that the reason for this very local rainfall distribution was a cavity circulation combined with a seeder-feeder-cloud system enhancing local rainfall and snowmelt along the south-facing slope. By applying and considerably recalibrating the standard hydrological model setup, we showed that both latent and sensible heat fluxes were needed to reconstruct the snow cover dynamics, and that locally high precipitation sums (160 mm in 12 h) were required to produce the estimated flood peak. However, to reproduce the rapid runoff responses during the event, we conceptually represented likely lateral flow dynamics within the snow cover, causing the model to react "oversensitively" to meltwater.
Driving the optimized model with COSMO (Consortium for Small-scale Modeling)-2 forecast data, we still failed to simulate the flood because the COSMO-2 forecast data underestimated both the local precipitation peak and the temperature increase. Thus we conclude that this rain-on-snow flood was, in general, predictable, but required a special hydrological model setup and extensive, locally precise meteorological input data. Although this data quality may not be achievable with forecast data, an additional model with a specific rain-on-snow configuration can provide useful information when rain-on-snow events are likely to occur.
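The reported shift of the 0 °C line can be related to the reported warming with a simple lapse-rate calculation. The sketch below is purely illustrative and assumes a standard environmental lapse rate of 6.5 K km⁻¹, a value not stated in the abstract:

```python
# Back-of-the-envelope check: how far does the freezing level rise
# for a given near-surface warming, assuming a fixed lapse rate?

LAPSE_RATE_K_PER_KM = 6.5  # assumed standard environmental lapse rate

def freezing_level_rise_m(delta_t_k: float) -> float:
    """Rise of the 0 degC line (m) implied by a warming of delta_t_k kelvin."""
    return delta_t_k / LAPSE_RATE_K_PER_KM * 1000.0

# The abstract reports a 9 K increase in 9 h; under the assumed lapse rate
# this alone would lift the freezing level by roughly 1.4 km.
print(round(freezing_level_rise_m(9.0)))
```

That the observed shift (1700 m in 24 h) exceeds this estimate is consistent with the abstract's picture of continued warm, moist advection by the atmospheric river beyond the 9-hour temperature pulse.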
Abstract:
An odorant's code is represented by activity in a dispersed ensemble of olfactory sensory neurons in the nose, by activation of a specific combination of mitral cell groups in the olfactory bulb, and is considered to be mapped at divergent locations in the olfactory cortex. We present here an in vitro model of the mammalian olfactory system developed to gain easy access to all stations of the olfactory pathway. Mouse olfactory epithelial explants are cocultured with a brain slice that includes the olfactory bulb and olfactory cortex areas and maintains the central olfactory pathway intact and functional. Organotypicity of bulb and cortex is preserved, and mitral cell axons can be traced to their target areas. Calcium imaging shows propagation of mitral cell activity to the piriform cortex. Long-term coculturing with postnatal olfactory epithelial explants restores the peripheral olfactory pathway. Olfactory receptor neurons renew and progressively acquire a mature phenotype. Axons of olfactory receptor neurons grow out of the explant and rewire into the olfactory bulb. The extent of reinnervation exhibits features of a post-lesion recovery. Functional imaging confirms the recovery of part of the peripheral olfactory pathway and shows that activity elicited in olfactory receptor neurons or the olfactory nerves is synaptically propagated into olfactory cortex areas. This model is the first attempt to reassemble a sensory system in culture, from the peripheral sensor to the site of cortical representation. It will increase our knowledge of how neuronal circuits in the central olfactory areas integrate sensory input and counterbalance damage.
Abstract:
With research on Wireless Sensor Networks (WSNs) having matured considerably over the past five years, researchers from universities all over the world have set up testbeds of wireless sensor networks, in most cases to test and evaluate the real-world behavior of developed WSN protocol mechanisms. Although these testbeds differ heavily in the employed sensor node types and the general architectural setup, they all have similar requirements with respect to management and scheduling functionality: like every shared resource, a testbed requires a notion of users, resource reservation features, support for reprogramming and reconfiguration of the nodes, provisions to debug and remotely reset sensor nodes in case of node failures, as well as a solution for collecting and storing experimental data. The TARWIS management architecture presented in this paper aims to provide these functionalities independently of node type and node operating system. TARWIS has been designed as a re-usable management solution for research- and/or education-oriented testbeds of wireless sensor networks, relieving researchers intending to deploy a testbed from the burden of implementing their own scheduling and testbed management solutions from scratch.
Abstract:
Early warning of future hypoglycemic and hyperglycemic events can improve the safety of type 1 diabetes mellitus (T1DM) patients. The aim of this study is to design and evaluate a hypoglycemia/hyperglycemia early warning system (EWS) for T1DM patients under sensor-augmented pump (SAP) therapy.
Abstract:
A new physics-based technique for correcting inhomogeneities present in sub-daily temperature records is proposed. The approach accounts for changes in the sensor-shield characteristics that affect the energy balance, depending on ambient weather conditions (radiation, wind). An empirical model is formulated that reflects the main atmospheric processes and can be used in the correction step of a homogenization procedure. The model accounts for short- and long-wave radiation fluxes (including a snow cover component for albedo calculation) of a measurement system, such as a radiation shield. One part of the flux is further modulated by ventilation. The model requires only cloud cover and wind speed for each day, but detailed site-specific information is necessary. The final model has three free parameters, one of which is a constant offset. The three parameters can be determined, e.g., using the mean offsets for three observation times. The model is developed using the example of the change from the Wild screen to the Stevenson screen in the temperature record of Basel, Switzerland, in 1966. It is evaluated based on parallel measurements of both systems during a sub-period at this location, which were discovered during the writing of this paper. The model can be used in the correction step of homogenization to distribute a known mean step-size to every single measurement, thus providing a reasonable alternative correction procedure for high-resolution historical climate series. It also constitutes an error model, which may be applied, e.g., in data assimilation approaches.
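The abstract does not give the model's functional form. As a purely hypothetical sketch of the structure it describes (a radiative term damped by ventilation plus a constant offset, driven only by daily cloud cover and wind speed, with three free parameters), one might write:

```python
def screen_bias(cloud_cover: float, wind_speed: float,
                c0: float, c1: float, c2: float) -> float:
    """Hypothetical daily temperature bias (K) of a poorly shielded screen.

    cloud_cover : fractional cloud cover in [0, 1]
    wind_speed  : daily mean wind speed in m/s
    c0, c1, c2  : the three free parameters (c0 is the constant offset)

    This is NOT the paper's model, only an illustration of its shape.
    """
    shortwave = 1.0 - cloud_cover           # clear skies -> more radiative heating
    longwave = cloud_cover                  # crude long-wave proxy
    ventilation = 1.0 / (1.0 + wind_speed)  # wind flushes the shield, damping the bias
    return c0 + c1 * shortwave * ventilation + c2 * longwave

# Overcast, calm day: only the offset and the long-wave term remain.
print(round(screen_bias(1.0, 0.0, c0=0.2, c1=1.5, c2=-0.3), 2))
```

Fitting c0, c1, c2 to the mean offsets of three observation times, as the abstract suggests, would then distribute a known mean break size across individual measurements according to each day's weather.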
Abstract:
BACKGROUND: Engineered nanoparticles are becoming increasingly ubiquitous and their toxicological effects on human health, as well as on the ecosystem, have become a concern. Since initial contact with nanoparticles occurs at the epithelium in the lungs (or skin, or eyes), in vitro cell studies with nanoparticles require dose-controlled systems for delivery of nanoparticles to epithelial cells cultured at the air-liquid interface. RESULTS: A novel air-liquid interface cell exposure system (ALICE) for nanoparticles in liquids is presented and validated. The ALICE generates a dense cloud of droplets with a vibrating membrane nebulizer and utilizes combined cloud settling and single particle sedimentation for fast (~10 min; entire exposure), repeatable (<12%), low-stress and efficient delivery of nanoparticles, or dissolved substances, to cells cultured at the air-liquid interface. Validation with various types of nanoparticles (Au, ZnO and carbon black nanoparticles) and solutes (such as NaCl) showed that the ALICE provided spatially uniform deposition (<1.6% variability) and had no adverse effect on the viability of a widely used alveolar human epithelial-like cell line (A549). The cell-deposited dose can be controlled with a quartz crystal microbalance (QCM) over a dynamic range of at least 0.02-200 µg/cm². The cell-specific deposition efficiency is currently limited to 0.072 (7.2% for two commercially available 6-er transwell plates), but a deposition efficiency of up to 0.57 (57%) is possible for better cell coverage of the exposure chamber. Dose-response measurements with ZnO nanoparticles (0.3-8.5 µg/cm²) showed significant differences in mRNA expression of pro-inflammatory (IL-8) and oxidative stress (HO-1) markers when comparing submerged and air-liquid interface exposures. Both exposure methods showed no cellular response below 1 µg/cm² ZnO, which indicates that ZnO nanoparticles are not toxic at occupationally allowed exposure levels.
CONCLUSION: The ALICE is a useful tool for dose-controlled nanoparticle (or solute) exposure of cells at the air-liquid interface. Significant differences between cellular response after ZnO nanoparticle exposure under submerged and air-liquid interface conditions suggest that pharmaceutical and toxicological studies with inhaled (nano-)particles should be performed under the more realistic air-liquid interface, rather than submerged cell conditions.
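The relationship between nebulized mass, deposition efficiency, and cell-deposited dose described above is simple arithmetic. The sketch below uses the 7.2% efficiency reported in the abstract; the nebulized mass and culture area are invented for illustration:

```python
def delivered_dose_ug_per_cm2(nebulized_mass_ug: float,
                              deposition_efficiency: float,
                              cell_area_cm2: float) -> float:
    """Mass per unit cell area reaching the cultures (ug/cm^2)."""
    return nebulized_mass_ug * deposition_efficiency / cell_area_cm2

# Hypothetical run: 1000 ug nebulized, the reported 7.2% cell-specific
# efficiency, and an assumed 10 cm^2 total culture area.
dose = delivered_dose_ug_per_cm2(1000.0, 0.072, 10.0)
print(dose)  # falls within the QCM-controllable range of 0.02-200 ug/cm^2
```

In practice the QCM provides the deposited-mass measurement directly, so a computation like this would serve only for planning how much suspension to nebulize for a target dose.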
Abstract:
Recent advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing environmental conditions and number of users, application performance might suffer, leading to Service Level Agreement (SLA) violations and inefficient use of hardware resources. We introduce a system for controlling the complexity of scaling applications composed of multiple services using mechanisms based on fulfillment of SLAs. We present how service monitoring information can be used in conjunction with service level objectives, predictions, and correlations between performance indicators for optimizing the allocation of services belonging to distributed applications. We validate our models using experiments and simulations involving a distributed enterprise information system. We show how discovering correlations between application performance indicators can be used as a basis for creating refined service level objectives, which can then be used for scaling the application and improving the overall application's performance under similar conditions.
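The abstract's idea of using correlations between performance indicators together with service level objectives can be sketched in a few lines. The decision rule, threshold values, and sample series below are all invented for illustration, not taken from the paper:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def should_scale_out(latency_ms, cpu_util, slo_ms=200.0, corr_threshold=0.8):
    """Scale out when latency breaches its SLO *and* correlates with CPU load,
    i.e. when adding capacity is likely to help."""
    breaching = latency_ms[-1] > slo_ms
    correlated = pearson(latency_ms, cpu_util) > corr_threshold
    return breaching and correlated

latency = [120, 150, 190, 240]      # ms, trending up with load
cpu = [0.40, 0.55, 0.70, 0.90]      # CPU utilization of the same service
print(should_scale_out(latency, cpu))  # True: SLO breached, strongly correlated
```

The correlation check captures the abstract's point: a discovered link between an application-level indicator and a resource-level indicator turns a raw SLA violation into an actionable, refined service level objective.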
Abstract:
HYPOTHESIS A previously developed image-guided robot system can safely drill a tunnel from the lateral mastoid surface, through the facial recess, to the middle ear, as a viable alternative to conventional mastoidectomy for cochlear electrode insertion. BACKGROUND Direct cochlear access (DCA) provides a minimally invasive tunnel from the lateral surface of the mastoid through the facial recess to the middle ear for cochlear electrode insertion. A safe and effective tunnel drilled through the narrow facial recess requires a highly accurate image-guided surgical system. Previous attempts have relied on patient-specific templates and robotic systems to guide drilling tools. In this study, we report on improvements made to an image-guided surgical robot system developed specifically for this purpose and the resulting accuracy achieved in vitro. MATERIALS AND METHODS The proposed image-guided robotic DCA procedure was carried out bilaterally on 4 whole-head cadaver specimens. Specimens were implanted with titanium fiducial markers and imaged with cone-beam CT. A preoperative plan was created using a custom software package wherein relevant anatomical structures of the facial recess were segmented, and a drill trajectory targeting the round window was defined. Patient-to-image registration was performed with the custom robot system to reference the preoperative plan, and the DCA tunnel was drilled in 3 stages with progressively longer drill bits. The position of the drilled tunnel was defined as a line fitted to a point cloud of the segmented tunnel using principal component analysis (PCA function in MATLAB). The accuracy of the DCA was then assessed by coregistering preoperative and postoperative image data and measuring the deviation of the drilled tunnel from the plan. The final step of electrode insertion was also performed through the DCA tunnel after manual removal of the promontory through the external auditory canal.
RESULTS Drilling error was defined as the lateral deviation of the tool in the plane perpendicular to the drill axis (excluding depth error). Errors of 0.08 ± 0.05 mm and 0.15 ± 0.08 mm were measured on the lateral mastoid surface and at the target on the round window, respectively (n = 8). Full electrode insertion was possible in 7 cases. In 1 case, the electrode was partially inserted, with 1 contact pair external to the cochlea. CONCLUSION The purpose-built robot system was able to perform a safe and reliable DCA for cochlear implantation. The workflow implemented in this study mimics the envisioned clinical procedure, showing the feasibility of future clinical implementation.
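The abstract describes fitting the drilled tunnel axis as a line through the segmented point cloud via PCA. A minimal pure-Python equivalent of that fit (power iteration on the covariance matrix standing in for MATLAB's pca function; the test points are a synthetic, noise-free cloud, not study data) might look like:

```python
def fit_line_pca(points):
    """Fit a 3-D line to a point cloud: returns (centroid, unit direction).

    The direction is the principal eigenvector of the covariance matrix,
    found here by power iteration as a stand-in for a full PCA routine.
    """
    n = len(points)
    c = [sum(p[i] for p in points) / n for i in range(3)]
    # 3x3 covariance matrix of the centred cloud
    cov = [[sum((p[i] - c[i]) * (p[j] - c[j]) for p in points) / n
            for j in range(3)] for i in range(3)]
    v = [1.0, 1.0, 1.0]  # power-iteration start vector
    for _ in range(100):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return c, v

# Noise-free sanity check: points lying exactly on the z-axis.
pts = [(0.0, 0.0, float(z)) for z in range(10)]
centroid, direction = fit_line_pca(pts)
print(centroid, direction)  # centroid on the axis, direction ~ (0, 0, 1)
```

With the fitted line in hand, the lateral drilling error at any depth is simply the perpendicular distance from the planned trajectory to this axis, which matches the error definition given in the RESULTS section.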
Abstract:
We describe a system for performing SLA-driven management and orchestration of distributed infrastructures composed of services supporting mobile computing use cases. In particular, we focus on a Follow-Me Cloud scenario in which we consider mobile users accessing cloud-enabled services. We combine an SLA-driven approach to infrastructure optimization with forecast-based performance degradation preventive actions and pattern detection for supporting mobile cloud infrastructure management. We present our system's information model and architecture, including the algorithmic support and the proposed scenarios for system evaluation.
Abstract:
The intention of an authentication and authorization infrastructure (AAI) is to simplify and unify access to different web resources. With a single login, a user can access web applications at multiple organizations. The Shibboleth authentication and authorization infrastructure is a standards-based, open source software package for web single sign-on (SSO) across or within organizational boundaries. It allows service providers to make fine-grained authorization decisions for individual access to protected online resources. The Shibboleth system is a widely used AAI, but only supports protection of browser-based web resources. We have implemented a Shibboleth AAI extension to protect web services using the Simple Object Access Protocol (SOAP). Besides user authentication for browser-based web resources, this extension also provides user and machine authentication for web service-based resources. Although implemented for a Shibboleth AAI, the architecture can be easily adapted to other AAIs.
Abstract:
Virtualisation of cellular networks can be seen as a way to significantly reduce the complexity of the processes required nowadays to provide reliable cellular networks. The Future Communication Architecture for Mobile Cloud Services: Mobile Cloud Networking (MCN) is an EU FP7 Large-scale Integrating Project (IP) funded by the European Commission that focuses on cloud computing concepts to achieve virtualisation of cellular networks. It aims at the development of a fully cloud-based mobile communication and application platform; more specifically, it aims to investigate, implement and evaluate the technological foundations for the mobile communication system of Long Term Evolution (LTE), based on Mobile Network plus Decentralized Computing plus Smart Storage offered as one atomic service: On-Demand, Elastic and Pay-As-You-Go. This paper provides a brief overview of the MCN project and discusses the challenges that need to be solved.
Abstract:
Derivation of probability estimates complementary to geophysical data sets has gained special attention in recent years. Information about the confidence level of provided physical quantities is required to construct an error budget of higher-level products and to correctly interpret the final results of a particular analysis. Regarding the generation of products based on satellite data, a common input consists of a cloud mask which allows discrimination between surface and cloud signals. Further, the surface information is divided into snow and snow-free components. At any step of this discrimination process, a misclassification in a cloud/snow mask propagates to higher-level products and may alter their usability. Within this scope, a novel probabilistic cloud mask (PCM) algorithm suited for the 1 km × 1 km Advanced Very High Resolution Radiometer (AVHRR) data is proposed which provides three types of probability estimates: between cloudy/clear-sky, cloudy/snow and clear-sky/snow conditions. As opposed to the majority of available techniques, which are usually based on the decision-tree approach, in the PCM algorithm all spectral, angular and ancillary information is used in a single step to retrieve probability estimates from precomputed look-up tables (LUTs). Moreover, the issue of deriving a single threshold value for a spectral test was overcome by the concept of a multidimensional information space which is divided into small bins by an extensive set of intervals. The discrimination between snow and ice clouds and the detection of broken, thin clouds were enhanced by means of the invariant coordinate system (ICS) transformation. The study area covers a wide range of environmental conditions, spanning from Iceland through central Europe to northern parts of Africa, which exhibit diverse difficulties for cloud/snow masking algorithms.
The retrieved PCM cloud classification was compared to the Polar Platform System (PPS) version 2012 and Moderate Resolution Imaging Spectroradiometer (MODIS) collection 6 cloud masks, SYNOP (surface synoptic observations) weather reports, the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) vertical feature mask version 3, and the MODIS collection 5 snow mask. The outcomes of the conducted analyses demonstrated the fine detection skills of the PCM method, with results comparable to or better than the reference PPS algorithm.
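The single-step LUT retrieval contrasted with threshold-based decision trees above can be illustrated schematically. The bin edges, feature choices, and probability values below are toy placeholders, not the PCM algorithm's actual tables:

```python
def bin_index(value, edges):
    """Index of the interval in `edges` that contains `value` (clamped)."""
    for i in range(len(edges) - 1):
        if value < edges[i + 1]:
            return i
    return len(edges) - 2

def lut_probability(lut, features, all_edges):
    """Look up a class probability from a precomputed multidimensional LUT.

    Instead of one threshold per spectral test, the whole observation
    vector is mapped to a bin of the joint feature space in a single step.
    """
    idx = tuple(bin_index(f, e) for f, e in zip(features, all_edges))
    return lut[idx]

# Toy 2-D LUT over (reflectance, brightness temperature in K):
edges = ([0.0, 0.3, 0.6, 1.0], [220.0, 250.0, 280.0, 310.0])
lut = {(i, j): 0.1 * (i + j) for i in range(3) for j in range(3)}  # dummy P(cloud)
print(lut_probability(lut, (0.45, 260.0), edges))
```

In the real algorithm the LUT entries would be precomputed relative frequencies of cloudy, clear-sky, and snow conditions per bin, so a single lookup yields all three probability estimates at once.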
Abstract:
The clinical demand for a device to monitor Blood Pressure (BP) in ambulatory scenarios with minimal use of inflation cuffs is increasing. Based on the so-called Pulse Wave Velocity (PWV) principle, this paper introduces and evaluates a novel concept of BP monitor that can be fully integrated within a chest sensor. After a preliminary calibration, the sensor provides non-occlusive beat-by-beat estimations of Mean Arterial Pressure (MAP) by measuring the Pulse Transit Time (PTT) of arterial pressure pulses travelling from the ascending aorta towards the subcutaneous vasculature of the chest. In a cohort of 15 healthy male subjects, a total of 462 simultaneous readings consisting of reference MAP and chest PTT were acquired. Each subject was recorded on three different days: D, D+3 and D+14. Overall, the implemented protocol induced MAP values ranging from 80 ± 6 mmHg at baseline to 107 ± 9 mmHg during isometric handgrip maneuvers. Agreement between reference and chest-sensor MAP values was tested using the intraclass correlation coefficient (ICC = 0.78) and Bland-Altman analysis (mean error = 0.7 mmHg, standard deviation = 5.1 mmHg). The cumulative percentage of MAP values provided by the chest sensor falling within ±5 mmHg of the reference MAP readings was 70%; within ±10 mmHg it was 91%, and within ±15 mmHg, 98%. These results indicate that the chest sensor complies with the British Hypertension Society (BHS) requirements for Grade A BP monitors when applied to MAP readings. Grade A performance was maintained even two weeks after the initial subject-dependent calibration had been performed. In conclusion, this paper introduces a sensor and a calibration strategy to perform MAP measurements at the chest. The encouraging performance of the presented technique paves the way towards an ambulatory-compliant, continuous and non-occlusive BP monitoring system.
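The agreement statistics quoted above (mean error, standard deviation, and the cumulative percentages behind the BHS grading) can be computed from paired readings in a few lines. The paired values below are invented for illustration, not the study's data:

```python
from statistics import mean, stdev

def bland_altman(reference, estimate):
    """Mean error and SD of error between paired readings (Bland-Altman)."""
    errors = [e - r for r, e in zip(reference, estimate)]
    return mean(errors), stdev(errors)

def pct_within(reference, estimate, limit_mmhg):
    """Percentage of estimates within +/- limit_mmhg of the reference."""
    hits = sum(1 for r, e in zip(reference, estimate)
               if abs(e - r) <= limit_mmhg)
    return 100.0 * hits / len(reference)

# Hypothetical paired MAP readings (mmHg): reference vs. chest sensor.
ref = [80, 85, 92, 100, 107, 95]
est = [82, 84, 96, 99, 110, 94]

mean_err, sd_err = bland_altman(ref, est)
print(pct_within(ref, est, 5))  # BHS grading uses the 5/10/15 mmHg bands
```

The BHS protocol grades a monitor by exactly these cumulative percentages at the 5, 10, and 15 mmHg bands, which is how the abstract arrives at its Grade A claim for MAP.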