949 results for time monitoring
Abstract:
Sandy coasts are vital areas whose preservation and maintenance also involve economic and tourist interests. Moreover, these dynamic environments undergo erosion to different degrees depending on their specific characteristics. For this reason, defence interventions are commonly carried out by combining engineering solutions with management policies that evaluate their effects over time. Monitoring activities are the fundamental instrument for gaining a deep knowledge of the investigated phenomenon. Thanks to technological development, several options are available, both in terms of geomatic surveying techniques and processing tools, allowing high performance and accuracy to be reached. Nevertheless, when the littoral is defined to include both the emerged and the submerged beach, several issues have to be considered. Therefore, the geomatic surveys and all the following steps need to be calibrated for the individual application, with the reference system, accuracy and spatial resolution as primary aspects. This study evaluates the available geomatic techniques, processing approaches, and derived products, aiming to optimise the entire coastal-monitoring workflow by adopting an accuracy-efficiency trade-off. The presented analyses highlight the balance point at which an increase in performance becomes an added value for the obtained products while ensuring proper data management. This perspective can be a helpful instrument for planning monitoring activities according to the specific purposes of the analysis. Finally, the primary uses of the acquired and processed data in monitoring contexts are presented, also considering possible applications of numerical modelling as a supporting tool. Moreover, the theme of coastal monitoring has been addressed throughout this thesis from a practical point of view, linking it to the activities performed by Arpae (Regional agency for prevention, environment and energy of Emilia-Romagna). Indeed, the Adriatic coast of Emilia-Romagna, where sandy beaches particularly exposed to erosion are present, has been chosen as the case study for all the analyses and considerations.
Abstract:
With the aim of moving towards a more sustainable future, there has been a noticeable increase in the installation of Renewable Energy Sources (RES) in power systems in recent years. Besides the evident environmental benefits, RES pose several technological challenges in terms of scheduling, operation, and control of transmission and distribution networks. This has raised the need for smart grids that rely on a suitable distributed measurement infrastructure, for instance based on Phasor Measurement Units (PMUs). Not only are such devices able to estimate a phasor, but they also provide the time information that is essential for real-time monitoring. This Thesis falls within this context by analyzing the uncertainty requirements of PMUs in distribution and transmission applications. Concerning the latter, the reliability of PMU measurements during severe power system events is examined, whereas for the former, typical configurations of distribution networks are studied to derive target uncertainties. The second part of the Thesis is instead dedicated to the application of PMUs in low-inertia power grids. The replacement of traditional synchronous machines with inertia-less RES is progressively reducing the overall system inertia, resulting in faster and more severe events. In this scenario, PMUs may play a vital role, even though no standard requirements or target uncertainties are yet available. This Thesis investigates PMU-based applications in depth, proposing a new inertia index that relies only on local measurements and evaluating their reliability in low-inertia scenarios. It also develops possible uncertainty intervals based on the electrical instrumentation currently used in power systems and assesses interoperability with other devices before and after contingency events.
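As a hedged illustration of how local PMU measurements can be related to system inertia, the sketch below applies the classical swing-equation estimate, H ≈ |ΔP|·f_nom / (2·|RoCoF|), to a synthetic post-contingency frequency trace. It is not the inertia index proposed in the Thesis; all signal names, reporting rates and numerical values are assumptions made for illustration only.

# Hedged sketch: estimating an inertia constant from local PMU measurements
# via the classical swing equation. This is NOT the Thesis' inertia index;
# it only illustrates the kind of local-measurement-based estimate discussed.
# All signals and numbers below are synthetic.

import numpy as np

F_NOM = 50.0          # nominal frequency [Hz]

def estimate_inertia(freq, power_pu, t, t_event):
    """Estimate the inertia constant H [s] right after a contingency."""
    # RoCoF estimated over a short window immediately after the event,
    # where dP = 2 * H / f_nom * df/dt approximately holds (per unit).
    mask = (t >= t_event) & (t <= t_event + 0.5)
    rocof = np.polyfit(t[mask], freq[mask], 1)[0]       # slope in Hz/s
    delta_p = np.mean(power_pu[mask])                   # local power imbalance [p.u.]
    return abs(delta_p) * F_NOM / (2.0 * abs(rocof))

# Synthetic post-event measurements: 0.1 p.u. load step at t = 0.2 s,
# producing a RoCoF of -0.5 Hz/s at a 50 frames/s PMU reporting rate.
t = np.arange(0.0, 1.0, 0.02)
freq = F_NOM - 0.5 * np.clip(t - 0.2, 0, None)
power = np.where(t >= 0.2, 0.1, 0.0)
print(f"Estimated H ≈ {estimate_inertia(freq, power, t, 0.2):.1f} s")   # ≈ 5.0 s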
Abstract:
Riding the wave of recent groundbreaking achievements, artificial intelligence (AI) is currently the buzzword on everybody's lips, and Machine Learning (ML), which allows algorithms to learn from historical data, has emerged as its pinnacle. The multitude of algorithms, each with unique strengths and weaknesses, highlights the absence of a universal solution and poses a challenging optimization problem. In response, automated machine learning (AutoML) navigates vast search spaces under tight time constraints. By lowering entry barriers, AutoML has emerged as a promise of democratizing AI, yet it still faces several challenges. In data-centric AI, the discipline of systematically engineering the data used to build an AI system, the challenge is, put simply, that of configuring data pipelines. We devise a methodology for building effective data pre-processing pipelines in supervised learning as well as a data-centric AutoML solution for unsupervised learning. In human-centric AI, many current AutoML tools were built around algorithmic ideas rather than around the user, raising ethical and social-bias concerns. We contribute by deploying AutoML tools that aim to complement, rather than replace, human intelligence. In particular, we provide solutions for single-objective and multi-objective optimization and showcase the challenges and potential of novel interfaces featuring large language models. Finally, some application areas rely on numerical simulators, often related to earth observation; they tend to be particularly high-impact and address important challenges such as climate change and crop life cycles. We commit to coupling these physical simulators with (Auto)ML solutions towards a physics-aware AI. Specifically, in precision farming, we design a smart irrigation platform that allows real-time monitoring of soil moisture, predicts future moisture values, and estimates water demand to schedule irrigation.
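As a hedged sketch of what configuring a data pre-processing pipeline can look like in practice, the snippet below searches a small space of scaling and dimensionality-reduction operators with scikit-learn. The dataset, operators and search space are illustrative assumptions and do not represent the methodology developed in the thesis.

# Hedged sketch: a tiny AutoML-style search over pre-processing choices.
# Steps, operators and dataset are illustrative, not the thesis' method.

from sklearn.datasets import load_breast_cancer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

pipe = Pipeline([
    ("scale", "passthrough"),     # placeholder: which scaler, if any?
    ("reduce", "passthrough"),    # placeholder: dimensionality reduction?
    ("model", RandomForestClassifier(random_state=0)),
])

# The "search space": alternative pre-processing operators for each step.
search_space = {
    "scale": ["passthrough", StandardScaler(), MinMaxScaler()],
    "reduce": ["passthrough", PCA(n_components=10)],
    "model__n_estimators": [100, 300],
}

search = GridSearchCV(pipe, search_space, cv=5, n_jobs=-1)
search.fit(X, y)
print("Best pipeline:", search.best_params_)
print("CV accuracy  :", round(search.best_score_, 3))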
Abstract:
A person's living space contains many deformable objects, such as paper, clothes and ropes. For a robot to automate daily tasks, it is important that it can work with these deformable objects. Manipulation of deformable objects is a challenging task for robots because these objects have an infinite-dimensional configuration space and are expensive to model, making real-time monitoring, planning and control difficult. It forms a particularly important field of robotics, with relevant applications in sectors such as medicine, food handling, manufacturing, and household chores. This report reviews the approaches that have been used and are currently in use, along with future developments towards achieving this task. My research focuses on the last 10 years, over which I have systematically reviewed many articles to gain a clear understanding of developments in this field. The main contribution is to show the whole landscape of this concept and provide a broad view of how it has evolved. I also explain my research methodology, following my analysis from the past to the present along with my thoughts for the future.
Abstract:
The use of remote sensing is necessary for monitoring forest carbon stocks at large scales. Optical remote sensing, although not the most suitable technique for the direct estimation of stand biomass, offers the advantage of providing large temporal and spatial datasets. In particular, information on canopy structure is encompassed in stand reflectance time series. This study focused on the example of Eucalyptus forest plantations, which have recently attracted much attention as a result of their high expansion rate in many tropical countries. Stand-scale time series of the Normalized Difference Vegetation Index (NDVI) were obtained from MODIS satellite data after a procedure involving un-mixing and interpolation, over about 15,000 ha of plantations in southern Brazil. Comparing the planting date of the current rotation (and therefore the age of the stands) estimated from these time series with the true values provided by the company gave a root mean square error of 35.5 days. Age alone explained more than 82% of stand wood volume variability and 87% of stand dominant height variability. Age variables were combined with other variables derived from the NDVI time series and simple bioclimatic data by means of linear (stepwise) or nonlinear (Random Forest) regressions. The nonlinear regressions gave r-square values of 0.90 for volume and 0.92 for dominant height, and an accuracy of about 25 m³/ha for volume (15% of the average volume) and about 1.6 m for dominant height (8% of the average height). The improvement from including NDVI and bioclimatic data comes from the fact that the NDVI accumulated since the planting date integrates the interannual variability of leaf area index (LAI), light interception by the foliage and growth due, for example, to variations in seasonal water stress. The accuracy of biomass and height predictions was strongly improved by using the NDVI integrated over the first two years after planting, which are critical for stand establishment. These results open perspectives for cost-effective monitoring of biomass at large scales in intensively managed plantation forests. (C) 2011 Elsevier Inc. All rights reserved.
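To make the regression step concrete, the following sketch fits a Random Forest predicting stand volume from age, cumulative NDVI and a simple bioclimatic variable. All data below are synthetic placeholders, not the study's MODIS or inventory data, and the feature set is only loosely inspired by the variables described above.

# Hedged sketch of the nonlinear regression described above: a Random Forest
# predicting stand wood volume from age and NDVI-derived variables.
# All values are synthetic placeholders.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_stands = 500

age_years = rng.uniform(0.5, 7.0, n_stands)                              # rotation age
cum_ndvi = 0.6 * age_years * 365 * rng.uniform(0.8, 1.2, n_stands)       # NDVI integrated since planting
rainfall = rng.uniform(900, 1600, n_stands)                              # bioclimatic variable [mm/yr]

# Synthetic "true" volume: mostly driven by age, modulated by NDVI and climate
volume = 40 * age_years + 0.02 * cum_ndvi + 0.01 * rainfall + rng.normal(0, 15, n_stands)

X = np.column_stack([age_years, cum_ndvi, rainfall])
model = RandomForestRegressor(n_estimators=300, random_state=0)
r2 = cross_val_score(model, X, volume, cv=5, scoring="r2")
print("Cross-validated r² for volume:", r2.mean().round(2))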
Abstract:
Three experiments explored the effectiveness of continuous auditory displays, or sonifications, for conveying information about a simulated anesthetized patient's respiration. Experiment 1 established an effective respiratory sonification. Experiment 2 showed an effect of expertise in the use of respiratory sonification and revealed that some apparent differences in sonification effectiveness could be accounted for by response bias. Experiment 3 showed that sonification helps anesthesiologists to maintain high levels of awareness of the simulated patient's state while performing other tasks, more effectively than when relying upon visual monitoring alone. Overall, sonification of patient physiology beyond traditional pulse oximetry appears to be a viable and useful adjunct to visual monitors. Actual and potential applications of this research include monitoring in a wide variety of busy critical-care contexts.
Abstract:
A Quartz Crystal Microbalance (QCM) was used to monitor the mass changes on a quartz crystal surface containing immobilized lectins that interacted with carbohydrates. The strategy for lectin immobilization was based on a multilayer system composed of Au-cystamine-glutaraldehyde-lectin. Each step of the immobilization procedure was confirmed by FTIR analysis. The system was used to study the interactions of Concanavalin A (ConA) with maltose and of Jacalin with fetuin. The real-time binding of different concentrations of carbohydrate to the immobilized lectin was monitored by QCM measurements, and the data obtained allowed the construction of Langmuir isotherm curves. The association constants determined for the specific interactions analyzed here were (6.4 ± 0.2) × 10⁴ M⁻¹ for Jacalin-fetuin and (4.5 ± 0.1) × 10² M⁻¹ for ConA-maltose. These results indicate that the QCM constitutes a suitable method for the analysis of lectin-carbohydrate interactions, even when assaying low-molecular-mass ligands such as disaccharides. Published by Elsevier B.V.
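As a hedged illustration of how an association constant is extracted from such binding data, the sketch below fits a Langmuir isotherm, Δm = Δm_max·K_a·C / (1 + K_a·C), to synthetic concentration-response values with SciPy. The concentrations, responses and fitted numbers are assumptions for illustration, not the published QCM measurements.

# Hedged sketch: fitting a Langmuir isotherm to equilibrium binding data to
# obtain an association constant K_a. Data are synthetic, not the study's.

import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, dm_max, ka):
    """Langmuir isotherm: bound mass vs free ligand concentration."""
    return dm_max * ka * c / (1.0 + ka * c)

# Synthetic carbohydrate concentrations [M] and QCM mass responses [ng/cm^2]
conc = np.array([1e-6, 5e-6, 1e-5, 5e-5, 1e-4, 5e-4, 1e-3])
true_ka, true_dm = 6.4e4, 120.0
noise = 1 + 0.02 * np.random.default_rng(1).normal(size=conc.size)
resp = langmuir(conc, true_dm, true_ka) * noise

(dm_fit, ka_fit), _ = curve_fit(langmuir, conc, resp, p0=[100.0, 1e4])
print(f"Fitted K_a ≈ {ka_fit:.2e} M^-1, saturation response ≈ {dm_fit:.0f} ng/cm^2")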
Abstract:
Location-based services have given new impetus to the creativity of mobile application developers. The widespread availability of devices with built-in location capabilities has led to the development of applications that manage and present information based on the user's position. Since then, the mobile market has witnessed the appearance of new categories of applications that take advantage of this capability. Among them, remote device monitoring stands out, having assumed growing importance in both the consumer and the business sector. This dissertation begins by presenting the state of the art of the different positioning systems, categorized by their effectiveness in indoor or outdoor environments, as well as different near-real-time communication protocols. An analysis of the current state of the mobile market is also carried out. The market currently comprises several mobile platforms with unique characteristics that make them compete with each other to expand their market share, so a brief study of today's most relevant mobile operating systems is presented. A deeper look is also taken at the architecture of Apple's mobile platform, iOS, which served as the basis for developing a solution optimized for locating and monitoring mobile devices. Monitoring implies an intensive use of energy and bandwidth that today's mobile devices are not able to sustain. Given the high energy consumption of GPS compared with the limited battery life of these devices, a study is presented proposing solutions for managing GPS usage in an optimized way. The high cost of the data plans offered by mobile operators is also considered, so solutions aimed at minimizing bandwidth usage are explored. From this work the EyeGotcha application was born, which, in addition to locating other mobile device users in an optimized way, also allows their actions to be monitored based on a set of pre-defined rules. These actions are reported to the monitoring entities automatically, in the form of alerts. With the commercialization of the application in mind, a business model is presented that makes it possible to obtain revenue capable of covering the maintenance costs of the services on which the operation of the mobile application depends.
Abstract:
Monitoring is a very important aspect to consider when developing real-time systems. However, it is also important to consider the impact of the monitoring mechanisms on the actual application. The use of Reflection can provide a clear separation between the real-time application and the implemented monitoring mechanisms, which can be introduced (reflected) into the underlying system without changing the application part of the code. Nevertheless, controlling the monitoring system itself is still a topic of research: the monitoring mechanisms must contain knowledge about “how to get the information out”. This paper therefore presents ongoing work to define a suitable strategy for monitoring real-time systems through the use of Reflection.
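A minimal sketch of the separation principle described above is given below: a timing monitor is attached to an application method at run time via reflective wrapping, so the application source is never modified. This is a simplified Python illustration, not the paper's real-time monitoring mechanism, and the class and function names are hypothetical.

# Hedged sketch: monitoring injected via reflection-style wrapping, keeping
# the application code free of any monitoring logic.

import time
import functools

class Controller:
    """Plain application code: knows nothing about monitoring."""
    def control_step(self, setpoint, measurement):
        return 0.5 * (setpoint - measurement)

def attach_timing_monitor(cls, method_name, log):
    """Reflectively wrap a method so each call's execution time is recorded."""
    original = getattr(cls, method_name)

    @functools.wraps(original)
    def monitored(self, *args, **kwargs):
        start = time.perf_counter()
        result = original(self, *args, **kwargs)
        log.append((method_name, time.perf_counter() - start))
        return result

    setattr(cls, method_name, monitored)

log = []
attach_timing_monitor(Controller, "control_step", log)   # monitoring "reflected" in

c = Controller()
c.control_step(10.0, 8.5)
print(log)   # e.g. [('control_step', 2.1e-06)]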
Abstract:
Wireless sensor networks (WSNs) have attracted growing interest in the last decade as an infrastructure to support a diversity of ubiquitous computing and cyber-physical systems. However, most research work has focused on protocols or on specific applications. As a result, there remains a clear lack of effective and usable WSN system architectures that address both functional and non-functional requirements in an integrated fashion. This poster outlines the EMMON system architecture for large-scale, dense, real-time embedded monitoring. It provides a hierarchical communication architecture together with integrated middleware and command-and-control software. It has been designed to maintain as much flexibility as possible while meeting specific application requirements. EMMON has been validated through extensive analytical, simulation and experimental evaluations, including a 300+ node test-bed, the largest single-site WSN test-bed in Europe.
Abstract:
Wireless sensor networks (WSNs) have attracted growing interest in the last decade as an infrastructure to support a diversity of ubiquitous computing and cyber-physical systems. However, most research work has focused on protocols or on specific applications. As a result, there remains a clear lack of effective, feasible and usable system architectures that address both functional and non-functional requirements in an integrated fashion. In this paper, we outline the EMMON system architecture for large-scale, dense, real-time embedded monitoring. EMMON provides a hierarchical communication architecture together with integrated middleware and command-and-control software. It has been designed to use standard commercially available technologies, while maintaining as much flexibility as possible to meet specific application requirements. The EMMON architecture has been validated through extensive simulation and experimental evaluation, including a 300+ node test-bed, which is, to the best of our knowledge, the largest single-site WSN test-bed in Europe to date.
Abstract:
Monitoring, object-orientation, real-time, execution-time, scheduling
Abstract:
Magdeburg, Univ., Fak. für Informatik, Diss., 2015
Abstract:
Monitoring of T-cell responses in the genital mucosa has remained a major challenge because of the absence of lymphoid aggregates and the low abundance of T cells. Here we have adapted to genital tissue a sensitive real-time reverse transcription-PCR (TaqMan) method to measure the induction of gamma interferon (IFN-gamma) mRNA transcription after 3 h of antigen-specific activation of CD8 T cells. For this purpose, we vaccinated C57BL/6 mice subcutaneously with human papillomavirus type 16 L1 virus-like particles and monitored the induction of CD8 T cells specific to the L1(165-173) H-2D(b)-restricted epitope. Comparison of the responses induced in peripheral blood mononuclear cells and lymph nodes (LN) by L1-specific IFN-gamma enzyme-linked immunospot assay and by TaqMan determination of the relative increase in L1-specific IFN-gamma mRNA induction, normalized to the content of CD8b mRNA, showed a significant correlation despite the difference in the readouts. Most of the cervicovaginal tissues could be analyzed by the TaqMan method if normalization to glyceraldehyde-3-phosphate dehydrogenase mRNA was used, and a significant L1-specific IFN-gamma induction was found in one-third of the immunized mice. This local response did not correlate with the immune responses measured in the periphery, with the exception of the sacral LN, an LN draining the genital mucosa, where a significant correlation was found. Our data show that the TaqMan method is sensitive enough to detect antigen-specific CD8 T-cell responses in the genital mucosa of individual mice, and this may contribute to the development of effective vaccines against genital pathogens.
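As a hedged illustration of such normalized relative quantification, the sketch below computes a fold induction from TaqMan Ct values using the widely used 2^(-ΔΔCt) model with a reference transcript (CD8b or GAPDH, as in the study). The Ct values are hypothetical, and the study may have used a different quantification model.

# Hedged sketch: antigen-specific fold induction of IFN-gamma mRNA from
# TaqMan Ct values via the common 2^(-ΔΔCt) model. Ct values are made up.

def fold_induction(ct_target_stim, ct_ref_stim, ct_target_ctrl, ct_ref_ctrl):
    """Relative induction of the target transcript in stimulated vs control cells."""
    delta_stim = ct_target_stim - ct_ref_stim     # ΔCt, stimulated sample
    delta_ctrl = ct_target_ctrl - ct_ref_ctrl     # ΔCt, unstimulated control
    return 2.0 ** -(delta_stim - delta_ctrl)      # 2^(-ΔΔCt)

# Example: IFN-gamma Ct drops by ~3 cycles after 3 h of peptide stimulation
print(fold_induction(ct_target_stim=26.0, ct_ref_stim=20.0,
                     ct_target_ctrl=29.0, ct_ref_ctrl=20.0))   # -> 8.0-fold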
Abstract:
A prospective cross-over study was performed in a general practice environment to assess and compare compliance data obtained by electronic monitoring on a BID or QD regimen in 113 patients with hypertension or angina pectoris. All patients were on a BID regimen (nifedipine SR) during the first month and switched to a QD regimen (amlodipine) for another month. Taking compliance (i.e. the proportion of days with correct dosing) improved in 30% of patients (95% confidence interval 19 to 41%, p < 0.001) when switching from the BID to the QD regimen, but at the same time there was a 15% increase (95% confidence interval 5 to 25%, p < 0.02) in the number of patients with one or more no-dosing days. About 8% of patients had a low compliance rate, irrespective of the dosage regimen. Actual dosage intervals were used to estimate the extent and timing of periods with unsatisfactory drug activity for various hypothetical durations of drug action, and it appears that the apparent advantage of the QD regimen in terms of compliance is clinically meaningful only when the duration of activity extends beyond the dosage interval in all patients.
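A minimal sketch of the compliance metrics defined above (taking compliance as the proportion of days with correct dosing, plus the count of no-dosing days), computed from electronic-monitoring timestamps, is shown below. The dosing records are synthetic and the study's exact definitions may differ in detail.

# Hedged sketch: taking compliance and no-dosing days from dosing records.
# Records are synthetic; the study's definitions may differ in detail.

from datetime import date, timedelta
from collections import Counter

def compliance_metrics(dosing_dates, prescribed_per_day, start, days):
    """Return (taking compliance, number of no-dosing days) over a period."""
    per_day = Counter(dosing_dates)
    correct_days = 0
    no_dosing_days = 0
    for i in range(days):
        d = start + timedelta(days=i)
        n = per_day.get(d, 0)
        if n == prescribed_per_day:
            correct_days += 1
        if n == 0:
            no_dosing_days += 1
    return correct_days / days, no_dosing_days

# One month on a QD regimen with two forgotten days (synthetic example)
start = date(2024, 1, 1)
doses = [start + timedelta(days=i) for i in range(30) if i not in (10, 20)]
print(compliance_metrics(doses, prescribed_per_day=1, start=start, days=30))  # (0.933..., 2)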