928 results for Discrete-time control
Abstract:
An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open-ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon and almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6500 profiles have been acquired for each radiometric channel. Because these radiometric data are collected outside the operator's control and regardless of meteorological conditions, specific and automatic data processing protocols have to be developed. Here, we present a data quality-control procedure aimed at verifying profile shapes and providing near real-time data distribution. This procedure is specifically developed to: 1) identify the main measurement issues (i.e. dark signal, atmospheric clouds, spikes and wave-focusing occurrences); 2) validate the final data with a hierarchy of tests to ensure their scientific usability. The procedure, adapted to each of the four radiometric channels, is designed to flag each profile in a way compliant with the data management procedure used by the Argo program. The main perturbations in the light field are identified by the new protocols with good performance over the whole dataset, which highlights their potential applicability at the global scale. Finally, comparison with modeled surface irradiances allows the accuracy of quality-controlled measured irradiance values to be assessed and any possible evolution over the float lifetime due to biofouling and instrumental drift to be identified.
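As a hedged illustration of the kind of automatic check such a procedure might apply, the sketch below flags isolated spikes in a downward-irradiance profile using a running-median test; the window size, threshold and data are assumptions chosen for clarity, not the tests actually used in the Argo quality-control procedure.

```python
import numpy as np

def flag_spikes(irradiance, window=5, n_mad=4.0):
    """Flag isolated spikes in a vertical irradiance profile.

    A point is flagged when it deviates from the running median of its
    neighbourhood by more than `n_mad` times the median absolute deviation.
    Window size and threshold are illustrative assumptions only.
    """
    x = np.asarray(irradiance, dtype=float)
    flags = np.zeros(x.size, dtype=bool)
    half = window // 2
    for i in range(x.size):
        lo, hi = max(0, i - half), min(x.size, i + half + 1)
        neigh = np.delete(x[lo:hi], i - lo)        # neighbourhood without the point itself
        med = np.median(neigh)
        mad = np.median(np.abs(neigh - med)) + 1e-12
        if abs(x[i] - med) > n_mad * mad:
            flags[i] = True
    return flags

# Hypothetical downwelling irradiance profile with one injected spike
depth = np.arange(0, 50, 2.0)
ed490 = 1.5 * np.exp(-0.05 * depth)
ed490[3] *= 3.0                                    # artificial wave-focusing-like spike
print(np.where(flag_spikes(ed490))[0])             # -> [3]
```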
Abstract:
BACKGROUND: The Philippines has a population of approximately 103 million people, of which 6.7 million live in schistosomiasis-endemic areas, with 1.8 million people at risk of infection with Schistosoma japonicum. Although the country-wide prevalence of schistosomiasis japonica in the Philippines is relatively low, the local prevalence can be high, approaching 65% in some endemic areas. Of the currently available microscopy-based diagnostic techniques for detecting schistosome infections in the Philippines and elsewhere, most exhibit varying diagnostic performance, with the Kato-Katz (KK) method having particularly poor sensitivity for detecting low-intensity infections. This suggests that the actual prevalence of schistosomiasis japonica may be much higher than previous reports have indicated.
METHODOLOGY/PRINCIPAL FINDINGS: Six barangays (villages) were selected to determine the prevalence of S. japonicum in humans in the municipality of Palapag, Northern Samar. Fecal samples were collected from 560 humans and examined by the KK method and a validated real-time PCR (qPCR) assay. A high S. japonicum prevalence (90.2%) was revealed by qPCR, whereas the KK method indicated a lower prevalence (22.9%). The geometric mean eggs per gram (GMEPG) was 36.5 as determined by qPCR and 11.5 by KK. These results, particularly those obtained by qPCR, indicate that the prevalence of schistosomiasis in this region of the Philippines is much higher than historically reported.
CONCLUSIONS/SIGNIFICANCE: Despite being more expensive, qPCR can complement the KK procedure, particularly for surveillance and monitoring of areas where extensive schistosomiasis control has led to low-prevalence, low-intensity infections and where schistosomiasis elimination is on the horizon, as, for example, in southern China.
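As a brief aside on the GMEPG statistic reported above, the snippet below shows how a geometric mean eggs-per-gram value can be computed; the egg counts are hypothetical and are not taken from the study.

```python
import numpy as np

def geometric_mean_epg(egg_counts_per_gram):
    """Geometric mean of positive eggs-per-gram counts.

    Computed as exp(mean(log(x))); zero counts are excluded here, which is
    one common (but not the only) convention for this statistic.
    """
    x = np.asarray(egg_counts_per_gram, dtype=float)
    x = x[x > 0]
    return float(np.exp(np.log(x).mean()))

# Hypothetical egg counts from a handful of stool samples
print(round(geometric_mean_epg([12, 48, 96, 24, 6]), 1))   # -> 24.0
```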
Abstract:
As the development of a viable quantum computer nears, existing widely used public-key cryptosystems, such as RSA, will no longer be secure. Thus, significant effort is being invested in post-quantum cryptography (PQC). Lattice-based cryptography (LBC) is one such promising area of PQC, offering versatile, efficient, and high-performance security services. However, the vulnerabilities of these implementations to side-channel attacks (SCA) remain significantly understudied. Most, if not all, lattice-based cryptosystems require noise samples generated from a discrete Gaussian distribution, and a successful timing-analysis attack can break the whole cryptosystem, making the discrete Gaussian sampler the module most vulnerable to SCA. This research proposes countermeasures against timing information leakage with FPGA-based designs of CDT-based discrete Gaussian samplers with constant response time, targeting encryption and signature scheme parameters. The proposed designs are compared against the state of the art and are shown to significantly outperform existing implementations. For encryption, the proposed sampler is 9x faster than the only other existing time-independent CDT sampler design. For signatures, the first time-independent CDT sampler in hardware is proposed.
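To illustrate the general idea of a CDT (cumulative distribution table) sampler with a data-independent access pattern, here is a minimal software sketch: the Gaussian parameter, table precision and full-table scan are assumptions chosen for clarity, not the FPGA design proposed in the work, and a real constant-time implementation would also need constant-time comparisons and careful randomness handling.

```python
import math
import secrets

def build_cdt(sigma=3.33, tail_cut=9, precision_bits=32):
    """Cumulative distribution table for a one-sided discrete Gaussian,
    scaled to `precision_bits`-bit fixed-point integers (illustrative parameters)."""
    bound = int(math.ceil(tail_cut * sigma))
    probs = [math.exp(-(x * x) / (2.0 * sigma * sigma)) for x in range(bound + 1)]
    total = sum(probs)
    scale = 1 << precision_bits
    cdt, acc = [], 0.0
    for p in probs:
        acc += p / total
        cdt.append(int(acc * scale))
    cdt[-1] = scale            # ensure the final entry covers the whole range
    return cdt, precision_bits

def sample_cdt_full_scan(cdt, precision_bits):
    """Sample by scanning the whole table (no early exit), so the number of
    table accesses does not depend on the secret sample value."""
    u = secrets.randbits(precision_bits)
    index = 0
    for entry in cdt:
        index += int(u >= entry)   # branch-free accumulation of the inverse-CDF index
    # Note: flipping a random sign on a half-Gaussian sample slightly
    # over-weights zero; real samplers correct for this.
    sign = -1 if secrets.randbits(1) else 1
    return sign * index

cdt, bits = build_cdt()
print([sample_cdt_full_scan(cdt, bits) for _ in range(5)])   # e.g. [1, -3, 0, 2, -1]
```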
Abstract:
With the construction of operational oceanography systems, the need for real-time data has become increasingly important. Much work has been done in the past, within National Oceanographic Data Centres (NODC) and the International Oceanographic Data and Information Exchange (IODE), to standardise delayed-mode quality control procedures. For quality control procedures applicable in real time (within hours to at most a week from acquisition), which must therefore be automatic, some recommendations were established for physical parameters, but mainly within individual projects and without consolidation with other initiatives. During the past ten years the EuroGOOS community has been working on such procedures within international programs such as Argo, OceanSITES or GOSUD, and within EC projects such as Mersea, MFSTEP, FerryBox, ECOOP, and MyOcean. In collaboration with the FP7 SeaDataNet project, which is standardizing delayed-mode quality control procedures in NODCs, and the MyOcean GMES FP7 project, which is standardizing near real-time quality control procedures for operational oceanography, the DATA-MEQ working group decided to put together this document to summarize the recommendations for near real-time QC procedures that it judged mature enough to be advertised and recommended to EuroGOOS.
Abstract:
Automotive producers are aiming to make their order fulfilment processes more flexible. Opening the pipeline of planned products for dynamic allocation to dealers/customers is a significant step towards greater flexibility, but the behaviour of such Virtual-Build-To-Order systems is complex to predict and their performance varies significantly as product variety levels change. This study investigates the potential for intelligent control of the pipeline feed, taking into account the current status of inventory (level and mix), the volume and mix of unsold products in the planning pipeline, and the demand profile. Five 'intelligent' methods for selecting the next product to be planned into the production pipeline are analysed using a discrete-event simulation model and compared to an unintelligent random feed. The methods are tested under two conditions: first, when customers must be fulfilled with the exact product they request, and second, when customers accept a compromise in specification in return for a shorter waiting time. The two forms of customer behaviour have a substantial impact on the performance of the methods, and there are also significant differences between the methods themselves. When the producer has an accurate model of customer demand, methods that attempt to harmonise the mix in the system to the demand distribution are superior.
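As a hedged illustration of a "harmonising" feed rule of the kind described, the sketch below picks the next product variant so as to bring the mix of unsold units in the system closest to a target demand distribution; the variant names, demand shares and distance measure are assumptions for illustration, not the rules evaluated in the study.

```python
from collections import Counter

def next_variant(demand_share, unsold_counts):
    """Choose the variant whose addition minimises the total absolute
    deviation between the resulting system mix and the demand share."""
    def deviation(counts):
        total = sum(counts.values())
        return sum(abs(counts.get(v, 0) / total - share)
                   for v, share in demand_share.items())

    best_variant, best_dev = None, float("inf")
    for variant in demand_share:
        trial = Counter(unsold_counts)
        trial[variant] += 1
        dev = deviation(trial)
        if dev < best_dev:
            best_variant, best_dev = variant, dev
    return best_variant

# Hypothetical demand shares and current unsold stock/pipeline mix
demand = {"base": 0.5, "sport": 0.3, "luxury": 0.2}
unsold = Counter({"base": 12, "sport": 3, "luxury": 5})
print(next_variant(demand, unsold))   # -> 'sport' (the most under-represented variant)
```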
Abstract:
This paper presents a new tuning methodology for the main controller of an internal model control structure for n×n stable multivariable processes with multiple time delays, based on the centralized inverted decoupling structure. Independently of the system size, very simple general expressions for the controller elements are obtained. The realizability conditions are provided and the specification of the closed-loop requirements is explained. A diagonal filter is added to the proposed control structure in order to improve disturbance rejection without modifying the nominal set-point response. The effectiveness of the method is illustrated through different simulation examples in comparison with other works.
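For context on the internal model control (IMC) structure referred to above, a standard textbook formulation is sketched below; this is generic IMC notation, not the specific inverted-decoupling expressions derived in the paper.

```latex
% Generic IMC relations (textbook form, not the paper's specific design):
% G(s)  : n x n process transfer matrix,  G_m(s) : its model,
% Q(s)  : IMC (main) controller,          F(s)   : diagonal low-pass filter.
\begin{align}
  u(s) &= Q(s)\,\bigl[r(s) - \bigl(y(s) - G_m(s)\,u(s)\bigr)\bigr], \\
  C(s) &= Q(s)\,\bigl[I - G_m(s)\,Q(s)\bigr]^{-1}
          \quad \text{(equivalent classical feedback controller)}, \\
  Q(s) &= G_{m-}(s)^{-1}\,F(s), \qquad G_m = G_{m-}\,G_{m+},
\end{align}
where $G_{m+}(s)$ collects the time delays and non-minimum-phase elements that
cannot be inverted, and $F(s)$ sets the closed-loop speed and robustness.
```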
H-infinity control design for time-delay linear systems: a rational transfer function based approach
Abstract:
The aim of this paper is to present new results on H-infinity control synthesis for time-delay linear systems. We extend the use of a finite-order LTI system, called the comparison system, to H-infinity analysis and design. In contrast to what can be viewed as a common feature of other control design methods available in the literature to date, the one presented here treats time-delay control design with classical numeric routines based on the Riccati equations arising from H-infinity theory. The proposed algorithm is simple, efficient and easy to implement. Some examples illustrating state- and output-feedback design are solved and discussed in order to highlight the most relevant characteristics of the theoretical results. Moreover, a practical application involving a 3-DOF networked control system is presented.
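As background on the synthesis objective mentioned above, the block below states the standard H-infinity criterion for a delayed linear system in generic notation; the state-space form and symbols are textbook conventions, not the comparison-system construction developed in the paper.

```latex
% Generic H-infinity synthesis objective for a time-delay linear system
% (standard notation, not the paper's comparison-system formulation):
\begin{align}
  \dot{x}(t) &= A\,x(t) + A_d\,x(t-\tau) + B_w\,w(t) + B_u\,u(t), \\
  z(t)       &= C_z\,x(t) + D_{zw}\,w(t) + D_{zu}\,u(t),
\end{align}
and a controller $u = K(s)\,y$ is sought such that the closed loop is
internally stable and
\begin{equation}
  \lVert T_{zw} \rVert_\infty
  \;=\; \sup_{\omega \in \mathbb{R}} \bar{\sigma}\bigl(T_{zw}(j\omega)\bigr)
  \;<\; \gamma ,
\end{equation}
i.e.\ the worst-case gain from the disturbance $w$ to the performance
output $z$ is kept below a prescribed level $\gamma$.
```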
Abstract:
In April 2017, CMEMS plans to launch the WAVES NRT products. This document focuses on the automatic RTQC of the collected wave data. The validation procedure includes the delayed-mode quality control of the data and will be specified in another guideline. To perform any kind of quality control on wave data, it is first necessary to understand the nature of the measurements and the analysis performed on those measurements to obtain the wave parameters. For that reason, the next chapter presents the usual wave analysis and the different parameters and estimators obtained.
Abstract:
When it comes to information sets in real life, pieces of the whole set are often unavailable. This problem can arise for various reasons and therefore presents different patterns. In the literature, this problem is known as Missing Data. The issue can be addressed in various ways, from discarding incomplete observations, to estimating what the missing values originally were, to simply ignoring the fact that some values are missing. The methods used to estimate missing data are called Imputation Methods. The work presented in this thesis has two main goals. The first is to determine whether any interactions exist between Missing Data, Imputation Methods and Supervised Classification algorithms when they are applied together. For this first problem we consider a scenario in which the databases used are discrete, where discrete means that no relation between observations is assumed. These datasets underwent processes involving different combinations of the three components mentioned. The outcome showed that the missing-data pattern strongly influences the results produced by a classifier. Also, in some of the cases, the complex imputation techniques investigated in the thesis were able to obtain better results than simple ones. The second goal of this work is to propose a new imputation strategy, but this time we constrain the specifications of the previous problem to a special kind of dataset, the multivariate Time Series. We designed new imputation techniques for this particular domain and combined them with some of the contrasted strategies tested in the previous chapter of this thesis. The time series were also subjected to processes involving missing data and imputation in order to finally propose an overall better imputation method. In the final chapter of this work, a real-world example is presented, describing a water quality prediction problem. The databases that characterized this problem had their own original missing values, which provides a real-world benchmark to test the algorithms developed in this thesis.
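As a hedged, minimal illustration of the contrast between simple and time-series-aware imputation described above, the sketch below fills gaps in a small multivariate series with column means and, alternatively, with linear interpolation along time; the data and the two specific methods are assumptions chosen for clarity, not the techniques designed in the thesis.

```python
import numpy as np

def mean_impute(x):
    """Replace NaNs in each column with that column's mean (ignores time order)."""
    out = x.copy()
    col_means = np.nanmean(out, axis=0)
    idx = np.where(np.isnan(out))
    out[idx] = np.take(col_means, idx[1])
    return out

def interpolate_impute(x):
    """Replace NaNs in each column by linear interpolation along the time axis."""
    out = x.copy()
    t = np.arange(out.shape[0])
    for j in range(out.shape[1]):
        mask = np.isnan(out[:, j])
        out[mask, j] = np.interp(t[mask], t[~mask], out[~mask, j])
    return out

# Hypothetical multivariate time series with two missing entries (NaN)
series = np.array([[1.0, 10.0],
                   [2.0, np.nan],
                   [np.nan, 14.0],
                   [4.0, 16.0]])
print(mean_impute(series))          # gaps filled with the column means (2.33 and 13.33)
print(interpolate_impute(series))   # gap in column 0 becomes 3.0, in column 1 becomes 12.0
```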
Abstract:
Objectives: To examine temporal trends, study-design determinants, and the quality of reported response rates in case-control studies on cancer published over the last 30 years. Methods: A review of case-control studies on cancer was conducted. The inclusion criteria were publication (i) in one of 15 targeted major journals and (ii) during one of four publication periods (1984-1986, 1995, 2005 and 2013) covering three decades. 370 studies were selected and examined. The methodology related to subject recruitment and data collection, the characteristics of the population, participation rates, and reasons for non-participation were extracted from these studies. Descriptive statistics were used to summarize the quality of the reported response rates (based on the amount of information available), temporal trends and the determinants of response rates; linear regression models were used to analyse temporal trends and the determinants of participation rates. Results: Overall, the quality of the reported response rates and of the reasons for non-participation was very low, particularly for controls. Participation has declined over the past 30 years, and this decline is more pronounced in studies conducted after 2000. When comparing response rates in recent studies with those of studies conducted during 1971 to 1980, there is a larger decline among population-based controls (-17.04%, 95% CI: -23.17%, -10.91%) than among cases (-5.99%, 95% CI: -11.50%, -0.48%). The statistically significant determinants of the response rate among cases were: the type of cancer examined, the geographic location of the study population, and the mode of data collection. The only statistically significant determinant of the response rate among hospital-based controls was their geographic location. The only statistically significant determinant of the participation rate among population-based controls was the type of respondent (the subject alone or accompanied by a third party). Conclusion: The participation rate in case-control studies on cancer appears to have declined over the past 30 years, and this decline appears more pronounced in recent studies. In order to assess the true level of non-participation and its determinants, as well as the impact of non-participation on study validity, published studies need to use a standardized approach to calculate their participation rates and to report them transparently.
Abstract:
In this report, we develop an intelligent adaptive neuro-fuzzy controller using adaptive neuro-fuzzy inference system (ANFIS) techniques. We start with a standard proportional-derivative (PD) controller and use the PD controller data to train the ANFIS system to develop a fuzzy controller. We then propose and validate a method to implement this control strategy on commercial off-the-shelf (COTS) hardware. An analysis is made of the choice of filters for attitude estimation. These choices are limited by the complexity of the filter and by the computing power and memory constraints of the microcontroller. Simplified Kalman filters are found to be good at estimating attitude under the above constraints. Using model-based design techniques, the models are implemented on an embedded system. This enables the deployment of fuzzy controllers on enthusiast-grade controllers. We evaluate the feasibility of the proposed control strategy in a model-in-the-loop simulation. We then propose a rapid prototyping strategy, allowing us to deploy these control algorithms on a system consisting of a combination of an ARM-based microcontroller and two Arduino-based controllers. We then use the code generation capabilities within MATLAB/Simulink together with multiple open-source projects in order to deploy code to an ARM Cortex-M4 based controller board. We also evaluate this strategy on an ARM-A8 based board and on a much less powerful Arduino-based flight controller. We conclude by demonstrating the feasibility of fuzzy controllers on COTS hardware; we also point out the limitations of the current hardware and make suggestions for hardware that we think would be better suited for memory-heavy controllers.
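To illustrate the first step described above, the sketch below generates input/output training pairs from a PD control law that an ANFIS-style fuzzy model could then be fitted to; the gains, signal ranges and grid are illustrative assumptions, not the values used in the report.

```python
import numpy as np

def pd_controller(error, error_rate, kp=2.0, kd=0.5):
    """Standard PD law: u = Kp * e + Kd * de/dt (illustrative gains)."""
    return kp * error + kd * error_rate

# Sample the PD law over a grid of (error, error_rate) pairs to build a
# training set; a neuro-fuzzy model would be trained to reproduce u from (e, de/dt).
errors = np.linspace(-1.0, 1.0, 21)          # e.g. attitude error in radians
error_rates = np.linspace(-2.0, 2.0, 21)     # e.g. angular-rate error in rad/s
E, ER = np.meshgrid(errors, error_rates)
X = np.column_stack([E.ravel(), ER.ravel()])  # inputs  (e, de/dt)
y = pd_controller(X[:, 0], X[:, 1])           # targets (PD command)
print(X.shape, y.shape)   # -> (441, 2) (441,)
```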