794 results for Estimations


Relevance:

10.00%

Publisher:

Abstract:

A new control scheme is presented in this thesis. Based on the NonLinear Geometric Approach (NLGA), the proposed Active Control System represents a new way of looking at reconfigurable controllers for aerospace applications. The presence of the Diagnosis module (providing the estimation of generic signals which, depending on the case, can be faults, disturbances or system parameters), the main feature of the proposed Active Control System, is a characteristic shared by three well-known control schemes: Active Fault Tolerant Control, Indirect Adaptive Control and Active Disturbance Rejection Control. The standard NLGA has been thoroughly investigated and then improved to extend its applicability to more complex models. The standard NLGA procedure has been modified to take into account feasible and estimable sets of unknown signals. Furthermore, the application of the Singular Perturbations approximation has led to the solution of Detection and Isolation problems in scenarios too complex to be solved by the standard NLGA. The estimation process has also been improved, where multiple redundant measurements are available, by the introduction of a new algorithm, here called "Least Squares - Sliding Mode". It guarantees optimality, in the least-squares sense, and finite estimation time, in the sliding-mode sense. The Active Control System concept has been formalized in two controllers: a nonlinear backstepping controller and a nonlinear composite controller. Particularly interesting is the integration, in the controller design, of the estimations provided by the Diagnosis module. Stability proofs are provided for both control schemes. Finally, several aerospace applications are presented to show the applicability and effectiveness of the proposed NLGA-based Active Control System.
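As a rough, purely conceptual illustration of the two ingredients named in the abstract, the sketch below fuses redundant measurements of a constant quantity by least squares and then drives an estimate to that value with a sliding-mode (signum) update. It is not the thesis's "Least Squares - Sliding Mode" algorithm, and every number in it is made up.

```python
# Toy illustration: least-squares fusion of redundant sensors + a
# sliding-mode-style finite-time update toward that value.
import numpy as np

rng = np.random.default_rng(0)
true_value = 2.0
measurements = true_value + 0.05 * rng.standard_normal(6)   # redundant sensors
ls_estimate = measurements.mean()       # least-squares solution for a constant

# Sliding-mode-style dynamics: d(xhat)/dt = k * sign(ls_estimate - xhat)
xhat, k, dt = 0.0, 1.0, 1e-3
for _ in range(5000):
    xhat += dt * k * np.sign(ls_estimate - xhat)
# xhat reaches ls_estimate after roughly |x0 - ls_estimate| / k seconds
# (the finite-time property), then chatters within about k*dt of it.
print(ls_estimate, xhat)
```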

Relevance:

10.00%

Publisher:

Abstract:

This study presents geo-scientific evidence for Holocene tsunami impact along the shores of the Eastern Ionian Sea. Cefalonia Island, the Gulf of Kyparissia and the Gialova Lagoon were the subject of detailed geo-scientific investigations. It is well known that the coasts of the eastern Mediterranean were hit by destructive tsunamis in the past. The seismically highly active Hellenic Trench is considered the most significant tsunami source in the Eastern Ionian Sea. This study focuses on the detection and reconstruction of sedimentary signatures of palaeotsunami events and their influence on the Holocene palaeogeographical evolution. The results from fine-grained near-coast geo-archives are discussed and interpreted in detail to differentiate between tsunami, storm and sea-level highstands as sedimentation processes.
A multi-method approach was applied, using geomorphological, sedimentological, geochemical, geophysical and microfaunal analyses to detect Holocene tsunamigenic impact. Chronological data were based on radiocarbon dating and archaeological age estimations to reconstruct local geo-chronostratigraphies and to correlate them on supra-regional scales.
Distinct sedimentary signatures of five generations of tsunami impact were found along the coasts of Cefalonia in the Livadi coastal plain. The results show that the overall coastal evolution was influenced by tsunamigenic impact that occurred around 5700 cal BC (I), 4250 cal BC (II), at the beginning of the 2nd millennium cal BC (III), in the 1st millennium cal BC (IV) and after 780 cal AD (V). Sea-level reconstructions and the palaeogeographical evolution show that the local Holocene sea level has never been higher than at present.
At the former Mouria Lagoon along the Gulf of Kyparissia, four allochthonous layers of tsunamigenic origin were identified. The stratigraphical record and palaeogeographical reconstructions show that major environmental coastal changes were linked to these extreme events. At the southern end of the Agoulenitsa Lagoon, at modern Kato Samikon, high-energy traces were found more than 2 km inland and up to 9 m above present sea level. The geo-chronological framework indicates tsunami landfall in the 5th millennium cal BC (I), the mid to late 2nd millennium BC (II), Roman times (1st cent. BC to early 4th cent. AD) (III) and, most probably, one of the historically well-known 365 AD or 521/551 AD tsunamis (IV).
Coarse-grained allochthonous sediments of marine origin were found intersecting the quiescent muddy deposits of the Gialova Lagoon on the southwestern Peloponnese. Radiocarbon dating suggests six generations of major tsunami impact, dated to around 3300 cal BC (I), around the end of the 4th and the beginning of the 3rd millennium BC (II), after around 1100 cal BC (III), after the 4th to 2nd cent. BC (IV), between the 8th and early 15th cent. AD (V) and between the mid 14th and the beginning of the 15th cent. AD (VI). Palaeogeographical and morphological characteristics in the environs of the Gialova Lagoon were controlled by high-energy influence.
Sedimentary findings in all study areas are in good accordance with traces of tsunami events found all over the Ionian Sea. The geo-chronological data correlate well with findings from coastal Akarnania, the western Peloponnese, southern Italy and the Aegean. The supra-regional influence of tsunamigenic impact is significant for the investigated sites. The palaeogeographical evolution and palaeo-geomorphological setting of each study area were strongly affected by tsunamigenic impact.
The selected geo-archives represent extraordinary sediment traps for the reconstruction of Holocene coastal evolution. Our results therefore give new insight into the exceptionally high tsunami risk in the eastern Mediterranean and emphasize that the overall tsunami hazard is underestimated.

Relevance:

10.00%

Publisher:

Abstract:

Efficient coupling of light to quantum emitters, such as atoms, molecules or quantum dots, is one of the great challenges in current research. The interaction can be strongly enhanced by coupling the emitter to the evanescent field of subwavelength dielectric waveguides that offer strong lateral confinement of the guided light. In this context, subwavelength-diameter optical nanofibers as part of a tapered optical fiber (TOF) have proven to be powerful tools, which also provide an efficient transfer of the light from the interaction region to an optical bus, that is to say, from the nanofiber to an optical fiber.
Another approach towards enhancing light–matter interaction is to employ an optical resonator in which the light circulates and thus passes the emitters many times. Here, both approaches are combined by experimentally realizing a microresonator with an integrated nanofiber waist. This is achieved by building a fiber-integrated Fabry-Pérot type resonator from two fiber Bragg grating mirrors with a stop band near the cesium D2-line wavelength. The characteristics of this resonator fulfill the requirements of nonlinear optics, optical sensing, and cavity quantum electrodynamics in the strong-coupling regime. Together with its advantageous features, such as a constant high coupling strength over a large volume, tunability, high transmission outside the mirror stop band, and a monolithic design, this resonator is a promising tool for experiments with nanofiber-coupled atomic ensembles in the strong-coupling regime.
The resonator's high sensitivity to the optical properties of the nanofiber provides a probe for changes of physical parameters that affect the guided optical mode, e.g., the temperature via the thermo-optic effect of silica. Utilizing this detection scheme, the thermalization dynamics due to far-field heat radiation of a nanofiber are studied over a large temperature range. This investigation provides, for the first time, a measurement of the total radiated power of an object with a diameter smaller than all absorption lengths in the thermal spectrum, at the level of a single object of deterministic shape and material. The results show excellent agreement with an ab initio thermodynamic model that considers heat radiation as a volumetric effect and that takes the emitter shape and size relative to the emission wavelength into account. Modeling and investigating the thermalization of microscopic objects with arbitrary shape from first principles is of fundamental interest and has important applications, such as heat management in nano-devices or radiative forcing of aerosols in Earth's climate system.
Using a similar method, the effect of the TOF's mechanical modes on the polarization and phase of the fiber-guided light is studied. The measurement results show that in typical TOFs these quantities exhibit high-frequency thermal fluctuations. They originate from high-Q torsional oscillations that couple to the nanofiber-guided light via the strain-optic effect. An ab initio opto-mechanical model of the TOF is developed that provides an accurate quantitative prediction of the mode spectrum and the mechanically induced polarization and phase fluctuations. These high-frequency fluctuations may limit the ultimate ideality of fiber coupling into photonic structures. Furthermore, first estimations show that they may currently limit the storage time of nanofiber-based atom traps. The model, on the other hand, provides a method to design TOFs with tailored mechanical properties in order to meet experimental requirements.
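A very rough sketch of what "thermalization dynamics" means in practice: a lumped object at temperature T relaxing radiatively toward its environment, integrated with Euler steps. The simple T^4 power law and every constant below are placeholders; the thesis's volumetric, wavelength-dependent radiation model is not reproduced here.

```python
# Generic radiative thermalization of a lumped object (placeholder constants).
import numpy as np

def thermalize(T0, T_env=300.0, prefactor=1e-13, heat_capacity=1e-6,
               dt=1e-4, steps=50000):
    """Return the temperature trajectory (K) of a lumped object cooling radiatively."""
    T = np.empty(steps)
    T[0] = T0
    for i in range(1, steps):
        P = prefactor * (T[i - 1] ** 4 - T_env ** 4)   # net radiated power (W), toy law
        T[i] = T[i - 1] - dt * P / heat_capacity
    return T

trajectory = thermalize(T0=800.0)
print(trajectory[0], trajectory[-1])                   # relaxes toward 300 K
```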

Relevance:

10.00%

Publisher:

Abstract:

Oceans are key sources and sinks in the global budgets of significant atmospheric trace gases, termed Volatile Organic Compounds (VOCs). Despite their low concentrations, these species play an important role in the atmosphere, influencing ozone photochemistry and aerosol physics. Surprisingly little work has been done on assessing their emissions or their transport mechanisms and rates between ocean and atmosphere, all of which are important for modelling the atmosphere accurately.
A new Needle Trap Device (NTD) - GC-MS method was developed for the effective sampling and analysis of VOCs in seawater. Good repeatability (RSDs <16 %), linearity (R2 = 0.96 - 0.99) and limits of detection in the picomolar range were obtained for DMS, isoprene, benzene, toluene, p-xylene, (+)-α-pinene and (-)-α-pinene. Laboratory evaluation and subsequent field application indicated that the proposed method can be used successfully in place of the more commonly applied extraction techniques (P&T, SPME) to extend the suite of species typically measured in the ocean and to improve detection limits.
During a mesocosm CO2 enrichment study, DMS, isoprene and α-pinene were identified and quantified in seawater samples using the above-mentioned method. Based on correlations with available biological datasets, the effects of ocean acidification as well as possible ocean biological sources were investigated for all examined compounds. A more acidic future ocean was shown to decrease oceanic DMS production, possibly impact isoprene emissions, but not affect the production of α-pinene.
In a separate activity, ocean - atmosphere interactions were simulated in a large-scale wind-wave canal facility in order to investigate the gas exchange process and its controlling mechanisms. Air-water exchange rates of 14 chemical species (of which 11 VOCs), spanning a wide range of solubility (dimensionless solubility, α = 0.4 to 5470) and diffusivity (Schmidt number in water, Scw = 594 to 1194), were obtained under various turbulent (wind speed at ten meters height, u10 = 0.8 to 15 m s-1) and surfactant-modulated (two different-sized Triton X-100 layers) surface conditions. Reliable and reproducible total gas transfer velocities were obtained, and the derived values and trends were comparable to previous investigations. Through this study, a much better and more comprehensive understanding of the gas exchange process was achieved. The role of the friction velocity, uw*, and the mean square slope, σs2, in defining phenomena such as waves and wave breaking, near-surface turbulence, bubbles and surface films was recognized as very significant. uw* was identified as the ideal turbulence parameter, while σs2 best described the related surface conditions. A combination of the uw* and σs2 variables was found to reproduce the air-water gas exchange process faithfully.
A Total Transfer Velocity (TTV) model, derived from a compilation of 14 tracers and a combination of the uw* and σs2 parameters, is proposed for the first time. Through the proposed TTV parameterization, a new physical perspective is presented which provides an accurate TTV for any tracer within the examined solubility range.
The development of such a comprehensive air-sea gas exchange parameterization represents a highly useful tool for regional and global models, providing accurate total transfer velocity estimations for any tracer and any sea-surface state, simplifying the calculation process and eliminating the calculation uncertainty connected with the selection or combination of different parameterizations.
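To make the solubility dependence concrete, the sketch below uses the classic two-layer resistance picture (1/k_tot = 1/k_w + α/k_a, referenced to the water side). This is a generic textbook form, not the uw*/σs2 parameterization proposed in the thesis, and the single-phase transfer velocities k_w and k_a are hypothetical constants.

```python
# Two-layer resistance sketch: why one parameterization must cover both
# water-side controlled (low alpha) and air-side controlled (high alpha) tracers.
def total_transfer_velocity(alpha, k_w=7e-5, k_a=8e-3):
    """Water-side referenced total transfer velocity in m/s.
    alpha: dimensionless solubility; k_w, k_a: hypothetical single-phase velocities."""
    return 1.0 / (1.0 / k_w + alpha / k_a)

for alpha in [0.4, 15.0, 5470.0]:        # spans the solubility range quoted above
    print(alpha, total_transfer_velocity(alpha))
# Low-solubility tracers stay close to k_w; highly soluble tracers are limited
# by the air-side term alpha / k_a.
```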

Relevance:

10.00%

Publisher:

Abstract:

The present work belongs to the PRANA project, the first extensive field campaign observing atmospheric emission spectra covering the far-infrared (FIR) spectral region for more than two years. The principal deployed instrument is REFIR-PAD, a Fourier transform spectrometer used here to study Antarctic cloud properties. A dataset covering the whole of 2013 has been analysed. First, a selection of good-quality spectra is performed, using radiance values in a few chosen spectral regions as thresholds. These spectra are then described in a compact way by averaging radiances in selected intervals, converting them into brightness temperatures (BTs) and finally taking the differences between each pair of them. A supervised feature selection algorithm is implemented to select the features that are truly informative about the presence, the phase and the type of cloud. Training and test sets are then collected by means of lidar quick-looks. The supervised classification of the overall monthly datasets is performed using a support vector machine (SVM). On the basis of this classification, and with the help of the lidar observations, 29 non-precipitating ice cloud case studies are selected. A single spectrum, or at most an average over two or three spectra, is processed with the retrieval algorithm RT-RET, exploiting the main IR window channels, in order to extract cloud properties. Retrieved effective radii and optical depths are analysed to compare them with literature studies and to evaluate possible seasonal trends. Finally, the atmospheric profiles output by the retrieval are used as inputs for simulations, assuming two different crystal habits, with the aim of examining our ability to reproduce radiances in the FIR. Substantial mis-estimations are found for the FIR micro-windows: a high variability is observed in the spectral pattern of the deviations of the simulations from the measured spectra, and an effort has been made to link these deviations to cloud parameters.
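The classification step described above (BT-difference features fed to an SVM) can be sketched as follows; the two features and the synthetic "clear" and "ice cloud" populations are invented stand-ins for the real REFIR-PAD BT differences and lidar-derived labels.

```python
# Minimal SVM classification sketch on synthetic BT-difference features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n = 400
# Pretend "BT differences" for clear sky vs ice cloud (two hypothetical features).
clear = rng.normal(loc=[5.0, -2.0], scale=1.5, size=(n, 2))
cloud = rng.normal(loc=[1.0,  3.0], scale=1.5, size=(n, 2))
X = np.vstack([clear, cloud])
y = np.array([0] * n + [1] * n)          # 0 = clear, 1 = ice cloud

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```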

Relevance:

10.00%

Publisher:

Abstract:

Background: Body mass index (BMI) is a risk factor for endometrial cancer. We quantified the risk and investigated whether the association differed by use of hormone replacement therapy (HRT), menopausal status, and histologic type. Methods: We searched MEDLINE and EMBASE (1966 to December 2009) to identify prospective studies of BMI and incident endometrial cancer. We did random-effects meta-analyses, meta-regressions, and generalized least squares regressions for trend estimations assuming linear, and piecewise linear, relationships. Results: Twenty-four studies (17,710 cases) were analyzed; 9 studies contributed to analyses by HRT, menopausal status, or histologic type, all published since 2003. In the linear model, the overall risk ratio (RR) per 5 kg/m2 increase in BMI was 1.60 (95% CI, 1.52–1.68), P < 0.0001. In the piecewise model, RRs compared with a normal BMI were 1.22 (1.19–1.24), 2.09 (1.94–2.26), 4.36 (3.75–5.10), and 9.11 (7.26–11.51) for BMIs of 27, 32, 37, and 42 kg/m2, respectively. The association was stronger in never HRT users than in ever users: RRs were 1.90 (1.57–2.31) and 1.18 (95% CI, 1.06–1.31), with P for interaction = 0.003. In the piecewise model, the RR in never users was 20.70 (8.28–51.84) at BMI 42 kg/m2, compared with never users at normal BMI. The association was not affected by menopausal status (P = 0.34) or histologic type (P = 0.26). Conclusions: HRT use modifies the BMI-endometrial cancer risk association. Impact: These findings support the hypothesis that hyperestrogenism is an important mechanism underlying the BMI-endometrial cancer association, whilst the presence of residual risk in HRT users points to the role of additional systems. Cancer Epidemiol Biomarkers Prev; 19(12); 3119–30.
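A quick worked comparison of the two trend models quoted above; the reference BMI of 22 kg/m2 is an assumption made only for this illustration (the paper simply uses its "normal BMI" category as the reference).

```python
# Linear (constant RR per 5 kg/m2) vs. piecewise estimates from the abstract.
rr_per_5 = 1.60
reference_bmi = 22.0                                    # assumed for illustration
piecewise = {27: 1.22, 32: 2.09, 37: 4.36, 42: 9.11}    # RRs quoted above

for bmi, rr_pw in piecewise.items():
    rr_lin = rr_per_5 ** ((bmi - reference_bmi) / 5.0)
    print(f"BMI {bmi}: linear model RR = {rr_lin:.2f}, piecewise RR = {rr_pw}")
# The piecewise fit rises faster than the constant-per-increment model at high
# BMI (9.11 vs. about 6.55 at BMI 42), which is why both trends are reported.
```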

Relevance:

10.00%

Publisher:

Abstract:

Concerns about possible reactions to vaccines or vaccinations are frequently raised. However, the rate of reported vaccine-induced adverse events is low and ranges between 4.8 and 83.0 per 100,000 doses of the most frequently used vaccines. The number of true allergic reactions to routine vaccines is not known; estimations range from 1 per 500,000 to 1 per 1,000,000 doses for most vaccines. When allergens such as gelatine or egg proteins are components of the formulation, the rate of serious allergic reactions may be higher. Nevertheless, anaphylactic, potentially life-threatening reactions to vaccines remain a rare event (approximately 1 per 1,500,000 doses). The variety of reported vaccine-related adverse events is broad. Most frequently, reactions to vaccines are limited to the injection site and result from a nonspecific activation of the inflammatory system by, for example, aluminium salts or the active microbial components. If allergy is suspected, an accurate examination followed by diagnostic algorithms is key to correct diagnosis, treatment and the decision regarding revaccination in patients with immediate-type reactions to vaccines.

Relevance:

10.00%

Publisher:

Abstract:

This paper aims at the development and evaluation of a personalized insulin infusion advisory system (IIAS), able to provide real-time estimations of the appropriate insulin infusion rate for type 1 diabetes mellitus (T1DM) patients using continuous glucose monitors and insulin pumps. The system is based on a nonlinear model-predictive controller (NMPC) that uses a personalized glucose-insulin metabolism model, consisting of two compartmental models and a recurrent neural network. The model takes as input the patient's information regarding meal intake, glucose measurements, and insulin infusion rates, and provides glucose predictions. The predictions are fed to the NMPC, in order for the latter to estimate the optimum insulin infusion rates. An algorithm based on fuzzy logic has been developed for the on-line adaptation of the NMPC control parameters. The IIAS has been evaluated in silico using an appropriate simulation environment (UVa T1DM simulator). The IIAS was able to handle various meal profiles, fasting conditions, interpatient variability, intraday variation in physiological parameters, and errors in meal amount estimations.
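The receding-horizon idea behind such an NMPC can be sketched as below; the one-state "glucose model", its coefficients, the horizon and the cost weights are all deliberately toy placeholders, not the paper's compartmental-plus-recurrent-network model or its fuzzy-logic tuning.

```python
# Minimal receding-horizon (MPC) sketch with a toy one-state glucose model.
import numpy as np
from scipy.optimize import minimize

def predict(g0, insulin, meals, a=0.9, b=2.0, c=0.5):
    """Toy model: glucose relaxes toward 100, rises with meals, falls with insulin.
    All coefficients are hypothetical."""
    g = [g0]
    for u, m in zip(insulin, meals):
        g.append(a * (g[-1] - 100.0) + 100.0 + c * m - b * u)
    return np.array(g[1:])

def mpc_step(g0, meals, horizon=6, target=100.0, weight=0.1):
    """Choose the insulin sequence minimising tracking error + insulin effort,
    then return only the first move (receding horizon)."""
    def cost(u):
        g = predict(g0, u, meals[:horizon])
        return np.sum((g - target) ** 2) + weight * np.sum(u ** 2)
    res = minimize(cost, x0=np.zeros(horizon), bounds=[(0.0, 5.0)] * horizon)
    return res.x[0]

print(mpc_step(g0=180.0, meals=np.array([40.0, 0, 0, 0, 0, 0])))
```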

Relevance:

10.00%

Publisher:

Abstract:

Iterative Closest Point (ICP) is a widely used method for point registration that is based on binary point-to-point assignments, whereas the Expectation Conditional Maximization (ECM) algorithm addresses point registration within a maximum-likelihood framework with point-to-cluster matching. In this paper, by implementing both algorithms and conducting experiments in a scenario where dozens of model points must be registered with thousands of observation points on a pelvis model, we investigated and compared the performance (e.g. accuracy and robustness) of both ICP and ECM for point registration in cases without noise and with Gaussian white noise. The experimental results reveal that the ECM method is much less sensitive to initialization and is able to achieve more consistent estimations of the transformation parameters than the ICP algorithm, since the latter easily gets trapped in local minima and leads to quite different registration results for different initializations. Both algorithms can reach the same high level of registration accuracy; however, the ICP method usually requires an appropriate initialization to converge globally. In the presence of Gaussian white noise, it is observed in the experiments that ECM is less efficient but more robust than ICP.
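A compact sketch of the point-to-point ICP baseline discussed above (nearest-neighbour assignment plus a closed-form SVD/Kabsch pose update); the probabilistic point-to-cluster ECM variant is not shown, and the toy data at the end only mimic the "few model points vs. many observation points" setting.

```python
# Point-to-point ICP: nearest-neighbour correspondences + Kabsch pose update.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least-squares R, t mapping corresponding points P onto Q (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])   # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def icp(source, target, iters=30):
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)                  # binary point-to-point assignment
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    return src

# Toy usage: many observation points, few model points, a small transform.
rng = np.random.default_rng(0)
target = rng.random((1000, 3))
angle = 0.1
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
source = target[:50] @ Rz.T + 0.02
aligned = icp(source, target)
dist, _ = cKDTree(target).query(aligned)
print("mean point-to-target distance after ICP:", dist.mean())
# The result depends on the initial pose, which is exactly the sensitivity to
# initialization discussed in the abstract.
```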

Relevance:

10.00%

Publisher:

Abstract:

Extension of 3-D atmospheric data products back into the past is desirable for a wide range of applications. Historical upper-air data are important in this endeavour, particularly in the maritime regions of the tropics and the southern hemisphere, where observations are extremely sparse. Here we present newly digitized and re-evaluated early ship-based upper-air data from two cruises: (1) kite and registering balloon profiles from onboard the ship SMS Planet on a cruise from Europe around South Africa and across the Indian Ocean to the western Pacific in 1906/1907, and (2) ship-based radiosonde data from onboard the MS Schwabenland on a cruise from Europe across the Atlantic to Antarctica and back in 1938/1939. We describe the data and provide estimations of the errors. We compare the data with a recent reanalysis (the Twentieth Century Reanalysis Project, 20CR, Compo et al., 2011) that provides global 3-D data back to the 19th century based on an assimilation of surface pressure data only (plus monthly mean sea-surface temperatures). In cruise (1), the agreement is generally good, but large temperature differences appear during a period with a strong inversion. In cruise (2), after a subset of the data is corrected, close agreement between observations and 20CR is found for geopotential height (GPH) and temperature, notwithstanding a likely cold bias of 20CR at the tropopause level. Results are considerably worse for relative humidity, which was reportedly measured inaccurately. Note that comparing 20CR, which has limited skill in the tropical regions, with measurements from ships in remote regions, made under sometimes difficult conditions, can be considered a worst-case assessment. In view of that fact, the anomaly correlations for temperature of 0.3–0.6 in the lower troposphere in cruise (1) and of 0.5–0.7 for tropospheric temperature and GPH in cruise (2) are considered promising results. Moreover, they are consistent with the error estimations. The results suggest room for further improvement of data products in remote regions.
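For reference, the anomaly correlation used as the skill measure above can be computed as in the sketch below; the five "temperature" values and the use of the observation mean as a stand-in for a proper climatology are illustrative assumptions only.

```python
# Anomaly correlation between observations and a reanalysis, relative to a
# common reference (here simply the observation mean as a placeholder climatology).
import numpy as np

def anomaly_correlation(obs, reanalysis, climatology):
    o = obs - climatology
    r = reanalysis - climatology
    return float(np.sum(o * r) / np.sqrt(np.sum(o ** 2) * np.sum(r ** 2)))

obs = np.array([15.2, 14.8, 16.1, 13.5, 14.0])   # hypothetical temperatures (°C)
rea = np.array([15.0, 14.5, 15.5, 14.2, 14.1])
clim = obs.mean()                                # placeholder climatology
print(anomaly_correlation(obs, rea, clim))
```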

Relevance:

10.00%

Publisher:

Abstract:

Background: We present a compendium of N-ethyl-N-nitrosourea (ENU)-induced mouse mutations, identified in our laboratory over a period of 10 years either on the basis of phenotype or by whole genome and/or whole exome sequencing, and archived in the Mutagenetix database. Our purpose is threefold: 1) to formally describe many point mutations, including those that were not previously disclosed in peer-reviewed publications; 2) to assess the characteristics of these mutations; and 3) to estimate the likelihood that a missense mutation induced by ENU will create a detectable phenotype. Findings: In the context of an ENU mutagenesis program for C57BL/6J mice, a total of 185 phenotypes were tracked to mutations in 129 genes. In addition, 402 incidental mutations were identified and predicted to affect 390 genes. As previously reported, ENU shows strand asymmetry in its induction of mutations, particularly favoring T to A rather than A to T in the sense strand of coding regions and splice junctions. Some amino acid substitutions are far more likely to be damaging than others, and some are far more likely to be observed. Indeed, of a total of 494 non-synonymous coding mutations, ENU was observed to create only 114 of the 182 possible amino acid substitutions that single base changes can achieve. Based on differences in overt null allele frequencies observed in phenotypic vs. non-phenotypic mutation sets, we infer that ENU-induced missense mutations create a detectable phenotype only about 1 time in 4.7. While the remaining mutations may not be functionally neutral, they are, on average, beneath the limits of detection of the phenotypic assays we applied. Conclusions: Collectively, these mutations add to our understanding of the chemical specificity of ENU, the types of amino acid substitutions it creates, and its efficiency in causing phenovariance. Our data support the validity of computational algorithms for the prediction of damage caused by amino acid substitutions, and may lead to refined predictions as to whether specific amino acid changes are responsible for observed phenotypes. These data form the basis for closer in silico estimations of the number of genes mutated to a state of phenovariance by ENU within a population of G3 mice.

Relevance:

10.00%

Publisher:

Abstract:

This study summarises all the accessible data on old German chemical weapons dumped in the Baltic Sea. Mr. Goncharov formulated a concept for evaluating the ecological impact of chemical warfare agents (CWA) on the marine environment and structured a simulation model adapted to the specific hydrological conditions and hydrobiological subjects of the Bornholm Deep. The mathematical model he created describes the spreading of contaminants by currents and turbulence in the near-bottom boundary layer. Parameters of CWA discharge through corrosion of the canisters were given for various kinds of bottom sediments, with allowance for current velocity. He created a method for integral estimations and a computer simulation model, and completed a forecast for the CWA mustard gas, which showed that under normal hydrometeorological conditions there are local toxic plumes drifting along the bottom for distances of up to several kilometres. With storm winds, the toxic plumes from separate canisters merge and lengthen and can reach fishery areas near Bornholm Island. When salt water from the North Sea flows in, the length of the toxic zones can increase to over 100 kilometres and toxic water masses can spread into the northern Baltic. On this basis, Mr. Goncharov drew up recommendations to reduce dangers for human ecology and proposed the creation of a special system for the forecasting and remote sensing of the environmental conditions of CWA burial places.
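A generic, steady-state advection-diffusion plume gives a feel for the kind of near-bottom transport such a model describes; the analytical form below and all numbers in it are textbook placeholders, not Goncharov's simulation.

```python
# Steady-state Gaussian plume from a continuous point source in a current:
# u * dC/dx = K * d2C/dy2, solved downstream of the source at the origin.
import numpy as np

def plume_concentration(x, y, Q=1.0, u=0.05, K=0.5):
    """C(x, y) downstream of the source.
    Q: source strength, u: current speed (m/s), K: lateral eddy diffusivity (m2/s).
    All values are hypothetical."""
    x = np.maximum(x, 1e-6)                     # solution is valid downstream only
    return Q / np.sqrt(4 * np.pi * K * x * u) * np.exp(-u * y ** 2 / (4 * K * x))

# Concentration on the plume axis 1 km and 10 km downstream of a canister:
print(plume_concentration(np.array([1e3, 1e4]), 0.0))
```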

Relevance:

10.00%

Publisher:

Abstract:

The soluble and stable fibrin monomer-fibrinogen complex (SF) is well known to be present in the circulating blood of healthy individuals and of patients with thrombotic diseases. However, its physiological role is not yet fully understood. To deepen our knowledge about this complex, a method for the quantitative analysis of the interaction between soluble fibrin monomers and surface-immobilized fibrinogen has been established by means of resonant mirror (IAsys) and surface plasmon resonance (BIAcore) biosensors. The protocols have been optimized and validated by choosing appropriate immobilization procedures with regeneration steps and suitable fibrin concentrations. The highly specific binding of fibrin monomers to immobilized fibrin(ogen), or vice versa, was characterized by an affinity constant of approximately 10^-8 M, which accords better with the direct dissociation of fibrin triads (KD approximately 10^-8 to 10^-9 M) (J. R. Shainoff and B. N. Dardik, Annals of the New York Academy of Science, 1983, Vol. 27, pp. 254-268) than with earlier estimations of the KD for the fibrin-fibrinogen complex (KD approximately 10^-6 M) (J. L. Usero, C. Izquierdo, F. J. Burguillo, M. G. Roig, A. del Arco, and M. A. Herraez, International Journal of Biochemistry, 1981, Vol. 13, pp. 1191-1196).
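To put the quoted affinity in perspective, a generic 1:1 (Langmuir) binding model shows what KD of roughly 10^-8 M implies for site occupancy; this is only an illustration, not the kinetic analysis performed with the biosensors.

```python
# Equilibrium occupancy of immobilized binding sites in a 1:1 binding model:
# fraction bound = [L] / ([L] + KD).
def fraction_bound(ligand_conc, kd=1e-8):
    return ligand_conc / (ligand_conc + kd)

for c in [1e-9, 1e-8, 1e-7]:                     # fibrin concentrations in M
    print(f"[fibrin] = {c:.0e} M -> {fraction_bound(c):.0%} of sites occupied")
# At a concentration equal to KD, exactly half of the binding sites are occupied.
```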

Relevance:

10.00%

Publisher:

Abstract:

Granger causality (GC) is a statistical technique used to estimate temporal associations in multivariate time series. Many applications and extensions of GC have been proposed since its formulation by Granger in 1969. Here we control for potentially mediating or confounding associations between time series in the context of event-related electrocorticographic (ECoG) recordings. We propose a pruning approach that removes spurious connections and simultaneously reduces the number of estimations required to fit the effective connectivity graph. Additionally, we consider the potential of adjusted GC applied to independent components as a method to explore temporal relationships between underlying source signals. Both approaches overcome limitations encountered when estimating many parameters in multivariate time-series data, an increasingly common predicament in today's brain mapping studies.
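A minimal sketch of conditional ("adjusted") Granger causality, assuming a simple ordinary-least-squares VAR fit: the gain in predicting y from adding lags of x, over and above y's own past and the past of a conditioning series z. The simulated series and the lag order are illustrative; this is not the pruning procedure proposed in the paper.

```python
# Conditional Granger causality x -> y given z, via restricted vs. full AR fits.
import numpy as np

def lagmat(series_list, p):
    """Stack p lags of each series into a design matrix (rows aligned in time)."""
    n = len(series_list[0])
    cols = []
    for s in series_list:
        for k in range(1, p + 1):
            cols.append(s[p - k : n - k])
    return np.column_stack(cols)

def conditional_gc(x, y, z, p=2):
    """GC index and F statistic for x -> y, conditioned on z (1-D arrays)."""
    target = y[p:]
    X_restricted = lagmat([y, z], p)           # past of y and z only
    X_full = lagmat([y, z, x], p)              # ... plus past of x
    def rss(X):
        X = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        r = target - X @ beta
        return r @ r
    rss_r, rss_f = rss(X_restricted), rss(X_full)
    gc = np.log(rss_r / rss_f)                 # > 0 suggests x helps predict y
    df_full = len(target) - X_full.shape[1] - 1
    F = ((rss_r - rss_f) / p) / (rss_f / df_full)   # large F -> significant
    return gc, F

# Toy simulation: z drives both series, and x additionally drives y.
rng = np.random.default_rng(0)
n = 2000
z = rng.standard_normal(n)
x = np.zeros(n); y = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t-1] + 0.4 * z[t-1] + 0.5 * rng.standard_normal()
    y[t] = 0.5 * y[t-1] + 0.6 * x[t-1] + 0.4 * z[t-1] + 0.5 * rng.standard_normal()
print(conditional_gc(x, y, z, p=2))            # clearly positive GC, large F
```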