961 results for continuous-time asymptotics


Relevance: 80.00%

Abstract:

The topic of this thesis is feedback stabilization of the attitude of magnetically actuated spacecraft. Magnetic coils are an attractive solution for generating control torques on small satellites flying inclined low Earth orbits, since magnetic control systems offer reduced weight and cost and higher reliability, and require less power than other kinds of actuators. At the same time, the possibility of smoothly modulating the control torques reduces coupling of the attitude control system with flexible modes, preserving pointing precision compared with the case when pulse-modulated thrusters are used. The actuation principle, based on the interaction between the Earth's magnetic field and the magnetic field generated by the set of coils, introduces an inherent nonlinearity, because control torques can be delivered only in the plane orthogonal to the direction of the geomagnetic field vector. In other words, the system is underactuated: the rotational degrees of freedom of the spacecraft, modeled as a rigid body, exceed the number of independent control actions. The control problem for underactuated spacecraft is also of interest in the case of actuator failure, e.g. after the loss of a reaction wheel in a three-axis stabilized spacecraft with no redundancy. Well-known control strategies are no longer applicable in this case, for either regulation or tracking, so new methods have been suggested to tackle this particular problem. The main contribution of this thesis is to propose continuous time-varying controllers that globally stabilize the attitude of a spacecraft when magnetic torquers alone are used, and when a momentum wheel supports magnetic control in order to overcome the inherent underactuation.
A kinematic maneuver planning scheme, stability analyses, and detailed simulation results are also provided, with new theoretical developments and particular attention to application considerations.
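The underactuation described above follows directly from the torque law τ = m × B: whatever dipole moment m the coils produce, the resulting torque is always perpendicular to the local geomagnetic field, so there is no control authority about the field direction. A minimal sketch, using made-up illustrative values for m and B:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Illustrative values: coil dipole moment (A m^2) and geomagnetic field (T)
m = (0.2, -0.1, 0.05)
B = (1.5e-5, -2.0e-5, 3.0e-5)

tau = cross(m, B)  # magnetic control torque, N m
# The torque component along B is identically zero (up to rounding):
# no torque can be produced about the field direction.
```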

Relevance: 80.00%

Abstract:

The relatively young discipline of astronautics represents one of the most scientifically fascinating and technologically advanced achievements of our time. Human exploration of space not only offers extraordinary research possibilities but also places high demands on people and technology. The space environment provides many attractive experimental tools for understanding fundamental mechanisms in the natural sciences. It has been shown that reduced gravity and elevated radiation, two distinctive factors in space, significantly influence the behavior of biological systems. For this reason, one of the key objectives on board an Earth-orbiting laboratory is research in the life sciences, covering a broad range from botany, human physiology and crew health to biotechnology. The Columbus Module is the only European low-gravity platform that allows researchers to perform ambitious experiments over continuous time frames of up to several months. Biolab, part of the initial outfitting of the Columbus Laboratory, is a multi-user facility supporting research in biology, e.g. the effects of microgravity and space radiation on cell cultures, micro-organisms, small plants and small invertebrates. The Biolab IEC projects are designed to work in the automatic part of Biolab. At present, two such experiments in the TO-53 department of Airbus Defence & Space (formerly Astrium) are in phase C/D of development, and they are the subject of this thesis: CELLRAD and CYTOSKELETON. They will be launched in soft configuration, i.e. packed inside a block of foam whose task is to reduce the launch loads on the payload.
Until about 10 years ago, payloads launched in soft configuration were assumed to be structurally safe in themselves, and a dedicated structural analysis could be waived. With the opening of the launcher market to private companies (which are not under the direct control of the international space agencies), the verification requirements for payloads have changed and become much more conservative. In 2012 a new random-vibration environment was introduced by the new SpaceX launch specification, which proves particularly challenging for soft-launched payloads. The latest ESA specification requires structural analysis of the payload under combined loads (random vibration, quasi-steady acceleration and pressure). The aim of this thesis is to create FEM models able to reproduce the launch configuration, to verify that all margins of safety are positive, and to show how they change under the new SpaceX random environment. Where results are negative, improved design solutions are implemented. Based on the FEM results, a study of the joints has been carried out and, where needed, a crack-growth analysis has been performed.
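A margin of safety in this kind of verification is conventionally the ratio of allowable to applied load (the applied load already factored up by a factor of safety) minus one, and the check passes when every margin is positive. A minimal sketch, with illustrative stress values and factor of safety, not the actual CELLRAD/CYTOSKELETON or ESA numbers:

```python
def margin_of_safety(allowable, applied, fos=1.25):
    """MS = allowable / (applied * FoS) - 1; must be positive to pass."""
    return allowable / (applied * fos) - 1.0

# Illustrative stresses in MPa; the FoS value is an assumption.
ms_baseline = margin_of_safety(allowable=200.0, applied=100.0)  # mild environment
ms_random = margin_of_safety(allowable=200.0, applied=170.0)    # harsher random levels
# A harsher random environment erodes the margin and can drive it negative,
# which is exactly the situation the thesis has to detect and redesign for.
```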

Relevance: 80.00%

Abstract:

We report on the wind radiometer WIRA, a new ground-based microwave Doppler spectro-radiometer specifically designed to measure middle-atmospheric horizontal wind by observing ozone emission spectra at 142.17504 GHz. Currently, wind speeds at five levels between 30 and 79 km can be retrieved, which makes WIRA the first instrument able to continuously measure horizontal wind over this altitude range. For an integration time of one day, the measurement error at each level is around 25 m s−1. With a planned upgrade, this value is expected to be reduced by a factor of 2 in the near future. At the altitude levels where our measurements can be compared with wind data from the European Centre for Medium-Range Weather Forecasts (ECMWF), very good agreement has been found in the long-term statistics as well as in short-term structures lasting a few days. WIRA uses a passive double-sideband heterodyne receiver together with a digital Fourier transform spectrometer for data acquisition. A major advantage of the radiometric approach is that such instruments can also operate under adverse weather conditions and thus provide a continuous time series for a given location. The optics enable the instrument to scan a wide range of azimuth angles, including east, west, north, and south, for zonal and meridional wind measurements. The radiometer is fairly compact, and its calibration does not rely on liquid nitrogen, which makes it transportable and suitable for campaign use. WIRA is conceived to be operated remotely and requires hardly any maintenance. In the present paper, we describe the instrument and outline the techniques used for the wind retrieval, based on determining the Doppler shift of the measured atmospheric ozone emission spectra. Their reliability was tested using Monte Carlo simulations.
Finally, a time series of 11 months of zonal wind measurements over Bern (46°57′ N, 7°26′ E) is presented and compared to ECMWF wind data.
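The retrieval rests on the line-of-sight Doppler relation v = c·Δf/f₀ at the 142.17504 GHz ozone line; a 25 m s−1 wind corresponds to a shift on the order of 10 kHz, which gives a feel for the spectral resolution required. A minimal sketch of the conversion (the actual retrieval fits the whole emission spectrum rather than a single shift value):

```python
C = 299_792_458.0   # speed of light, m/s
F0 = 142.17504e9    # observed ozone line frequency, Hz

def wind_from_shift(delta_f_hz):
    """Line-of-sight wind speed (m/s) from a measured Doppler shift (Hz)."""
    return C * delta_f_hz / F0

def shift_from_wind(v_ms):
    """Doppler shift (Hz) produced by a given line-of-sight wind (m/s)."""
    return F0 * v_ms / C

# A 25 m/s wind shifts the 142 GHz line by roughly 1.2e4 Hz.
```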

Relevance: 80.00%

Abstract:

It is well known that unrecognized heterogeneity among patients, such as that conferred by genetic subtype, can undermine the power of a randomized trial designed under the assumption of homogeneity to detect a truly beneficial treatment. We consider the conditional power approach to allow recovery of power under unexplained heterogeneity. While Proschan and Hunsberger (1995) confined the conditional power design to normally distributed observations, we consider the more general and difficult setting in which the data arise in continuous time and are subject to censoring. In particular, we derive a procedure appropriate for the analysis of the weighted log-rank test under a proportional hazards frailty model. The proposed method is illustrated through application to a brain tumor trial.
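The log-rank statistic underlying such an analysis accumulates observed-minus-expected events over the pooled event times; the weighted version in the paper multiplies each term by a weight w(t). A minimal, self-contained sketch of the unweighted two-group case (the paper's frailty-model and conditional-power machinery is not reproduced here):

```python
import math

def logrank_z(times1, events1, times2, events2):
    """Standardized log-rank statistic for two right-censored samples.
    times*: observation times; events*: 1 if the event occurred, 0 if censored."""
    data = ([(t, e, 0) for t, e in zip(times1, events1)] +
            [(t, e, 1) for t, e in zip(times2, events2)])
    o_minus_e, var = 0.0, 0.0
    for t in sorted({t for t, e, _ in data if e}):
        n1 = sum(1 for tt, _, g in data if tt >= t and g == 0)  # at risk, group 1
        n2 = sum(1 for tt, _, g in data if tt >= t and g == 1)  # at risk, group 2
        d1 = sum(1 for tt, e, g in data if tt == t and e and g == 0)
        d2 = sum(1 for tt, e, g in data if tt == t and e and g == 1)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        o_minus_e += d1 - d * n1 / n                        # observed - expected
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)  # hypergeometric variance
    return o_minus_e / math.sqrt(var)
```

With identical samples the statistic is zero; when group 1 fails systematically earlier, it is positive.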

Relevance: 80.00%

Abstract:

Fossil pollen data from stratigraphic cores are irregularly spaced in time due to non-linear age-depth relations, and their marginal distributions may vary over time. We address these features in a nonparametric regression model with errors that are monotone transformations of a latent continuous-time Gaussian process Z(T). Although Z(T) is unobserved, monotonicity allows it to be recovered under suitable regularity conditions, facilitating further computations such as estimation of the long-memory parameter and the Hermite coefficients. Estimating Z(T) itself involves estimating the marginal distribution function of the regression errors. These issues are addressed in a proposed plug-in algorithm for optimal bandwidth selection and in the construction of confidence bands for the trend function. High-resolution pollen records from Lago di Origlio in Switzerland, which go back ca. 20,000 years, are used to illustrate the methods.
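A standard first look at the long-memory parameter mentioned above is the aggregated-variance method: for a long-memory series the variance of block means decays as m^(2H−2) in the block size m, so the slope of a log-log fit estimates the Hurst parameter H. A minimal sketch, illustrated on white noise (for which H ≈ 0.5), not on the pollen data themselves:

```python
import math
import random

random.seed(3)
x = [random.gauss(0.0, 1.0) for _ in range(4096)]  # white-noise stand-in series

def aggregated_variance_H(x, block_sizes=(4, 8, 16, 32, 64)):
    """Estimate the Hurst parameter H via the aggregated-variance method."""
    pts = []
    for m in block_sizes:
        k = len(x) // m
        means = [sum(x[i * m:(i + 1) * m]) / m for i in range(k)]
        mu = sum(means) / k
        var = sum((v - mu) ** 2 for v in means) / (k - 1)
        pts.append((math.log(m), math.log(var)))
    # least-squares slope of log(var) vs log(m); slope = 2H - 2
    n = len(pts)
    sx = sum(a for a, _ in pts); sy = sum(b for _, b in pts)
    sxx = sum(a * a for a, _ in pts); sxy = sum(a * b for a, b in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return 1.0 + slope / 2.0

H = aggregated_variance_H(x)  # close to 0.5 for uncorrelated noise
```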

Relevance: 80.00%

Abstract:

With the recognition that stochasticity is important in biological systems, stochastic chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulations capture the variability of molecular species most accurately, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. Continuous-time deterministic models, on the other hand, are computationally efficient but fail to capture any variability in the molecular species. In this study, a hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed an adaptive partitioning strategy in which high-frequency processes are simulated with deterministic rate-based equations and low-frequency processes with the exact stochastic algorithm of Gillespie. The stochastic behavior of cellular pathways is therefore preserved while the method remains applicable to large populations of molecules. We describe our method and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca2+ and NMDA receptors.
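The exact stochastic algorithm of Gillespie referenced above draws an exponential waiting time from the total propensity and then picks a reaction in proportion to its propensity. A minimal sketch for a birth-death system (∅ → X at rate kb, X → ∅ at rate kd·x), whose stationary mean is kb/kd; parameter values are illustrative:

```python
import random

random.seed(1)

def gillespie_birth_death(kb=10.0, kd=1.0, t_end=500.0):
    """Exact SSA for birth-death kinetics; returns the time-averaged copy number."""
    t, x, weighted_sum = 0.0, 0, 0.0
    while t < t_end:
        a_birth, a_death = kb, kd * x
        a_total = a_birth + a_death            # total propensity (always > 0 here)
        tau = random.expovariate(a_total)      # exponential waiting time
        weighted_sum += x * min(tau, t_end - t)
        t += tau
        if t >= t_end:
            break
        if random.random() * a_total < a_birth:  # choose reaction by propensity
            x += 1
        else:
            x -= 1
    return weighted_sum / t_end

avg = gillespie_birth_death()  # fluctuates around kb/kd = 10
```

It is exactly this per-event cost, paid for every reaction firing, that makes the pure SSA expensive for high-frequency processes and motivates the hybrid partitioning.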

Relevance: 80.00%

Abstract:

We introduce a multistable subordinator, which generalizes the stable subordinator to the case of a time-varying stability index. This enables us to define a multifractional Poisson process. We study properties of these processes and establish the convergence of a continuous-time random walk to the multifractional Poisson process.

Relevance: 80.00%

Abstract:

This study investigates the role of credit risk in a continuous-time stochastic asset allocation model, since the traditional dynamic framework does not accommodate credit risk. The general model extends the traditional dynamic efficiency framework by explicitly deriving the optimal value function for the infinite-horizon stochastic control problem via a weighted volatility measure of market and credit risk. The model's optimal strategy is then compared with that obtained from a benchmark Markowitz-type dynamic optimization framework to determine which specification adequately reflects optimal terminal investment returns and strategy under credit and market risk. The paper shows that an investor's optimal terminal return is lower than typically indicated by the traditional mean-variance framework during periods of elevated credit risk. I therefore conclude that, while the traditional dynamic mean-variance approach may indicate the ideal, in the presence of credit risk it does not accurately reflect the observed optimal returns, terminal wealth and portfolio selection strategies.
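The qualitative conclusion, that elevated credit risk lowers the optimal risky allocation and hence terminal returns relative to a pure market-risk mean-variance rule, can be illustrated with a Merton-style closed form in which the variance is a weighted blend of market and credit components. This is a sketch under an assumed functional form and illustrative parameter values, not the paper's actual value function:

```python
def optimal_risky_share(mu, r, gamma, sigma_market, sigma_credit, w=0.5):
    """Merton-style share (mu - r) / (gamma * sigma^2) with a blended variance.
    w is an assumed weight between market and credit variance."""
    sigma2 = w * sigma_market ** 2 + (1.0 - w) * sigma_credit ** 2
    return (mu - r) / (gamma * sigma2)

calm = optimal_risky_share(0.08, 0.02, 3.0, 0.20, 0.10)    # low credit risk
stress = optimal_risky_share(0.08, 0.02, 3.0, 0.20, 0.30)  # elevated credit risk
# The risky allocation, and with it the expected terminal return,
# falls when credit risk is elevated, as the abstract concludes.
```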

Relevance: 80.00%

Abstract:

Five sections drilled in multiple holes over a depth transect of more than 2200 m at Walvis Ridge (SE Atlantic) during Ocean Drilling Program (ODP) Leg 208 yielded the first complete early Paleogene deep-sea record. Here we present high-resolution stratigraphic records spanning a ~4.3-million-year-long interval of the late Paleocene to early Eocene. This interval includes the Paleocene-Eocene thermal maximum (PETM) as well as the Eocene thermal maximum 2 (ETM2) event. A detailed chronology was developed from nondestructive X-ray fluorescence (XRF) core-scanning records and shipboard color data. These records were used to refine the shipboard spliced composite depth for each site and, together with a record from ODP Site 1051, to establish a continuous time series over this interval. Extensive spectral analysis reveals that early Paleogene sedimentary cyclicity is dominated by precession, modulated by the short (100 kyr) and long (405 kyr) eccentricity cycles. Counting precession-related cycles at multiple sites yields revised estimates for the durations of magnetochrons C24r and C25n. Direct comparison between the amplitude modulation of the precession component derived from the XRF data and recent models of Earth's orbital eccentricity suggests that the onsets of the PETM and ETM2 are related to 100-kyr eccentricity maxima. Both events are offset by roughly a quarter period from a maximum in the 405-kyr eccentricity cycle, with the major difference that the PETM lags and ETM2 leads a 405-kyr eccentricity maximum. Absolute age estimates for the PETM, ETM2, and the magnetochron boundaries that are consistent with recalibrated radiometric ages and recent models of Earth's orbital eccentricity cannot be determined precisely at present, because the uncertainties in these methods are still too large.
Nevertheless, we provide two possible tuning options, which demonstrate the potential for developing a cyclostratigraphic framework based on the stable 405-kyr eccentricity cycle for the entire Paleogene.
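The spectral-analysis step above boils down to locating orbital periods in a periodogram. As a toy illustration of the idea (a synthetic record sampled every 1 kyr containing a pure 21 kyr precession-like cycle, not the actual XRF data), a direct DFT recovers the dominant period:

```python
import cmath
import math

N = 420  # 420 synthetic samples at 1 kyr spacing
series = [math.cos(2 * math.pi * t / 21.0) for t in range(N)]  # 21 kyr cycle

def periodogram(x):
    """DFT magnitudes up to the Nyquist frequency (direct O(N^2) transform)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n // 2)]

mags = periodogram(series)
k_peak = max(range(1, len(mags)), key=lambda k: mags[k])
dominant_period_kyr = N / k_peak  # recovers the 21 kyr input cycle
```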

Relevance: 80.00%

Abstract:

Ocean acidification and warming are expected to threaten the persistence of tropical coral reef ecosystems. As coral reefs face multiple stressors, the distribution and abundance of corals will depend on successful dispersal and settlement of coral larvae under changing environmental conditions. To explore this scenario, we used metabolic rate, at the holobiont and molecular levels, as an index for assessing the physiological plasticity of Pocillopora damicornis larvae from a reef in Moorea, French Polynesia, to conditions of ocean acidification and warming. Larvae were incubated for 6 hours in seawater with combinations of CO2 concentration (450 and 950 µatm) and temperature (28 and 30°C). Rates of larval oxygen consumption were higher at the elevated temperature. In contrast, high CO2 levels elicited depressed metabolic rates, especially for larvae released later in the spawning period. Activities of citrate synthase, a rate-limiting enzyme in aerobic metabolism, suggested a biochemical limit to increasing oxidative capacity in coral larvae in a warming, acidifying ocean. Biological responses were also compared between larvae released from adult colonies on the same day (cohorts); the metabolic physiology of P. damicornis larvae varied significantly by day of release. Additionally, we used environmental data collected on the natal fringing reef to characterize what adult corals and larvae currently experience in the field: an autonomous pH sensor provided a continuous time series of pH, and in February/March 2011 pH values averaged 8.075±0.023. Our results suggest that without adaptation or acclimatization, only a portion of naïve P. damicornis larvae may have metabolic phenotypes suitable for maintaining function and fitness in an end-of-the-century ocean.

Relevance: 80.00%

Abstract:

The extraordinary growth of new information technologies, the development of the Internet, electronic commerce, e-government, mobile telephony, and future cloud computing and storage has brought great benefits to all areas of society. Alongside these come new challenges for the protection of information, such as the loss of confidentiality and integrity of electronic documents. Cryptography plays a key role by providing the tools needed to ensure the safety of these new media, and it is imperative to intensify research in this area to meet the growing demand for new, secure cryptographic techniques. The theory of chaotic nonlinear dynamical systems and the theory of cryptography together give rise to chaotic cryptography, the field of study of this thesis. The link between cryptography and chaotic systems is still the subject of intense study. The combination of apparently stochastic behavior, sensitivity to initial conditions and parameters, ergodicity, mixing, and the density of periodic points suggests that chaotic orbits resemble random sequences. This fact, together with the ability to synchronize multiple chaotic systems, first described by Pecora and Carroll, has generated an avalanche of research papers relating cryptography and chaos. Chaotic cryptography follows two fundamental design paradigms. In the first, chaotic cryptosystems are designed in continuous time, mainly based on chaotic synchronization techniques, and are implemented with analog circuits or by computer simulation. In the second, chaotic cryptosystems are constructed in discrete time and generally do not depend on chaos synchronization techniques. The contributions of this thesis involve three aspects of chaotic cryptography. The first is a theoretical analysis of the geometric properties of some of the chaotic attractors most frequently employed in the design of chaotic cryptosystems.
The second is the cryptanalysis of continuous chaotic cryptosystems, and the third comprises three new designs of cryptographically secure chaotic pseudorandom generators. The main accomplishments of this thesis are the following. A method was developed for determining the parameters of some double-scroll chaotic systems, including the Lorenz system and Chua's circuit. First, geometric characteristics of the chaotic system are used to reduce the parameter search space; next, a scheme based on the synchronization of chaotic systems is built, with the geometric properties employed as a matching criterion to determine the parameter values to the desired accuracy. The method is not affected by a moderate amount of noise in the waveform, and it has been applied to find security flaws in continuous chaotic encryption systems. Based on these results, the chaotic ciphers proposed by Wang and Bu and those proposed by Xu and Li are cryptanalyzed. We propose some improvements to these cryptosystems, although of very limited scope, because such systems are not suitable for use in cryptography. A related method was developed for determining the parameters of the Lorenz system when it is used in a two-channel cryptosystem: the geometric properties of the Lorenz system are again used to reduce the parameter search space, after which the parameters are accurately determined from the ciphertext. This method has been applied to the cryptanalysis of an encryption scheme proposed by Jiang. In 2005, Gunay et al. proposed a chaotic encryption system based on a cellular neural network implementation of Chua's circuit; this scheme has also been cryptanalyzed, and several gaps in its security design have been identified. Based on theoretical results on digital chaotic systems and the cryptanalysis of several recently proposed chaotic ciphers, a family of pseudorandom generators has been designed in finite precision, based on the coupling of several piecewise linear chaotic maps. From these results, a new family of chaotic pseudorandom generators named Trident has been designed, aimed specifically at the needs of real-time encryption on mobile devices. The thesis also proposes another family of pseudorandom generators, called Trifork, based on a combination of perturbed lagged Fibonacci generators; this family is cryptographically secure and suitable for real-time encryption. Detailed analysis shows that the proposed pseudorandom generators provide fast encryption speed and a high level of security at the same time.
The Spanish summary of the thesis adds the following points. The first part presents a critical analysis of the security of chaotic cryptosystems, concluding that the great majority of continuous chaotic encryption algorithms, whether implemented physically or programmed numerically, have serious shortcomings for protecting the confidentiality of information, being both insecure and inefficient. Likewise, a large share of the proposed discrete chaotic cryptosystems are considered insecure, and others have not yet been attacked, so further cryptanalytic work is needed; this part concludes by listing the main weaknesses found in the analyzed cryptosystems together with some recommendations for their improvement. The second part designs a cryptanalytic method for identifying the parameters, which generally form part of the key, of ciphers based on Lorenz-like chaotic systems that use driver-response synchronization schemes; the method relies on geometric characteristics of the Lorenz attractor and has been used to efficiently cryptanalyze three encryption algorithms, with two further recently proposed schemes also broken. The third part covers the design of cryptographically secure pseudorandom generators based on chaotic maps, with statistical tests corroborating their randomness; these generators can be used to build stream ciphers and to meet real-time encryption needs. An important issue in the design of discrete chaotic ciphers is the dynamical degradation caused by finite precision, an aspect most designers have not considered seriously; this thesis emphasizes its importance and contributes some initial considerations. Since the theoretical questions about the degradation of digital chaotic dynamics have not been fully resolved, practical remedies are adopted: several solutions, such as bit-rotation and bit-shift operations combined with dynamic parameter variation and cross perturbation, are proposed and evaluated, and they provide an excellent remedy for dynamical degradation. Beyond degradation, many cryptosystems are broken because of careless design rather than essential defects of digital chaotic systems; this has been taken into account in achieving the design of cryptographically secure chaotic pseudorandom generators.
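The flavor of a chaos-based pseudorandom generator built from piecewise linear maps can be conveyed with a skew tent map iterated in floating point, with an occasional state perturbation standing in crudely for the anti-degradation techniques (bit rotations and shifts, dynamic parameter variation, cross perturbation) discussed above. This is an illustrative toy, not the Trident or Trifork design, and is not cryptographically secure as written:

```python
def skew_tent(x, p):
    """Piecewise linear chaotic map on [0, 1] with breakpoint p."""
    return x / p if x < p else (1.0 - x) / (1.0 - p)

def chaotic_bits(seed=0.37, p=0.499, n=64, burn_in=100):
    """Generate n bits by thresholding the orbit of a skew tent map."""
    x = seed
    for _ in range(burn_in):  # discard the transient
        x = skew_tent(x, p)
    bits = []
    for i in range(n):
        x = skew_tent(x, p)
        bits.append(1 if x >= 0.5 else 0)
        if i % 16 == 15:
            # toy perturbation to counter finite-precision dynamical degradation
            x = (x + 1e-7 * (i + 1)) % 1.0
    return bits

stream = chaotic_bits()
```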

Relevance: 80.00%

Abstract:

For taxonomic levels higher than species, the abundance distributions of the number of subtaxa per taxon tend to approximate power laws but often show strong deviations from such laws. Previously, these deviations were attributed to finite-time effects in a continuous-time branching process at the generic level. Instead, we describe herein a simple discrete branching process that generates the observed distributions and find that the distribution's deviation from power law form is not caused by disequilibration, but rather that it is time independent and determined by the evolutionary properties of the taxa of interest. Our model predicts—with no free parameters—the rank-frequency distribution of the number of families in fossil marine animal orders obtained from the fossil record. We find that near power law distributions are statistically almost inevitable for taxa higher than species. The branching model also sheds light on species-abundance patterns, as well as on links between evolutionary processes, self-organized criticality, and fractals.
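The discrete branching picture can be made concrete with a toy Galton-Watson process: each taxon leaves a geometric number of daughter taxa with mean one (criticality), and the total progeny of a lineage plays the role of the number of subtaxa per taxon. The total progeny of a critical branching process is known to have a power-law tail (exponent 3/2), which is the mechanism behind the near power law distributions described above; all parameters here are illustrative:

```python
import random

random.seed(42)

def total_progeny(cap=100_000):
    """Total size of one critical Galton-Watson tree (geometric offspring, mean 1)."""
    total, frontier = 1, 1
    while frontier and total < cap:
        children = 0
        for _ in range(frontier):
            k = 0
            while random.random() < 0.5:  # P(k offspring) = (1/2)^(k+1), mean 1
                k += 1
            children += k
        total += children
        frontier = children
    return total

sizes = [total_progeny() for _ in range(5000)]
# Most lineages stay tiny, but a heavy tail of very subtaxon-rich taxa appears,
# mirroring the near power law abundance distributions of higher taxa.
```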

Relevance: 80.00%

Abstract:

We explore the role of business services in knowledge accumulation and growth, and the determinants of knowledge diffusion, including the role of distance. A continuous-time model is estimated for several European countries, Japan, and the US. Policy simulations illustrate the benefits for EU growth of deepening the single market, reducing regulatory barriers, and accumulating technology and human capital. Our results support the basic insights of the Lisbon Agenda: economic growth in Europe is enhanced to the extent that trade in services increases, technology accumulation and diffusion increase, regulation becomes both less intensive and more uniform across countries, and human capital accumulation increases in all countries.

Relevance: 80.00%

Abstract:

We study the impact of the different stages of human capital accumulation on the evolution of labor productivity in a model calibrated to the U.S. from 1961 to 2008. We add early childhood education to a standard continuous-time life cycle economy and assume complementarity between educational stages. There are three sectors in the model: the goods sector, the early childhood sector, and the formal education sector. Agents are homogeneous and choose the intensity of preschool education, how long to stay in formal school, labor effort, and consumption; there are exogenous distortions to all four decisions. The model matches the data very well, closely reproducing the paths of schooling, hours worked, relative prices and GDP. We find that the reduction in distortions to early education over the period was large and contributed strongly to human capital accumulation. However, due to general equilibrium effects of labor market taxation, marginal changes to the incentives for early education in 2008 have a smaller impact than equivalent changes for formal education. This is because the former do not decisively affect the decision to join the labor market, while the latter do. Without labor taxation, incentives for preschool are significantly stronger.

Relevance: 80.00%

Abstract:

This dataset contains continuous time series of land surface temperature (LST) at 300 m spatial resolution around the 12 experimental sites of the PAGE21 project (grant agreement number 282700, funded by the EC Seventh Framework Programme theme FP7-ENV-2011). It was produced from hourly LST time series at 25 km scale, retrieved from SSM/I data (André et al., 2015, doi:10.1016/j.rse.2015.01.028), and downscaled to 300 m using a dynamic model and a particle smoothing approach. The methodology rests on two main assumptions: first, that LST spatial variability is mostly explained by land cover and soil hydric state; second, that LST is unique for a given land-cover class within the low-resolution pixel. Under these hypotheses, LST can be estimated using a land cover map and a physically based land surface model constrained with observations through data assimilation. The methodology, described in Mechri et al. (2014, doi:10.1002/2013JD020354), was applied to the ORCHIDEE land surface model (Krinner et al., 2005, doi:10.1029/2003GB002199) to estimate prior values for each land-cover class provided by the ESA CCI Land Cover product (Bontemps et al., 2013) at 300 m resolution. The assimilation process (a particle smoother) consists in simulating ensembles of LST time series for each land-cover class over a large number of parameter sets. For each parameter set, the resulting temperatures are aggregated according to the grid fraction of each land cover and compared with the coarse observations. Minimizing the distance between the aggregated model solutions and the observations allows us to select the simulated LST, and the corresponding parameter sets, that fit the observations most closely. The retained parameter sets are then duplicated and randomly perturbed before the next time window is simulated. Finally, the most likely LST of each land-cover class is estimated and used to reconstruct LST maps at 300 m resolution using the ESA CCI Land Cover map.
The resulting temperature maps, on which ice pixels were masked, are provided at a daily time step over the nine-year analysis period (2000-2009).
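The core of the particle-smoother step described above, aggregating per-class temperatures to the coarse pixel, weighting by fit to the coarse observation, and resampling, can be sketched in a few lines. All numbers (cover fractions, observation, error, particle ranges) are illustrative placeholders, not PAGE21 values:

```python
import math
import random

random.seed(7)

frac = (0.6, 0.4)        # cover fractions of two land-cover classes in the coarse pixel
obs, sigma = 281.0, 1.0  # coarse-pixel LST observation (K) and its error (K)

# Ensemble: each particle holds one candidate LST value per land-cover class
particles = [(278.0 + 6.0 * random.random(), 280.0 + 6.0 * random.random())
             for _ in range(500)]

def weight(p):
    """Gaussian likelihood of the aggregated (coarse-scale) LST given the observation."""
    aggregated = frac[0] * p[0] + frac[1] * p[1]
    return math.exp(-0.5 * ((aggregated - obs) / sigma) ** 2)

weights = [weight(p) for p in particles]
total = sum(weights)
weights = [w / total for w in weights]

# Importance resampling keeps particles whose aggregate matches the observation
resampled = random.choices(particles, weights=weights, k=len(particles))
lst_class = tuple(sum(p[i] for p in resampled) / len(resampled) for i in range(2))
# lst_class now holds one smoothed LST estimate per land-cover class,
# which is what gets mapped back to 300 m using the land cover map.
```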