826 results for Representation of time
Abstract:
Context. Case-control studies are very frequently used by epidemiologists to assess the impact of certain exposures on a particular disease. These exposures may be represented by several time-dependent variables, and new methods are needed to estimate their effects accurately. Indeed, logistic regression, the conventional method for analysing case-control data, does not directly account for changes in covariate values over time. In contrast, survival analysis methods such as the Cox proportional hazards model can directly incorporate time-dependent covariates representing individual exposure histories. However, this requires careful handling of the risk sets because of the oversampling of cases, relative to controls, in case-control studies. As shown in a previous simulation study, the optimal definition of risk sets for the analysis of case-control data has yet to be elucidated, and has yet to be investigated for time-dependent variables. Objective: the general objective is to propose and study new versions of the Cox model for estimating the impact of time-varying exposures in case-control studies, and to apply them to real case-control data on lung cancer and smoking. Methods. I identified new, potentially optimal risk-set definitions (the Weighted Cox model and the Simple weighted Cox model), in which different weights were assigned to cases and controls so as to reflect the proportions of cases and non-cases in the source population. The properties of the exposure-effect estimators were studied by simulation. Different aspects of exposure were generated (intensity, duration, cumulative exposure). The generated case-control data were then analysed with different versions of the Cox model, including the existing and the new risk-set definitions, as well as with conventional logistic regression for comparison. The different regression models were then applied to real case-control data on lung cancer. The estimates of the effects of various smoking variables obtained with the different methods were compared with each other and with the simulation results. Results. The simulation results show that the estimates from the proposed new weighted Cox models, especially those of the Weighted Cox model, are far less biased than the estimates from existing Cox models that simply include or exclude the future cases from each risk set. Moreover, the estimates of the Weighted Cox model were slightly, but systematically, less biased than those of logistic regression. The application to the real data shows larger differences between the estimates of logistic regression and of the weighted Cox models for some time-dependent smoking variables. Conclusions. The results suggest that the proposed new weighted Cox model could be an interesting alternative to logistic regression for estimating the effects of time-dependent exposures in case-control studies.
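As a rough illustration of the weighting idea described above (not the thesis code), the sketch below assigns one weight to cases and another to controls so that their weighted shares match an assumed case fraction in the source population, and then fits a weighted Cox model with lifelines. The column names, toy data and assumed case fraction are all hypothetical.

```python
# Illustrative sketch only: weighted Cox fit in which cases and controls get
# weights chosen to reflect the proportions of cases and non-cases in the
# source population. Not the thesis implementation.
import pandas as pd
from lifelines import CoxPHFitter

def case_control_weights(df, event_col, pop_case_fraction):
    """One weight for cases, one for controls, so their weighted shares
    match the assumed case fraction in the source population."""
    case_share = df[event_col].mean()
    w_case = pop_case_fraction / case_share
    w_control = (1 - pop_case_fraction) / (1 - case_share)
    return df[event_col].map({1: w_case, 0: w_control})

# toy case-control data: age at diagnosis/interview, case indicator,
# and a summary smoking exposure (cumulative pack-years)
df = pd.DataFrame({
    "age":        [63, 58, 70, 66, 55, 61],
    "case":       [1, 1, 1, 0, 0, 0],
    "pack_years": [40, 10, 55, 30, 0, 20],
})
df["w"] = case_control_weights(df, "case", pop_case_fraction=0.01)

cph = CoxPHFitter()
# robust=True requests a sandwich variance estimator, advisable with weights
cph.fit(df, duration_col="age", event_col="case",
        weights_col="w", robust=True)
print(cph.summary[["coef", "exp(coef)"]])
```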
Abstract:
For lack of copyright permission for the screenshots, my document contains no images. If you would like to consult my thesis with the images, please contact me.
Abstract:
We provide a representation theorem for risk measures satisfying (i) monotonicity; (ii) positive homogeneity; and (iii) translation invariance. As a simple corollary to our theorem, we obtain the usual representation of coherent risk measures (i.e., risk measures that are, in addition, sub-additive; see Artzner et al. [2]).
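For reference, the standard statements of these axioms for a risk measure ρ, following the usual conventions in Artzner et al. (sign conventions vary between authors; X denotes a future net position), can be sketched as:

```latex
% Standard axioms for a risk measure \rho; conventions vary between authors.
\begin{align*}
\text{(i) Monotonicity:} \quad & X \le Y \;\Longrightarrow\; \rho(X) \ge \rho(Y),\\
\text{(ii) Positive homogeneity:} \quad & \rho(\lambda X) = \lambda\,\rho(X) \quad \text{for all } \lambda \ge 0,\\
\text{(iii) Translation invariance:} \quad & \rho(X + m) = \rho(X) - m \quad \text{for all } m \in \mathbb{R},\\
\text{Sub-additivity (coherence):} \quad & \rho(X + Y) \le \rho(X) + \rho(Y).
\end{align*}
```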
Abstract:
The present research problem is to study existing encryption methods and to develop a new technique that is performance-wise superior to other existing techniques and, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems alongside existing error-checking / error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
We propose to show in this paper that time series obtained from biological systems such as the human brain are invariably nonstationary because of the different time scales involved in the dynamical process. This makes the invariant parameters time-dependent. We carried out a global analysis of EEG data obtained from eight locations on the skull and studied simultaneously the dynamical characteristics of various parts of the brain. We show that the dynamical parameters are sensitive to the time scales involved, and hence in the study of the brain one must identify all relevant time scales in the process to gain insight into the working of the brain.
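A minimal illustration of the underlying point, not the nonlinear-dynamics analysis used in the work itself: estimating a signal property in sliding windows exposes its time dependence when the series is nonstationary. The synthetic signal, sampling rate and window length below are assumptions.

```python
# Illustrative sketch: for a stationary series the windowed estimates would
# fluctuate around a constant; here the dominant frequency drifts, so the
# "invariant" parameter becomes time-dependent.
import numpy as np

fs = 256.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)                 # 60 s of synthetic "EEG-like" data
# slow chirp: instantaneous frequency rises from 8 Hz to about 16 Hz
x = np.sin(2 * np.pi * (8 + 4 * t / t[-1]) * t) + 0.5 * np.random.randn(t.size)

win = int(4 * fs)                            # 4 s analysis windows
for start in range(0, x.size - win, win):
    seg = x[start:start + win]
    freqs = np.fft.rfftfreq(win, 1 / fs)
    spectrum = np.abs(np.fft.rfft(seg - seg.mean()))
    f_dom = freqs[spectrum.argmax()]
    print(f"t = {start / fs:5.1f}-{(start + win) / fs:5.1f} s  "
          f"dominant frequency ~ {f_dom:4.1f} Hz, variance = {seg.var():.2f}")
```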
Abstract:
Sonar signal processing comprises a large number of signal processing algorithms implementing functions such as Target Detection, Localisation, Classification, Tracking and Parameter Estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, which are primarily meant for stationary signals. Interestingly, the signals received by sonar sensors are often non-stationary, and hence processing methods capable of handling the non-stationarity will fare better than Fourier-transform-based methods. Time-frequency methods (TFMs) are known as one of the best DSP tools for non-stationary signal processing, with which one can analyse signals in the time and frequency domains simultaneously. However, other than the STFT, TFMs have been largely limited to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in the fields of speech processing, image processing and biomedical applications, but not many in sonar processing. A structured effort to fill this lacuna by exploring the potential of TFMs in sonar applications is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution and the Ambiguity Function, and their potential in implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each of the applications.
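As a minimal illustration of the simplest of these TFMs, the sketch below computes an STFT of a synthetic, sonar-like linear chirp and reads off the dominant frequency in each frame; the signal parameters and the use of SciPy are illustrative assumptions, not material from the thesis.

```python
# Minimal sketch: time-frequency view of a non-stationary signal via the STFT.
import numpy as np
from scipy.signal import chirp, stft

fs = 8000.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
# linear chirp sweeping 500 Hz -> 2500 Hz, plus a little noise
x = chirp(t, f0=500, t1=1.0, f1=2500) + 0.1 * np.random.randn(t.size)

f, tau, Zxx = stft(x, fs=fs, nperseg=256)    # time-frequency representation
ridge = f[np.abs(Zxx).argmax(axis=0)]        # dominant frequency per frame
for time, freq in zip(tau[::10], ridge[::10]):
    print(f"t = {time:0.3f} s -> dominant frequency ~ {freq:6.1f} Hz")
```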
Abstract:
This thesis is a study of the Provenance, Sedimentation and Geochemistry of the Modern Sediments of the Mud Banks off the Central Kerala Coast, India. In the present doctoral work, an attempt has been made to study in detail the mud banks of central Kerala, i.e. the Narakkal, Saudi and Purakkad areas, which have long been reported as permanent mud banks. The studies were conducted during the years 1985 and 1986. Among the important findings, clay-mineralogical analyses of the river, lake and mud-bank sediments reveal that the dominant clay mineral is kaolinite, followed by montmorillonite, illite and gibbsite. Geochemical analysis of the Vembanad lake and mud-bank sediments shows that iron and manganese are widely distributed in both the lake and the mud-bank sediments.
Abstract:
This study is concerned with Autoregressive Moving Average (ARMA) models of time series. ARMA models form a subclass of the class of general linear models that represent stationary time series, a phenomenon encountered most often in practice by engineers, scientists and economists. It is always desirable to employ models that use parameters parsimoniously. Parsimony is achieved by ARMA models because they have only a finite number of parameters. Even though the discussion is primarily concerned with stationary time series, we later take up the case of homogeneous non-stationary time series, which can be transformed into stationary time series. Time series models, obtained with the help of present and past data, are used for forecasting future values. The physical sciences as well as the social sciences benefit from forecasting models. The role of forecasting cuts across all fields of management (finance, marketing, production, business economics), as well as signal processing, communication engineering, chemical processes, electronics, etc. This wide applicability of time series is the motivation for this study.
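As an illustrative sketch of how such a parsimonious model is fitted and used for forecasting in practice (assuming statsmodels; the simulated parameters below are arbitrary, not from the study):

```python
# Minimal sketch: simulate an ARMA(1,1) series, fit it, and forecast ahead.
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(0)
# ARMA(1,1): x_t = 0.6 x_{t-1} + e_t + 0.3 e_{t-1}
# (lag-polynomial convention: AR coefficients enter with opposite sign)
ar, ma = [1.0, -0.6], [1.0, 0.3]
x = arma_generate_sample(ar, ma, nsample=500)

model = ARIMA(x, order=(1, 0, 1))            # ARMA(p, q) == ARIMA(p, 0, q)
res = model.fit()
print(res.params)                            # estimated AR/MA coefficients
print(res.forecast(steps=5))                 # forecasts of future values
```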
Squeezed Coherent State Representation of Scalar Field and Particle Production in the Early Universe
Abstract:
The present work is an attempt to explain particle production in the early universe. We argue that nonzero values of the stress-energy tensor evaluated in the squeezed vacuum state can be due to particle production, and this supports the concept of particle production from zero-point quantum fluctuations. In the present calculation we use the squeezed coherent state introduced by Fan and Xiao [7]. The vacuum expectation values of the stress-energy tensor, defined prior to any dynamics in the background gravitational field, give all the information about particle production. Squeezing of the vacuum is achieved by means of the background gravitational field, which plays the role of a parametric amplifier [8]. The present calculation shows that the vacuum expectation values of the energy density and pressure contain terms in addition to the classical zero-point energy terms. The calculation of the particle production probability shows that the probability increases as the squeezing parameter increases, reaches a maximum value, and then decreases.
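For orientation, the standard textbook form of a squeezed coherent state, and the mean number of quanta in the squeezed vacuum, are sketched below; conventions and the ordering of the displacement and squeeze operators vary between authors, and the Fan-Xiao construction used in the work may differ in parametrization.

```latex
% Standard squeezed coherent state (textbook conventions, not the thesis notation).
\begin{align*}
|\alpha,\zeta\rangle &= D(\alpha)\,S(\zeta)\,|0\rangle, \\
D(\alpha) &= \exp\!\left(\alpha a^{\dagger} - \alpha^{*} a\right), \qquad
S(\zeta) = \exp\!\left[\tfrac{1}{2}\!\left(\zeta^{*} a^{2} - \zeta\, a^{\dagger 2}\right)\right],
\qquad \zeta = r e^{i\theta}, \\
\langle 0|\,S^{\dagger}(\zeta)\, a^{\dagger} a \,S(\zeta)\,|0\rangle &= \sinh^{2} r
\quad \text{(mean number of quanta in the squeezed vacuum).}
\end{align*}
```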
Abstract:
Organic Molecules: Depiction of Structure - The Basics. PowerPoint presentation of A-level revision material for first-year undergraduates, written by Jeremy Hinks, School of Chemistry, in 2002.
Abstract:
Introduction: comprehensive undergraduate education in the clinical sciences is grounded in the activities developed during clerkships. To implement the credit system we must know how these experiences take place. Objectives: to describe how students spend their time in clerkships, how they assess the educational value of the activities, and the enjoyment those activities provide. Method: we distributed a form to a random clustered sample of 100 students taking clinical sciences courses, designed to record the time spent on clerkship activities during a week and to assess their educational value and degree of enjoyment. Data were recorded and analysed in Excel® 98 and SPSS. Results: the mean time spent by students in clerkship activities per day was 10.8 hours, of which 7.3 hours (69%) were spent in formal education activities. Patient care activities with teachers occupied the largest proportion of time (15.4%). Of the teaching and learning activities in a week, 28 hours (56%) were spent in patient care activities and 22.4 hours (44.5%) in independent academic work. The time spent in teaching and learning activities corresponds to 19 credits in a semester of 18 weeks. The activities assessed as having the greatest educational value were homework activities (4.6) and formal education activities (4.5). Those graded as most enjoyable were extracurricular activities, formal educational activities and independent academic work. Conclusion: our students spend more time in activities with patients than reported in the literature. The attendance workload of our students is greater than that reported in similar studies.
Abstract:
Abstract taken from the publication. With the financial support of the MIDE department of the UNED.
Abstract:
The thematic and the anathematic in music-making: a reflection occurring after listening to an art song and a pop song from 19th- and 20th-century Levantine music. Bach's Orchestral Suites keep popping up, elegantly unveiling "the truth".