808 results for Empirical Predictions
Abstract:
The tool developed as part of this thesis is available at the following address: www.astro.umontreal.ca/~malo/banyan.php
Abstract:
Social support and coping are both related to posttraumatic stress disorder (PTSD) symptoms, but the mechanisms underlying their relationships remain unclear. This study explores these relationships by examining the perceived frequency of supportive and countersupportive interactions with a significant other in PTSD patients. Ninety-six participants with PTSD were recruited and completed questionnaires assessing social interactions, ways of coping, and PTSD symptoms. Associations of social interactions (r² = 4.1%–7.9%, p < .05) and coping (r² = 15.9%–16.5%, p < .001) with symptoms were independent, suggesting a direct association between social interactions and PTSD. Countersupportive interactions were more strongly associated with symptoms than were supportive interactions. Our findings argue for the development of psychotherapies that integrate social support interventions.
Abstract:
The general objective of the thesis was to broaden scientific knowledge about children's sleep. The thesis comprises four empirical articles. The first aimed to estimate the validity of actigraphy as a sleep measure in preschool-aged children by comparing it with polysomnography, and to examine whether its placement influences its validity. Twelve children aged 2 to 5 years simultaneously wore one actigraph on the ankle and one on the wrist during one night of polysomnographic recording. The results show that actigraphy detects sleep well, but detects wakefulness less reliably. This article also suggests that young children require an algorithm adapted to their level of nocturnal activity. Finally, the validity of actigraphy appears similar for the wrist and the ankle. The second article aimed to compare three sleep measures often used with young children, namely sleep diaries, the sleep problems scale of the Child Behavior Checklist (CBCL), and actigraphy, to determine their similarities and divergences with respect to the sleep variables derived from them. Eighty families participated in this study when the children were 2 years old. The children wore an actigraph for 72 consecutive hours and the mothers completed a sleep diary over the same period. Both parents also completed the CBCL. The results show that these measures assess different aspects of the child's sleep, and suggest particularly low agreement between subjective and objective measures. The third article aimed to evaluate the unique contribution of attachment security to the prediction of child sleep. Sixty-two mother-child dyads were seen twice. Attachment security and dependency were assessed by observation using the Attachment Q-Sort when the child was 15 months old. At age 2, the children wore an actigraph for 3 consecutive days. The results indicate that attachment security makes a unique contribution to the prediction of nocturnal sleep duration and nocturnal sleep efficiency. This study thus suggests that the more securely attached children are to their mother, the longer and better their sleep is a few months later. The fourth article aimed to examine the relationship between sleep and externalizing behaviors. Sixty-four families participated in this study. At age 2, the children wore an actigraph for 72 consecutive hours and the parents completed the CBCL. When the children were 4 years old, the parents as well as the daycare educator completed the CBCL. The results show that child sleep is associated with concurrent externalizing behaviors and with their increase over time. Moreover, the relationships between sleep quality and externalizing behaviors were moderated by the child's sex, being significant only in boys. The results of the four articles are integrated in the general discussion.
Abstract:
This doctoral thesis focuses on the factors that influence the weather and climate over Peninsular India. The first chapter provides a general introduction to the climatic features of peninsular India and the various factors dealt with in subsequent chapters, such as solar forcing on climate, SST variability in the northern Indian Ocean and its influence on the Indian monsoon, the moisture content of the atmosphere and its importance in the climate system, the empirical formulation of regression forecasts of climate, and some aspects of regional climate modeling. Chapter 2 deals with the variability of the vertically integrated moisture (VIM) over Peninsular India on various time scales. The third chapter discusses the influence of solar activity on the low-frequency variability of Peninsular Indian rainfall. The study also investigates the influence of solar activity on the horizontal and vertical components of wind, and the difference in this forcing before and after the so-called regime shift in the climate system in the mid-1970s. In Chapter 4, on Peninsular Indian rainfall and its association with meteorological and oceanic parameters over the adjoining oceanic region, a linear regression model is developed and tested for the seasonal rainfall prediction of Peninsular India.
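The abstract does not specify the predictors entering the regression forecast; the sketch below is a minimal illustration of a seasonal rainfall regression of the kind described, with hypothetical predictors (an SST anomaly and a VIM anomaly) and synthetic data standing in for the thesis's observations.

```python
# Minimal regression-forecast sketch: seasonal rainfall regressed on a few
# hypothetical predictors. Synthetic data stands in for the thesis's records.
import numpy as np

rng = np.random.default_rng(0)
n_years = 40
sst_anom = rng.normal(0, 0.5, n_years)   # hypothetical SST anomaly (K)
vim_anom = rng.normal(0, 2.0, n_years)   # hypothetical VIM anomaly (kg/m^2)
rainfall = 850 + 30 * sst_anom + 12 * vim_anom + rng.normal(0, 25, n_years)

# Fit rainfall = b0 + b1*SST + b2*VIM by ordinary least squares.
X = np.column_stack([np.ones(n_years), sst_anom, vim_anom])
coef, *_ = np.linalg.lstsq(X, rainfall, rcond=None)

# Forecast for a new season from its observed predictor values.
forecast = coef @ [1.0, 0.3, -1.5]
print("coefficients:", np.round(coef, 2))
print(f"forecast rainfall: {forecast:.0f} mm")
```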
Abstract:
Data mining is one of the most active research areas today, with a wide variety of applications in everyday life. It is concerned with finding interesting hidden patterns in large historical databases. For example, from a sales database one can discover a pattern such as "people who buy magazines tend to buy newspapers as well"; from a sales point of view, the advantage is that these items can be placed together in the shop to increase sales. In this research work, data mining is applied to the domain of placement chance prediction, since making a wise career decision is crucial for anyone. In India, technical manpower analysis is carried out by the National Technical Manpower Information System (NTMIS), established in 1983-84 by India's Ministry of Education & Culture. The NTMIS comprises a lead centre at the IAMR, New Delhi, and 21 nodal centres located in different parts of the country. The Kerala State Nodal Centre is located at Cochin University of Science and Technology. The nodal centre collects placement information by regularly sending postal questionnaires to former students. From this raw data, a historical database was prepared. Each record in this database includes entrance rank range, reservation, sector, sex, and engineering branch. For each such combination of attributes, the corresponding placement chance is computed and stored in the database. From this data, various popular data mining models are built and tested; these models can be used to predict the most suitable branch for a new student matching one of the above combinations of criteria. A detailed performance comparison of the various data mining models is also carried out. This research work proposes to use a combination of data mining models, namely a hybrid stacking ensemble, for better predictions. Strategies to predict the overall absorption rate for various branches, as well as the time it takes for all the students of a particular branch to be placed, are also proposed. Finally, this research work puts forward a new data mining algorithm, C4.5*stat, for numeric data sets, which has been shown to achieve competitive accuracy on the standard UCI benchmark data sets, and it proposes an optimization strategy, parameter tuning, to improve the standard C4.5 algorithm. In summary, this research work covers all four dimensions of a typical data mining project: application to a domain, development of classifier models, optimization, and ensemble methods.
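The exact base learners and meta-learner of the hybrid stacking ensemble are not given in this abstract; the following is a minimal sketch of a stacking ensemble in the spirit described, assuming scikit-learn, synthetic data, and an illustrative choice of a decision tree (a C4.5-like learner), a random forest, and a logistic regression meta-learner.

```python
# Minimal stacking-ensemble sketch (scikit-learn). The learners below are
# illustrative assumptions, not the thesis's exact configuration.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=5)),    # C4.5-like learner
        ("forest", RandomForestClassifier(n_estimators=100)),
    ],
    final_estimator=LogisticRegression(),  # meta-learner combines base outputs
    cv=5,  # out-of-fold base predictions feed the meta-learner
)
stack.fit(X_train, y_train)
print("held-out accuracy:", stack.score(X_test, y_test))
```

The design point of stacking is that the meta-learner is trained on out-of-fold predictions of the base models, so it learns how to weight them without overfitting to their training-set behavior.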
Abstract:
This study examines the behavioral factors that influence Indian investors to invest in the real estate market. Among the various factors that affect investors' tendency to invest in this market, some influence investors to a great extent while others do so only minimally. The study reveals that motivation from real estate developers and brokers (mean value 3.46) is the most influential factor, while the occurrence of uncertain events (mean value 1.75) is the least influential factor in investors' investment behavior. Behavioral factors such as overconfidence, together with hypotheses regarding education and religion, were analyzed, and the religious factor was found to influence Indian investors' decisions to invest in real estate.
Abstract:
The cattle feed industry is a major segment of the animal feed industry. It is gradually evolving into an organized sector, and feed manufacturers are increasingly using modern and sophisticated methods that incorporate best global practices. The industry has high growth potential in India, given that the country is the world's leading producer of milk and that milk production is expected to grow at a compounded annual growth rate of 4 per cent. Besides, the concept of branded cattle feed as a packaged commodity is fast gaining popularity in rural India. Demand for cattle feed may also rise because of factors such as (i) the shrinkage of open land for cattle grazing, urbanization, and the resulting shortage of conventionally used cattle feeds, and (ii) the introduction of high-yield cattle, which require specialized feeds. Earlier research studies by the present authors have revealed the significant growth prospects of the branded cattle feed industry; the feed consumption pattern and the relatively high share of branded feeds; the consumption pattern by product type (such as pellet and mash); the composition of the cattle feed market and the relatively large shares of the Kerala Feeds Ltd. (KFL) and Kerala Solvent Extractions Ltd. (KSE) brands; and the major factors influencing purchasing decisions. As a continuation of these earlier studies, this study takes a closer look at the significance of product types in buyer behavior, the level of brand awareness and its implications for purchasing decisions, and brand-shifting behavior and its determinants.
Abstract:
Hat stiffened plates are used in composite ships and are gaining popularity in metallic ship construction due to their high strength-to-weight ratio. Lightweight structures result in greater payload, higher speeds, and reduced fuel consumption and environmental emissions. Numerical investigations have been carried out using the commercial finite element software ANSYS 12 to substantiate the high strength-to-weight ratio of hat stiffened plates over other open-section stiffeners commonly used in shipbuilding. The analysis of stiffened plates has always been a matter of concern for structural engineers, since it is rather difficult to quantify the actual load sharing between stiffeners and plating. The Finite Element Method is accepted as an efficient tool for the analysis of stiffened plated structures. The best results for thin plated structures are obtained when both the stiffeners and the plate are modeled using thin plate elements having six degrees of freedom per node. However, one serious problem with this design and analysis process is that generating finite element models for a complex configuration is time consuming and laborious. To overcome these difficulties, two different methods, viz. an Orthotropic Plate Model and a Superelement for the hat stiffened plate, are suggested in the present work. In the Orthotropic Plate Model, geometric orthotropy is converted to material orthotropy, i.e., the stiffeners are smeared so that they vanish from the field of analysis, and the structure can be analysed using any commercial finite element software with orthotropic elements in its element library. The Orthotropic Plate Model predicted deflection, stress, and linear buckling load with sufficiently good accuracy for the boundary condition with all four edges simply supported. For the boundary condition with two edges fixed and the other two simply supported, however, although the stress was predicted with good accuracy, there was a large variation in the predicted deflection. This variation arises because the rigidity of the Orthotropic Plate Model is uniform throughout the plate, whereas in the actual hat stiffened plate the rigidity along the line of attachment of the stiffeners is large compared with that of the unsupported portion of the plate. The Superelement technique treats a portion of the structure as if it were a single element even though it is made up of many individual elements. The Superelement predicted the deflection and in-plane stress of the hat stiffened plate with sufficiently good accuracy for different boundary conditions. A Superelement formulation for composite hat stiffened plates is also presented in the thesis. The capability of the Orthotropic Plate Model and the Superelement to handle typical boundary conditions and characteristic loads in a ship structure has been demonstrated through numerical investigations.
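The thesis's superelement formulation is not reproduced in this abstract, but the core of any superelement is static condensation: the internal degrees of freedom of a substructure are eliminated so that the rest of the model sees only a boundary stiffness. Below is a minimal sketch of that condensation step, using an illustrative 4-DOF toy stiffness matrix rather than an actual hat stiffened plate model.

```python
# Static condensation sketch: reduce a substructure's stiffness to its
# boundary DOFs. The partitioning (b = boundary, i = internal) and the 4x4
# example are illustrative assumptions, not the thesis's actual model.
import numpy as np

def condense(K, f, boundary, internal):
    """Condensed boundary stiffness and load:
    K* = K_bb - K_bi K_ii^{-1} K_ib,  f* = f_b - K_bi K_ii^{-1} f_i
    """
    K_bb = K[np.ix_(boundary, boundary)]
    K_bi = K[np.ix_(boundary, internal)]
    K_ib = K[np.ix_(internal, boundary)]
    K_ii_inv = np.linalg.inv(K[np.ix_(internal, internal)])
    K_star = K_bb - K_bi @ K_ii_inv @ K_ib
    f_star = f[boundary] - K_bi @ K_ii_inv @ f[internal]
    return K_star, f_star

# Toy 1D spring chain with 4 DOFs; condense out the two interior DOFs.
K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  2.]])
f = np.array([0., 1., 1., 0.])
K_star, f_star = condense(K, f, boundary=[0, 3], internal=[1, 2])
print(K_star)  # 2x2 condensed stiffness seen by the rest of the model
```

The condensed matrix can then be assembled into the global model exactly like an ordinary element stiffness, which is what makes the repeated hat-stiffener unit cheap to mesh only once.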
Abstract:
In China, the history of the private housing market is quite short: in less than two decades, the market has grown almost from scratch to playing an important role in the economy. This is a great achievement, but many problems also exist that need to be properly addressed and solved. The price problem (simply put, housing prices are too high) is one of them, and this paper focuses on it. Three basic questions are posed: (1) how can housing affordability be judged? (2) why are housing prices so high? and (3) how can the housing price problem be solved? The paper pays particular attention to answering the second question. Beyond the numerous news reports and surveys showing that most ordinary city dwellers complain about high housing prices, mathematical means, namely four affordability ratios, are applied to judge housing affordability in Shanghai and Shenzhen. The results make it very clear that the price problem is severe. Why? Something is wrong with the price mechanism. This research shows that five factors mainly contribute to the price problem: the housing reform, the housing development model, the unbalanced housing market, housing project financing, and poor governmental management. Finally, the paper puts forward five suggestions to solve the housing price problem in the first-hand private Chinese housing market: the establishment of a real estate information system, the creation of a specific price management department, government price regulation, a property tax, and the legalization of "cushion money".
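The four ratios are not named in this abstract; one widely used affordability measure is the price-to-income ratio, illustrated below with hypothetical figures (not data from the study).

```python
# Illustrative affordability check: the price-to-income ratio. All numbers
# are hypothetical, not data from the paper; a common rule of thumb treats
# ratios much above roughly 3-6 as a sign of unaffordability.
def price_to_income_ratio(median_home_price, median_annual_household_income):
    return median_home_price / median_annual_household_income

ratio = price_to_income_ratio(2_400_000, 180_000)  # hypothetical CNY figures
print(f"price-to-income ratio: {ratio:.1f}")
```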
Abstract:
In psycholinguistic research, the assumption is widespread that the evaluation of information with respect to its truth or plausibility (epistemic validation; Richter, Schroeder & Wöhrmann, 2009) is a strategic, optional process that follows comprehension (e.g., Gilbert, 1991; Gilbert, Krull & Malone, 1990; Gilbert, Tafarodi & Malone, 1993; Herbert & Kübler, 2011). However, a growing number of studies directly or indirectly challenge this two-step model of comprehension and validation. In particular, findings on Stroop-like stimulus-response compatibility effects, which occur when positive and negative responses must be given orthogonally to the task-irrelevant truth value of sentences (e.g., a positive response after reading a false sentence, or a negative response after reading a true sentence; the epistemic Stroop effect, Richter et al., 2009), suggest that readers perform a non-strategic check of the validity of information during comprehension itself. Building on these findings, the aim of this dissertation was a further examination of the assumption that comprehension involves a non-strategic, routinized, knowledge-based validation process (epistemic monitoring; Richter et al., 2009). To this end, three empirical studies with different emphases were conducted. Study 1 investigated whether evidence of epistemic monitoring can also be found for information that is not clearly true or false but merely more or less plausible. Using the epistemic Stroop paradigm of Richter et al. (2009), a compatibility effect of task-irrelevant plausibility on the latencies of positive and negative responses was demonstrated in two different experimental tasks, suggesting that epistemic monitoring is also sensitive to gradual differences in the fit between information and world knowledge. Furthermore, the results show that the epistemic Stroop effect is indeed based on plausibility and not on the differing predictability of plausible and implausible information. The aim of Study 2 was to test the hypothesis that epistemic monitoring does not require an evaluative mindset. In contrast to the findings of other authors (Wiswede, Koranyi, Müller, Langner, & Rothermund, 2013), this study showed a compatibility effect of task-irrelevant truth value on response latencies in a completely non-evaluative task. The results suggest that epistemic monitoring does not depend on an evaluative mindset, but possibly on the depth of processing. Study 3 examined the relationship between comprehension and validation by investigating the online effects of plausibility and predictability on eye movements during the reading of short texts. In addition, the potential modulation of these effects by epistemic markers that indicate the certainty of information (e.g., certainly or perhaps) was examined. Consistent with the assumption of a fast and non-strategic epistemic monitoring process, interactive effects of plausibility and the presence of epistemic markers emerged on indicators of early comprehension processes. This suggests that the communicated certainty of information is taken into account by the monitoring process. Overall, the findings argue against a conceptualization of comprehension and validation as non-overlapping stages of information processing. Rather, an evaluation of truth or plausibility based on world knowledge appears to be, at least to some extent, an obligatory and non-strategic component of language comprehension. The implications of these findings for current models of language comprehension and recommendations for further research on the relationship between comprehension and validation are discussed.
Abstract:
In the decades since Schumpeter's influential writings, economists have pursued research examining the role of innovation in certain industries at both the firm and the industry level. Researchers describe innovations as the main trigger of industry dynamics, while policy makers argue that research and education are directly linked to economic growth and welfare; research and education are thus an important objective of public policy. Firms and public research are regarded as the main actors in the creation of new knowledge, which is ultimately brought to the market through innovations, and policy makers support innovation. Both groups, policy makers and researchers, agree that innovation plays a central role, but researchers still neglect the role that public policy plays in the field of industrial dynamics. The main objective of this work is therefore to learn more about the interdependencies of innovation, policy, and public research in industrial dynamics. The overarching research question of this dissertation asks whether it is possible to analyze patterns of industry evolution (from evolution to co-evolution) based on empirical studies of the role of innovation, policy, and public research in industrial dynamics. This work starts with a hypothesis-based investigation of traditional approaches to industrial dynamics: testing a basic assumption of the core models of industrial dynamics and analyzing the evolutionary patterns, though using an industry driven by public policy as the example. It then moves to a more explorative approach, investigating co-evolutionary processes. The underlying research questions include the following: Do large firms have a size advantage attributable to cost spreading? Do firms that plan to grow have more innovations? What role does public policy play in the evolutionary patterns of an industry? Are the same evolutionary patterns observable as those described in the industry life cycle (ILC) theories? And is it possible to observe regional co-evolutionary processes of science, innovation, and industry evolution? Based on two different empirical contexts, the laser and the photovoltaic industries, this dissertation tries to answer these questions and combines an evolutionary approach with a co-evolutionary approach. The first chapter introduces the topic and the fields on which this dissertation builds. The second chapter provides a new test of the Cohen and Klepper (1996) model of cost spreading, which explains the relationship between innovation, firm size, and R&D, using the example of the photovoltaic industry in Germany. First, it analyzes whether the cost-spreading mechanism explains size advantages in this industry; this is related to the assumption that the incentives to invest in R&D increase with ex-ante output. It further investigates whether firms that plan to grow undertake more innovative activity. The results indicate that cost spreading explains size advantages in this industry and that growth plans lead to a higher level of innovative activity. Moreover, the role public policy plays in industry evolution has not been conclusively analyzed in the field of industrial dynamics. In the case of Germany, the introduction of demand-inducing policy instruments stimulated market and industry growth. While this policy immediately accelerated market volume, its effect on industry evolution is more ambiguous. Chapter three therefore analyzes this relationship using a model of industry evolution in which demand-inducing policies are discussed as a possible trigger of development. The findings suggest that these instruments can have the same effect as a technical advance in fostering the growth of an industry and its shakeout. The fourth chapter explores the regional co-evolution of firm population size, private-sector patenting, and public research in the empirical context of German laser research and manufacturing over more than 40 years, from the emergence of the industry to the mid-2000s. The qualitative and quantitative evidence suggests a co-evolutionary process of mutual interdependence rather than a unidirectional effect of public research on private-sector activities. Chapter five concludes with a summary, the contribution of this work, its implications, and an outlook on further research.
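The cost-spreading mechanism tested in chapter two can be illustrated with a toy calculation: if R&D spending r buys a per-unit cost reduction and the gain scales with output Q, the profit-maximizing R&D level rises with firm size. The functional form below (a square-root return to R&D) is an illustrative assumption, not Cohen and Klepper's exact specification.

```python
# Toy cost-spreading illustration: with per-unit cost reduction a*sqrt(r)
# from R&D spending r, a firm with output Q maximizes pi(r) = Q*a*sqrt(r) - r.
# The square-root form is an assumption for illustration only; the point is
# that optimal R&D (here r* = (Q*a/2)**2) rises with output.
import numpy as np

a = 0.1  # hypothetical productivity of R&D

def optimal_rd(Q, a=a):
    # First-order condition Q*a/(2*sqrt(r)) = 1  =>  r* = (Q*a/2)**2
    return (Q * a / 2) ** 2

for Q in [100, 1_000, 10_000]:  # hypothetical firm outputs
    r = optimal_rd(Q)
    profit = Q * a * np.sqrt(r) - r
    print(f"output {Q:>6}: optimal R&D {r:>12,.0f}, profit from R&D {profit:>12,.0f}")
```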
Abstract:
Impressive claims have been made for the performance of the SNoW algorithm on face detection tasks by Yang et al. [7]. In particular, by looking at both their results and those of Heisele et al. [3], one could infer that the SNoW system performed substantially better than an SVM-based system, even when the SVM used a polynomial kernel and the SNoW system used a particularly simplistic 'primitive' linear representation. We evaluated the two approaches in a controlled experiment, looking directly at performance on a simple, fixed-size test set, isolating out 'infrastructure' issues related to detecting faces at various scales in large images. We found that SNoW performed about as well as linear SVMs, and substantially worse than polynomial SVMs.
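As a rough illustration of the experimental design (one fixed test set, identical for both classifiers, with kernel choice as the only variable), here is a sketch using scikit-learn; the digits dataset and the degree-2 polynomial are assumptions standing in for the paper's face data and kernel settings, and SNoW itself is not reimplemented.

```python
# Controlled-comparison sketch: linear vs. polynomial SVM scored on one
# fixed test set. The digits dataset and degree-2 kernel are assumptions,
# not the paper's actual benchmark.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
# One fixed split so both classifiers face the identical test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for name, clf in [("linear SVM", SVC(kernel="linear")),
                  ("poly SVM", SVC(kernel="poly", degree=2))]:
    clf.fit(X_train, y_train)
    print(f"{name}: test accuracy = {clf.score(X_test, y_test):.3f}")
```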