64 results for Underwater foundations
Abstract:
This dissertation analyses the factors explaining the emergence of the Near-Death Experience (NDE) as a research topic in the disciplines of psychology and psychiatry in the United States. Through the study of a key actor, Russell Noyes, it demonstrates how the experience of near-death became relevant through the work of psychiatrists and psychologists who attributed a clinical and therapeutic quality to it. In order to retrace the academic foundations of research on death, the experience of dying, and near-death in psychiatry during the 1960s and 1970s, the dissertation applies a historical method that combines an empirical approach with a genetic analysis of narratives.
Abstract:
After 13 days of a weight-maintenance diet (13,720 +/- 620 kJ/day; 40% fat, 15% protein, and 45% carbohydrate), five young men (71.3 +/- 7.1 kg, 181 +/- 8 cm; means +/- SD) were overfed for 9 days at 1.6 times their maintenance requirements (i.e., +8,010 kJ/day). Twenty-four-hour energy expenditure (24-h EE) and basal metabolic rate (BMR) were measured on three occasions: after 10 days on the weight-maintenance diet and after 2 and 9 days of overfeeding. Physical activity was monitored throughout the study, body composition was measured by underwater weighing, and nitrogen balance was assessed for 3 days during each of the two experimental periods. Overfeeding caused an increase in body weight averaging 3.2 kg, of which 56% was fat as measured by underwater weighing. After 9 days of overfeeding, BMR had increased by 622 kJ/day, which could explain one-third of the increase in 24-h EE (2,038 kJ/day); the remainder was due to the thermic effect of food (which increased in proportion to the excess energy intake) and the increased cost of physical activity related to the body weight gain. This study shows that approximately one-quarter of the excess energy intake was dissipated through an increase in EE, with the remaining three-quarters being stored in the body. Under our experimental conditions of mixed overfeeding, in which body composition measurements were combined with those of energy balance, it was possible to account for all of the energy ingested in excess of maintenance requirements.
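The quantitative claims above can be checked from the figures given in the abstract alone. The short sketch below (Python) uses only the reported numbers and, as a simplification, treats the day-9 increase in 24-h EE as if it applied across all 9 days of overfeeding; it reproduces the one-third and one-quarter figures quoted above.

    # Minimal sketch using only the figures reported in the abstract.
    excess_intake = 8010      # kJ/day of overfeeding above maintenance
    days = 9
    delta_24h_ee = 2038       # kJ/day increase in 24-h energy expenditure after 9 days
    delta_bmr = 622           # kJ/day increase in basal metabolic rate after 9 days

    bmr_share = delta_bmr / delta_24h_ee          # ~0.31, about one-third of the 24-h EE rise
    dissipated = delta_24h_ee / excess_intake     # ~0.25, about one-quarter of the excess intake
    stored_kj = (excess_intake - delta_24h_ee) * days   # ~53,700 kJ stored, roughly 75% of the excess

    print(f"BMR share: {bmr_share:.2f}, dissipated fraction: {dissipated:.2f}, stored: {stored_kj} kJ")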
Abstract:
Structure of the thesis: This thesis consists of five sections. Section 1 starts with the problem definition and the presentation of the objectives of the thesis. Section 2 presents the theoretical foundations of venture financing and reviews the main theories developed on venture investing. It includes a taxonomy of contracting clauses relevant in venture contracting and the conflicts they address, and presents some general observations on contractual clauses. Section 3 presents the research findings on the analysis of a European VC's deal flow and investment screening in relation to prevailing market conditions. Section 4 focuses on an empirical study of a European VC's investment process and the criteria it uses to make its investments. It presents empirical findings on the investment criteria over time, across business cycles, and by investment type, and links these criteria to the VC's subsequent performance. Finally, Section 5 presents empirical research comparing the legal contracts signed between European and United States venture capitalists and the companies they finance. This research highlights some of the contracting practices in Europe and the United States.
Abstract:
Career interventions for adults frequently include personality assessment. Personality in career counseling contexts should no longer be considered as vocational personality associated with vocational interests but, rather, as a set of dispositions that has an impact on several vocational and career-related outcomes, such as work engagement, work satisfaction, and job performance. Although the relationship between personality and these vocational and career-related outcomes is not direct, it is likely mediated by several regulatory processes, such as work adaptability, and moderated by contextual and environmental factors. Personality assessment initiates an individual's self-regulatory process and contributes to the overall effectiveness of career interventions when feedback is individualized and stimulates a deconstruction, reconstruction, and co-construction of the vocational or multiple self-concept. Personality assessments can also promote the reconstruction of a self-concept more aligned with the environment's perception of the counselee's personality, strengthening the reality principle and allowing more rational and controlled choices. In addition, some specific personality profiles, such as high levels of neuroticism combined with low levels of conscientiousness, can be considered risk factors frequently leading to career decision-making difficulties. Moreover, people with low conscientiousness benefit less from career interventions, so special attention should be devoted to counselees with that characteristic. Two case studies are provided to illustrate these important aspects of personality assessment in career interventions.
Abstract:
BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences when used to classify subjects into a group who should be treated and a group who should not. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative measure of the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that decision curve analysis can be applied to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, with only a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
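For readers unfamiliar with the "net benefit", the sketch below shows the standard treated-side net benefit used in decision curve analysis, NB(pt) = TP/n - (FP/n) * pt/(1 - pt), together with the treat-all reference strategy. It is a minimal illustration only: it does not reproduce this paper's "overall net benefit" or its case-control adjustment, and the function and variable names are illustrative.

    import numpy as np

    def net_benefit(y_true, p_pred, thresholds):
        """Treated-side net benefit of the rule 'treat if p_pred >= pt'."""
        y_true = np.asarray(y_true)
        p_pred = np.asarray(p_pred)
        n = len(y_true)
        out = []
        for pt in thresholds:
            treat = p_pred >= pt
            tp = np.sum(treat & (y_true == 1))    # true positives
            fp = np.sum(treat & (y_true == 0))    # false positives
            out.append(tp / n - fp / n * pt / (1.0 - pt))
        return np.array(out)

    def net_benefit_treat_all(y_true, thresholds):
        """Reference strategy: treat everyone regardless of the model."""
        prev = np.mean(y_true)                    # observed prevalence
        return np.array([prev - (1.0 - prev) * pt / (1.0 - pt) for pt in thresholds])

Plotting net_benefit against the thresholds, together with the treat-all curve and the treat-none line at zero, yields the decision curve; the accuracy/utility contrast discussed in the abstract appears as models with a high Youden index that nevertheless sit close to, or below, these reference lines over the relevant threshold range.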
Abstract:
PURPOSE: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map indoor radon concentration (IRC) in Switzerland, taking into account architectural factors, spatial relationships between the measurements, and geological information. METHODS: We analyzed about 240,000 IRC measurements carried out in about 150,000 houses. As predictor variables we included building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature, and lithology in the kernel estimation models. We developed predictive maps as well as a map of the local probability of exceeding 300 Bq/m(3). Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. RESULTS: Our models were able to explain 28% of the variation in the IRC data. All variables added information to the model. The model estimation yielded a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimate. Furthermore, we assessed the mapping characteristics of the kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns already obtained in earlier work. At the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps: maps corresponding to detached houses with concrete foundations indicate systematically lower IRC than maps corresponding to farms with earth foundations. CONCLUSIONS: IRC mapping based on kernel estimation is a powerful tool to predict and analyze IRC at a large scale as well as at a local level. This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions while accounting for geological information and spatial relationships between IRC measurements.
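As an illustration of the kind of estimator described in the METHODS section, the sketch below implements a Nadaraya-Watson kernel regression with a product Gaussian kernel and one bandwidth per predictor, plus a kernel-weighted estimate of the probability of exceeding 300 Bq/m(3). It is a minimal sketch under assumed numeric predictors (e.g., coordinates, altitude, year of construction); the study's categorical variables (building type, foundation type, detector type, lithology) would in practice require categorical kernels, and the bandwidths would be estimated from the data rather than supplied by hand.

    import numpy as np

    def kernel_estimate(X, y, x_new, bandwidths, threshold=300.0):
        """Nadaraya-Watson estimate of IRC at x_new, plus the kernel-weighted
        probability that IRC exceeds `threshold` (Bq/m3). Predictors are
        assumed numeric; one bandwidth per predictor column."""
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        h = np.asarray(bandwidths, dtype=float)
        z = (X - np.asarray(x_new, dtype=float)) / h   # per-variable scaled distances
        w = np.exp(-0.5 * np.sum(z ** 2, axis=1))      # product Gaussian kernel weights
        w = w / w.sum()
        irc_hat = np.sum(w * y)                        # predicted IRC at x_new
        p_exceed = np.sum(w * (y > threshold))         # local probability of exceeding the threshold
        return irc_hat, p_exceed

Mapping then amounts to evaluating such an estimator on a grid of locations for a fixed combination of building characteristics, which is how building-specific maps such as "detached house with concrete foundations" could be produced.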
Abstract:
Independent regulatory agencies are the institutional foundations of the regulatory state that, during the past 15 years, has gained prominence throughout Europe. This article studies the rise of independent authorities in European countries by comparing regulatory agencies and central banks. Delegation to independent central banks and to independent regulatory agencies is similar in many respects. In both cases, agents are deliberately made independent from political principals through a specific institutional design. Moreover, it has been argued that delegation to both central banks and regulatory agencies is linked to the need for policy-makers to improve the credibility of policy commitments, to the wish of incumbent politicians to tie the hands of future majorities, and to the extent to which the institutional contexts safeguard policy stability. Through an analysis of the formal independence of central banks and regulatory agencies in Western Europe, this article identifies an empirical puzzle that casts doubt on the accuracy of current explanations. Veto players and the uncertainty of incumbent policy-makers with respect to their re-election prospects matter for delegation to both central banks and regulatory agencies, but in opposite ways. Making sense of these anomalies is necessary to achieve a better understanding of delegation to independent authorities.
Abstract:
OBJECTIVES: This study investigated the relationship between inter-arm coordination and the energy cost of locomotion in front crawl and breaststroke, and explored swimmers' flexibility in adapting their motor organization away from their preferred movement pattern. DESIGN: Nine front-crawlers performed three 300-m trials in front crawl and eight breaststrokers performed three 200-m trials in breaststroke at constant submaximal intensity with 5-min rests. Each trial was performed randomly in a different coordination pattern: freely chosen, 'maximal glide', or 'minimal glide'. Two underwater cameras videotaped frontal and side views to analyze speed, stroke rate, stroke length and inter-limb coordination. METHODS: In front crawl, inter-arm coordination was quantified by the index of coordination (IdC) and the number of leg kicks was counted. In breaststroke, four time gaps quantified the arm-to-leg coordination (i.e., the time between leg and arm propulsions, and the times between the beginning, 90° flexion and end of the arm and leg recoveries). The energy cost of locomotion was calculated from gas exchange and blood lactate concentration. RESULTS: In both front crawl and breaststroke, the freely chosen coordination corresponded to a glide pattern and showed the lowest energy cost (12.8 and 17.1 J kg(-1) m(-1), respectively). Both front-crawlers and breaststrokers were able to reach the 'maximal glide' condition (+35% and +28%, respectively), but the 'minimal glide' condition could not be reached in front crawl. CONCLUSIONS: The freely chosen pattern appeared more economical, presumably because it was the more trained one. When coordination was constrained, the swimmers showed greater coordination flexibility in breaststroke than in front crawl, suggesting that breaststroke coordination is easier to regulate by changing glide time.
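The index of coordination (IdC) mentioned in the METHODS section quantifies the lag between the propulsive phases of the two arms as a percentage of the stroke duration: negative values indicate a catch-up (glide) pattern, values near zero opposition, and positive values superposition. The sketch below is one common formulation under assumed input conventions (start and end times of each arm's propulsive phase within a single cycle); it is illustrative rather than the exact procedure used in this study.

    def index_of_coordination(prop_right, prop_left, stroke_duration):
        """IdC (%) for one stroke cycle, from the (start, end) times, in seconds,
        of the right- and left-arm propulsive phases within that cycle."""
        r_start, r_end = prop_right
        l_start, l_end = prop_left
        # overlap (positive) or gap (negative) between right-arm and left-arm propulsion
        idc_rl = (r_end - l_start) / stroke_duration * 100.0
        # same for the transition from the left arm back to the next cycle's right arm
        idc_lr = (l_end - (r_start + stroke_duration)) / stroke_duration * 100.0
        return (idc_rl + idc_lr) / 2.0

    # Example: right-arm propulsion 0.0-0.6 s, left-arm propulsion 0.7-1.3 s, 1.4-s cycle
    # -> IdC of about -7%, i.e. a catch-up (glide) coordination
    print(index_of_coordination((0.0, 0.6), (0.7, 1.3), 1.4))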
Abstract:
Heretofore, the issue of quality in forensic science has been approached through a quality management policy whose tenets are ruled by market forces. Despite some obvious advantages of standardization of methods, allowing interlaboratory comparisons and the implementation of databases, this approach suffers from a serious lack of consideration for forensic science as a science. A critical study of its principles and foundations, which constitute its culture, makes it possible to consider the matter of scientific quality in a new dimension. A better understanding of what pertains to forensic science ensures a better application and improves elementary actions within the investigative and intelligence processes as well as the judicial process. This focuses attention on the core of the subject matter: the physical remnants of criminal activity, namely the traces that produce information for understanding this activity. Adapting practices to the detection and recognition of relevant traces relies on an apprehension of the processes underlying forensic science tenets (Locard, Kirk, the relevancy issue) and a structured management of circumstantial information (direct/indirect information). This is influenced by forensic science education and training. However, the lack of homogeneity with regard to the scientific nature and culture of the discipline among forensic science practitioners and partners represents a real challenge. A sound and critical reconsideration of the forensic science practitioner's roles (investigator, evaluator, intelligence provider) and objectives (prevention, strategies, evidence provision) within the criminal justice system is a means to strengthen the understanding and the application of forensic science. Indeed, the whole philosophy is aimed at ensuring a high degree of excellence, namely a dedicated scientific quality.
Abstract:
This study investigated behavioral adaptability, which can be defined as a blend of stability and flexibility in limb movements and inter-limb coordination, when individuals received informational constraints. Seven expert breaststroke swimmers performed three 200-m trials in breaststroke at constant submaximal intensity. Each trial was performed randomly in a different coordination pattern: 'freely chosen', 'maximal glide', or 'minimal glide'. Two underwater and four aerial cameras enabled 3D movement analysis to assess elbow and knee angles, elbow-knee pair coordination, intra-cyclic velocity variations of the center of mass, stroke rate, stroke length, and inter-limb coordination. The energy cost of locomotion was calculated from gas exchange and blood lactate concentration. The results showed significantly longer glide, higher intra-cyclic velocity variations and higher energy cost under the 'maximal glide' condition than under the 'freely chosen' condition, as well as a greater reorganization of limb movement and inter-limb coordination (p<0.05). In the 'minimal glide' condition, the swimmers did not show significantly shorter glide or lower energy cost, but they exhibited significantly lower deceleration of the center of mass, as well as modified limb movement and inter-limb coordination (p<0.05). These results highlight that a variety of structural adaptations can functionally satisfy the task goal.
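Of the variables listed above, the intra-cyclic velocity variation of the center of mass is the simplest to make concrete. One common convention (not necessarily the exact index used in this study) is the coefficient of variation of the instantaneous horizontal velocity of the center of mass over one stroke cycle, as sketched below for an assumed, pre-computed velocity signal.

    import numpy as np

    def intracyclic_velocity_variation(v_cm):
        """Intra-cyclic velocity variation (%) of the center of mass, taken here
        as the coefficient of variation of the instantaneous horizontal velocity
        sampled over one stroke cycle."""
        v = np.asarray(v_cm, dtype=float)
        return float(np.std(v) / np.mean(v) * 100.0)

    # A signal that decelerates markedly during the glide and re-accelerates during
    # propulsion yields a higher value than a near-constant velocity signal.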