Abstract:
BACKGROUND: Only a few countries have cohorts enabling specific and up-to-date cardiovascular disease (CVD) risk estimation. Individual risk assessment based on study samples that differ too much from the target population could jeopardize the benefit of risk charts in general practice. Our aim was to provide up-to-date and valid CVD risk estimation for a Swiss population using a novel record linkage approach. METHODS: Anonymous record linkage was used to follow up (for mortality, until 2008) 9,853 men and women aged 25-74 years who participated in the Swiss MONICA (MONItoring of trends and determinants in CVD) study of 1983-92. The linkage success rate was 97.8%; loss to follow-up in 1990-2000 was 4.7%. Based on the ESC SCORE methodology (Weibull regression), we used age, sex, blood pressure, smoking, and cholesterol to generate three models. We compared (1) the original SCORE model with (2) a recalibrated and (3) a new model using the Brier score (BS) and cross-validation. RESULTS: Based on the cross-validated BS, the new model (BS = 14,107×10⁻⁶) was somewhat more appropriate for risk estimation than the original (BS = 14,190×10⁻⁶) and the recalibrated (BS = 14,172×10⁻⁶) model. Particularly at younger ages, the derived absolute risks were consistently lower than those from the original and the recalibrated model, which was mainly due to a smaller impact of total cholesterol. CONCLUSION: Record linkage of observational and routine data is an efficient procedure for obtaining valid and up-to-date CVD risk estimates for a specific population.
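The abstract reports cross-validated Brier scores but not the computation itself. As a rough illustration (not the study's code), the sketch below computes a K-fold cross-validated Brier score for a binary event, using a logistic regression as a stand-in for the Weibull survival model of the SCORE methodology; the data and names are invented.

```python
# Minimal sketch: cross-validated Brier score for a binary outcome (e.g. CVD death).
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LogisticRegression  # stand-in for the Weibull model

def cv_brier_score(X, y, n_splits=5, seed=0):
    """Average Brier score over K folds: mean of (predicted risk - outcome)^2."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    scores = []
    for train_idx, test_idx in kf.split(X):
        model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
        p = model.predict_proba(X[test_idx])[:, 1]
        scores.append(np.mean((p - y[test_idx]) ** 2))
    return float(np.mean(scores))

# Synthetic example: 5 invented covariates (e.g. age, sex, BP, smoking, cholesterol).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = rng.binomial(1, 0.1, size=500)
print(cv_brier_score(X, y))
```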
Abstract:
Homology modeling is the most commonly used technique to build a three-dimensional model for a protein sequence. It relies heavily on the quality of the sequence alignment between the protein to be modeled and related proteins with a known three-dimensional structure. Alignment quality can be assessed according to the physico-chemical properties of the three-dimensional models it produces. In this work, we introduce fifteen predictors designed to evaluate the properties of the models obtained for various alignments. They consist of an energy value obtained from different force fields (CHARMM, ProsaII or ANOLEA) computed on residues selected around misaligned regions. These predictors were evaluated on ten challenging test cases. For each target, all possible ungapped alignments were generated and their corresponding models computed and evaluated. The best predictor, retrieving the structural alignment for 9 out of 10 test cases, is based on the ANOLEA atomistic mean force potential and takes into account residues around misaligned secondary structure elements. The performance of the other predictors is significantly lower. This work shows that substantial improvement in local alignments can be obtained by careful assessment of the local structure of the resulting models.
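As an illustration of the alignment-scan idea only (the paper's predictors score 3D models with CHARMM, ProsaII or ANOLEA energies around misaligned regions, which is not reproduced here), the toy sketch below enumerates every ungapped placement of a target sequence on a template and keeps the lowest-scoring one under a stand-in energy.

```python
# Toy sketch: scan all ungapped alignments and keep the lowest "energy" one.
def ungapped_alignments(target: str, template: str):
    """Yield (offset, aligned index pairs) for every ungapped placement of target on template."""
    for offset in range(-len(target) + 1, len(template)):
        pairs = [(i, i + offset) for i in range(len(target))
                 if 0 <= i + offset < len(template)]
        if pairs:
            yield offset, pairs

def stand_in_energy(target, template, pairs):
    # Lower is better; a real predictor would score a 3D model built from the alignment.
    return -sum(target[i] == template[j] for i, j in pairs)

def best_ungapped_alignment(target, template):
    return min(ungapped_alignments(target, template),
               key=lambda op: stand_in_energy(target, template, op[1]))

print(best_ungapped_alignment("ACDEFG", "XXACDEFGYY"))  # offset 2 aligns the identical stretch
```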
Abstract:
It is generally known that welded structures subjected to fatigue loading fail precisely at the welded joints. Expert design of structures containing full-penetration welded joints and modern manufacturing methods have nearly eliminated fatigue failures in welded structures. Improving fatigue strength through a strict full-penetration requirement is, however, an uneconomical solution. The quality requirements set for full-penetration welded joints must define clear inspection guidelines and rejection criteria. The purpose of this thesis was to study the effect of geometric variables on the fatigue strength of load-carrying welded joints. Attention was focused mainly on design variables that influence the initiation of fatigue failures on the weld root side. Current codes and standards, which are based on experimental results, give rather general guidance on the fatigue design of welded joints. Therefore, entirely new parametric equations were derived for calculating the allowable nominal threshold stress range, Δσth, so that fatigue failures starting from the weld root side are avoided. In addition, weld-root fatigue classes (FAT) were calculated for each joint type and compared with the results obtained using existing design guidelines. Several three-dimensional (3D) analyses were performed as complementary references. Published experimental data were used to help understand the fatigue behaviour of welded joints and to determine material constants. The fatigue strength of load-carrying partial-penetration welded joints was determined using the finite element method. The maximum principal stress criterion was used to predict fracture behaviour. For the selected weld material and test conditions, fracture behaviour was modelled with the crack growth rate da/dN and the stress intensity factor range, ΔK. Numerical integration of the Paris equation was carried out with the FRANC2D/L computer program. Based on the results obtained, the FAT class can be calculated for the case under study. Δσth was calculated from the stress intensity factor range of the initial crack and the threshold stress intensity factor range, ΔKth; below ΔKth the crack does not grow. The analyses assumed an as-welded joint with a pre-existing initial crack at the weld root. The results of the analyses are useful for designers making decisions concerning the geometric parameters that influence the fatigue strength of welded joints.
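The thesis integrates Paris' crack-growth equation numerically with FRANC2D/L on the actual weld-root geometry; the sketch below shows only the generic form of such an integration and of the threshold stress range, with a constant geometry factor and purely illustrative values of C, m, Y and the crack sizes.

```python
# Generic Paris-law crack-growth integration (illustrative constants, not thesis data).
import numpy as np

def paris_life(delta_sigma, a_init, a_final, C=3e-13, m=3.0, Y=1.12, steps=10_000):
    """Cycles to grow a crack from a_init to a_final (metres) under a nominal
    stress range delta_sigma (MPa); dK = Y * delta_sigma * sqrt(pi * a) in MPa*m^0.5."""
    a = np.linspace(a_init, a_final, steps)
    dK = Y * delta_sigma * np.sqrt(np.pi * a)
    dN_da = 1.0 / (C * dK**m)                                        # cycles per metre of growth
    return float(np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a)))  # trapezoid rule

def threshold_stress_range(dK_th, a_init, Y=1.12):
    """Nominal stress range below which the initial crack does not grow (dK < dK_th)."""
    return dK_th / (Y * np.sqrt(np.pi * a_init))

print(f"{paris_life(200.0, 0.5e-3, 5e-3):.3e} cycles to grow the crack")
print(f"threshold stress range ~ {threshold_stress_range(5.0, 0.5e-3):.1f} MPa")
```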
Abstract:
BACKGROUND: The Spiritual Distress Assessment Tool (SDAT) is a 5-item instrument developed to assess unmet spiritual needs in hospitalized elderly patients and to determine the presence of spiritual distress. The objective of this study was to investigate the psychometric properties of the SDAT. METHODS: This cross-sectional study was performed in a Geriatric Rehabilitation Unit. Patients (N = 203), aged 65 years and over with a Mini Mental State Exam score ≥ 20, were consecutively enrolled over a 6-month period. Data on health, functional, cognitive, affective and spiritual status were collected upon admission. Interviews using the SDAT (score from 0 to 15, higher scores indicating higher distress) were conducted by a trained chaplain. Factor analysis, measures of internal consistency (inter-item and item-to-total correlations, Cronbach α), and reliability (intra-rater and inter-rater) were performed. Criterion-related validity was assessed using the Functional Assessment of Chronic Illness Therapy-Spiritual well-being (FACIT-Sp) and the question "Are you at peace?" as criterion standards. Concurrent and predictive validity were assessed using the Geriatric Depression Scale (GDS), occurrence of a family meeting, hospital length of stay (LOS) and destination at discharge. RESULTS: SDAT scores ranged from 1 to 11 (mean 5.6 ± 2.4). Overall, 65.0% (132/203) of the patients reported some spiritual distress on the SDAT total score and 22.2% (45/203) reported at least one severe unmet spiritual need. A two-factor solution explained 60% of the variance. Inter-item correlations ranged from 0.11 to 0.41 (eight out of ten with P < 0.05). Item-to-total correlations ranged from 0.57 to 0.66 (all P < 0.001). Cronbach α was acceptable (0.60). Intra-rater and inter-rater reliabilities were high (intraclass correlation coefficients ranging from 0.87 to 0.96). SDAT scores correlated significantly with the FACIT-Sp, "Are you at peace?", GDS (Rho -0.45, -0.33, and 0.43, respectively, all P < .001), and LOS (Rho 0.15, P = .03). Compared with patients showing no severe unmet spiritual need, patients with at least one severe unmet spiritual need had higher odds of occurrence of a family meeting (adjOR 4.7, 95% CI 1.4-16.3, P = .02) and were more often discharged to a nursing home (13.3% vs 3.8%; P = .027). CONCLUSIONS: The SDAT has acceptable psychometric properties and appears to be a valid and reliable instrument to assess spiritual distress in elderly hospitalized patients.
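For readers unfamiliar with the internal-consistency statistics quoted above, the sketch below computes Cronbach's α and corrected item-to-total correlations (item versus the sum of the remaining items, one common variant) on a made-up 203 × 5 item matrix; it does not use the SDAT data.

```python
# Illustrative internal-consistency statistics on invented item scores.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_subjects, n_items) matrix of item scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def item_total_correlations(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the total score of the remaining items."""
    out = []
    for j in range(items.shape[1]):
        rest = np.delete(items, j, axis=1).sum(axis=1)
        out.append(np.corrcoef(items[:, j], rest)[0, 1])
    return np.array(out)

rng = np.random.default_rng(1)
demo = rng.integers(0, 4, size=(203, 5))  # 203 patients x 5 items, scores 0-3 (random: alpha ~ 0)
print(round(cronbach_alpha(demo), 2), item_total_correlations(demo).round(2))
```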
Abstract:
This study aimed to develop a hip screening tool that combines relevant clinical risk factors (CRFs) and quantitative ultrasound (QUS) at the heel to determine the 10-yr probability of hip fracture in elderly women. The EPISEM database, comprising approximately 13,000 women 70 yr of age, was derived from two population-based white European cohorts in France and Switzerland. All women had baseline data on CRFs and a baseline measurement of the stiffness index (SI) derived from QUS at the heel. Women were followed prospectively to identify incident fractures. Multivariate analysis was performed to determine the CRFs that contributed significantly to hip fracture risk, and these were used to generate a CRF score. Gradients of risk (GR; RR/SD change) and areas under receiver operating characteristic curves (AUC) were calculated for the CRF score, the SI, and a score combining both. The 10-yr probability of hip fracture was computed for the combined model. Three hundred seven hip fractures were observed over a mean follow-up of 3.2 yr. In addition to SI, significant CRFs for hip fracture were body mass index (BMI), history of fracture, an impaired chair test, history of a recent fall, current cigarette smoking, and diabetes mellitus. The average GR for hip fracture was 2.10 per SD with the combined SI + CRF score, compared with a GR of 1.77 with SI alone and 1.52 with the CRF score alone. Thus, the use of CRFs enhanced the predictive value of SI alone. For example, in a woman 80 yr of age, the presence of two to four CRFs increased the probability of hip fracture from 16.9% to 26.6% and from 52.6% to 70.5% for SI Z-scores of +2 and -3, respectively. The combined use of CRFs and QUS SI is a promising tool to assess hip fracture probability in elderly women, especially when access to DXA is limited.
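As a hedged illustration of the comparison reported (gradient of risk per SD and AUC for SI alone, the CRF score alone, and the combined score), the sketch below evaluates three scores on synthetic data; the CRF coding and the combination rule are assumptions, not the EPISEM model.

```python
# Synthetic comparison of GR per SD and AUC for three candidate risk scores.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 5000
si_z = rng.normal(size=n)                    # QUS stiffness-index Z-score
crf_count = rng.integers(0, 5, size=n)       # number of clinical risk factors present
logit = -4.0 - 0.6 * si_z + 0.4 * crf_count  # invented "true" model
fracture = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def gr_and_auc(score, outcome):
    """Gradient of risk (odds ratio per SD of the score) and AUC."""
    z = (score - score.mean()) / score.std()
    beta = LogisticRegression().fit(z.reshape(-1, 1), outcome).coef_[0, 0]
    return np.exp(beta), roc_auc_score(outcome, score)

for name, score in [("SI alone", -si_z), ("CRF alone", crf_count),
                    ("combined", crf_count - si_z)]:
    gr, auc = gr_and_auc(score, fracture)
    print(f"{name:10s}  GR/SD = {gr:.2f}   AUC = {auc:.2f}")
```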
Abstract:
Static process simulation has traditionally been used to model complex processes for various purposes. However, using static process simulators to prepare holistic examinations aimed at improving profit-making capability requires a lot of work, because producing the results requires assessing the applicability of detailed data that may be irrelevant to the objective. The data relevant to the total assessment get buried by irrelevant data. Furthermore, the models do not include an examination of maintenance or risk management, and the economic examination is often an extra property added to them that can be performed with a spreadsheet program. In this work, a process model applicable to holistic economic examinations has been developed. The model is based on the life cycle profit philosophy developed by Hagberg and Henriksson in 1996. The construction of the model utilized life cycle assessment and life cycle costing methodologies with a view to developing, above all, a model that would be applicable to the economic examination of complete wholes and that would focus the need for information on aspects essential to the objectives. Life cycle assessment and life cycle costing differ from each other in their modeling principles, but features of both methodologies can be used in the development of economic process modeling. Methods applicable to the modeling of complex processes can be examined from the viewpoint of life cycle methodologies, because they involve the collection and management of large bodies of information as well as the production of information for the needs of decision-makers. The results of the study show that, on the basis of the principles of life cycle modeling, a process model can be created that can be used to produce holistic efficiency examinations of the profit-making capability of a production line with fewer resources than with traditional methods. The calculations of the model are based as far as possible on the information system of the factory, which means that the accuracy of the results can be improved by developing the information systems so that they provide the best possible information for this kind of examination.
Abstract:
One aim of this study is to determine the impact of water velocity on the uptake of indicator polychlorinated biphenyls (iPCBs) by silicone rubber (SR) and low-density polyethylene (LDPE) passive samplers. A second aim is to assess the efficiency of performance reference compounds (PRCs) in correcting for the impact of water velocity. SR and LDPE samplers were spiked with 11 or 12 PRCs and exposed for 6 weeks to four different velocities (in the range of 1.6 to 37.7 cm s−1) in river-like flow conditions, using a channel system supplied with river water. A relationship between velocity and uptake was found for each iPCB, which makes it possible to determine the expected changes in uptake due to velocity variations. For both samplers, velocity increases from 2 to 10 cm s−1, 30 cm s−1 (interpolated data) and 100 cm s−1 (extrapolated data) lead to increases in uptake that do not exceed a factor of 2, 3 and 4.5, respectively. Results also showed that the influence of velocity decreased with increasing octanol-water partition coefficient (log Kow) of the iPCBs when SR is used, whereas the opposite effect was observed for LDPE. Time-weighted average (TWA) concentrations of iPCBs in water were calculated from iPCB uptake and PRC release. These calculations were performed using either a single PRC or all the PRCs. The efficiency of PRCs in correcting for the impact of velocity was assessed by comparing the TWA concentrations obtained at the four tested velocities. For SR, good agreement was found among the four TWA concentrations with both methods (average RSD < 10%). For LDPE as well, PRCs offered a good correction of the impact of water velocity (average RSD of about 10 to 20%). These results contribute to the process of acceptance of passive sampling in routine regulatory monitoring programs.
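The TWA calculation itself is not detailed in the abstract. One simplified single-PRC version, assuming first-order exchange and that the analyte shares the exchange kinetics of its matching PRC, is sketched below with invented Ksw, mass and uptake values; the study's all-PRC (least-squares) variant is not shown.

```python
# Simplified single-PRC route to a TWA water concentration (illustrative values only).
import math

def exchange_rate_from_prc(f_retained: float, t_days: float) -> float:
    """ke (1/day) from the fraction of a PRC remaining after exposure time t."""
    return -math.log(f_retained) / t_days

def twa_concentration(n_absorbed_ng, ksw_l_per_kg, sampler_mass_kg, ke_per_day, t_days):
    """TWA water concentration (ng/L) under a first-order exchange model,
    assuming the analyte follows the same exchange kinetics as the matching PRC."""
    return n_absorbed_ng / (ksw_l_per_kg * sampler_mass_kg *
                            (1.0 - math.exp(-ke_per_day * t_days)))

ke = exchange_rate_from_prc(f_retained=0.35, t_days=42)  # 6-week exposure
print(twa_concentration(n_absorbed_ng=120, ksw_l_per_kg=10**5.5,
                        sampler_mass_kg=0.003, ke_per_day=ke, t_days=42))
```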
Abstract:
BACKGROUND: In this study, we aimed to assess Inflammatory Bowel Disease (IBD) patients' needs and current nursing practice in order to investigate to what extent the consensus statements (European Crohn's and Colitis Organization) on the nursing roles in caring for patients with IBD concur with local practice. METHODS: We used a mixed-method convergent design to combine quantitative data prospectively collected in the Swiss IBD cohort study and qualitative data from structured interviews with IBD healthcare experts. Symptoms, quality of life, and anxiety and depression scores were retrieved from physician charts and patient self-reported questionnaires. Descriptive analyses were performed on the quantitative and qualitative data. RESULTS: 230 patients from a single center were included; 60% of patients were male, and the median age was 40 (range 18-85). The prevalence of abdominal pain was 42%. Self-reported data were obtained from 75 of the 230 patients. General health was perceived as significantly lower than in the general population (p < 0.001). The prevalence of tiredness was 73%; sleep problems, 78%; issues related to work, 20%; sexual constraints, 35%; diarrhea, 67%; being afraid of not finding a bathroom, 42%; depression, 11%; and anxiety symptoms, 23%. According to the experts' interviews, the consensus statements are mostly relevant, but many recommendations are not yet implemented in clinical practice. CONCLUSION: The identified prevalences may help clinicians detect patients at risk and improve patient management. © 2015 S. Karger AG, Basel.
Abstract:
BACKGROUND: Pain assessment in mechanically ventilated patients is challenging, because nurses need to decode pain behaviour, interpret pain scores, and make appropriate decisions. This clinical reasoning process is inherent to advanced nursing practice but is poorly understood. A better understanding of this process could contribute to improved pain assessment and management. OBJECTIVE: This study aimed to describe the indicators that influence expert nurses' clinical reasoning when assessing pain in critically ill nonverbal patients. METHODS: This descriptive observational study was conducted in the adult intensive care unit (ICU) of a tertiary referral hospital in Western Switzerland. A purposive sample of expert nurses caring for nonverbal ventilated patients who received sedation and analgesia was invited to participate in the study. Data were collected in "real life" using recorded think-aloud sessions combined with direct non-participant observation and brief interviews. Data were analysed using deductive and inductive content analysis based on a theoretical framework related to clinical reasoning and pain. RESULTS: Seven expert nurses with an average of 7.85 (±3.1) years of critical care experience participated in the study. The patients had respiratory distress (n=2), cardiac arrest (n=2), sub-arachnoid bleeding (n=1), and multi-trauma (n=2). A total of 1344 quotes in five categories were identified. Patients' physiological stability was the principal indicator for decision-making in relation to pain management. Results also showed that discriminating situations requiring sedation from situations requiring analgesia is a permanent challenge for nurses. Expert nurses mainly used working knowledge and patterns to anticipate and prevent pain. CONCLUSIONS: The patient's clinical condition is important for making decisions about pain in critically ill nonverbal patients. Pain cannot be assessed in isolation, and its assessment should take the patient's clinical stability and sedation into account. Further research is warranted to confirm these results.
Abstract:
Clinical experience and experimental data suggest that intradialytic hemodynamic profiles could be influenced by the characteristics of the dialysis membranes. Even within the worldwide used polysulfone family, intolerance to specific membranes has occasionally been evoked. The aim of this study was to compare the hemodynamic behaviour of some of the polysulfone dialyzers commonly used in Switzerland. We performed an open-label, randomized, cross-over trial including 25 hemodialysis patients. Four polysulfone dialyzers, A (Revaclear high-flux, Gambro, Stockholm, Sweden), B (Helixone high-flux, Fresenius), C (Xevonta high-flux, BBraun, Melsungen, Germany), and D (Helixone low-flux, Fresenius, Bad Homburg vor der Höhe, Germany), were compared. The hemodynamic profile was assessed and patients were asked to provide tolerance feedback. The mean score (±SD) subjectively assigned to dialysis quality on a 1-10 scale was A 8.4 ± 1.3, B 8.6 ± 1.3, C 8.5 ± 1.6, and D 8.5 ± 1.5. Kt/V was A 1.58 ± 0.30, B 1.67 ± 0.33, C 1.62 ± 0.32, and D 1.45 ± 0.31. The low-flux membrane, compared with the high-flux membranes, was associated with higher systolic (128.1 ± 13.1 vs. 125.6 ± 12.1 mmHg, P < 0.01) and diastolic (76.8 ± 8.7 vs. 75.3 ± 9.0 mmHg; P < 0.05) pressures, higher peripheral resistance (1.44 ± 0.19 vs. 1.40 ± 0.18 s × mmHg/mL; P < 0.05), and lower cardiac output (3.76 ± 0.62 vs. 3.82 ± 0.59 L/min; P < 0.05). Hypotension events (decrease in systolic blood pressure by >20 mmHg) numbered 70 with A, 87 with B, 73 with C, and 75 with D (P < 0.01 B vs. A, 0.05 B vs. C, and 0.07 B vs. D). The low-flux membrane was thus associated with higher blood pressure levels than the high-flux ones. The Helixone high-flux membrane ensured the best efficiency; however, the very same dialyzer was associated with a higher incidence of hypotensive episodes.
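The abstract gives Kt/V per dialyzer but not how it was derived. The widely used second-generation Daugirdas formula below is one common single-pool estimate; the input values in the example are invented, not study data.

```python
# Single-pool Kt/V, second-generation Daugirdas formula (illustrative inputs).
import math

def sp_ktv(pre_bun, post_bun, session_hours, uf_litres, post_weight_kg):
    """spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W, with R = post/pre BUN ratio."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * session_hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

print(round(sp_ktv(pre_bun=25.0, post_bun=7.5, session_hours=4.0,
                   uf_litres=2.5, post_weight_kg=70.0), 2))  # ~1.4, in the range reported above
```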
Abstract:
INTRODUCTION: To describe patients admitted to a geriatric institution providing short-term hospitalizations in the context of ambulatory care in the canton of Geneva (Switzerland), and to measure the performance of this structure in terms of quality of care and costs. METHODS: Data related to the clinical, functioning and participation profiles of the first 100 patients were collected. Data related to effects (readmissions, deaths, satisfaction, complications), services and resources were also documented over an 8-month period to measure various quality and cost indicators. Observed values were systematically compared to expected values, adjusted for case mix. RESULTS: Explicit criteria were proposed to focus on suitable patients, excluding situations in which other structures were considered to be more appropriate. The specificity of this intermediate structure was to immediately organize, upon discharge, outpatient services at home. The low rate of potentially avoidable readmissions, the high patient satisfaction scores, the absence of premature death and the low number of iatrogenic complications suggest that the medical and nursing care delivered was of good quality. The cost was significantly lower than expected, after adjusting for case mix. CONCLUSION: The pilot experience showed that a short-stay hospitalization unit is feasible under acceptable safety conditions. The attending physician's knowledge of the patients allowed this system to focus on essential issues without proposing inappropriate services.
Abstract:
In recent years, there has been increased attention to the composition of feeding fats. In the aftermath of the BSE crisis, all animal by-products used in animal nutrition have been subjected to close scrutiny. Regulation requires that the material belong to the category of animal by-products fit for human consumption. This implies the use of reliable techniques to ensure the safety of products. The feasibility of using rapid and non-destructive methods to control the animal-fat composition of feedstuffs was studied. Fourier transform Raman spectroscopy was chosen because it provides detailed structural information. Data were treated using chemometric methods such as PCA and PLS-DA, which allowed the different classes of animal fats to be well separated. The same methodology was applied to fats from various types of feedstock and production technology processes. The PLS-DA model for discriminating animal fats from the other categories has a sensitivity and a specificity of 0.958 and 0.914, respectively. These results encourage the use of FT-Raman spectroscopy to discriminate animal fats.
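As a sketch of a PLS-DA discrimination with the sensitivity/specificity readout quoted above, the code below thresholds a PLS regression on synthetic "spectra"; the real work used FT-Raman spectra of feeding fats and a more elaborate PCA/PLS-DA workflow.

```python
# PLS-DA style two-class discrimination on synthetic spectra, with sensitivity/specificity.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, n_wavenumbers = 400, 600
y = rng.integers(0, 2, size=n)                                # 1 = animal fat, 0 = other fat
X = rng.normal(size=(n, n_wavenumbers)) + y[:, None] * 0.15   # weak, spread-out class signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = (pls.predict(X_te).ravel() > 0.5).astype(int)          # PLS-DA: threshold the regression output

tp = np.sum((pred == 1) & (y_te == 1)); fn = np.sum((pred == 0) & (y_te == 1))
tn = np.sum((pred == 0) & (y_te == 0)); fp = np.sum((pred == 1) & (y_te == 0))
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```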
Abstract:
BACKGROUND: For free-breathing cardiovascular magnetic resonance (CMR), the self-navigation technique has recently emerged and is expected to deliver high-quality data with a high success rate. The purpose of this study was to test the hypothesis that self-navigated 3D-CMR enables the reliable assessment of cardiovascular anatomy in patients with congenital heart disease (CHD) and to define factors that affect image quality. METHODS: CHD patients ≥2 years old and referred for CMR for initial assessment or for a follow-up study were included and underwent free-breathing self-navigated 3D CMR at 1.5T. Performance criteria were: correct description of cardiac segmental anatomy, overall image quality, coronary artery visibility, and reproducibility of great-vessel diameter measurements. Factors associated with insufficient image quality were identified using multivariate logistic regression. RESULTS: Self-navigated CMR was performed in 105 patients (55% male, 23 ± 12 y). Correct segmental description was achieved in 93% and 96% of cases for observers 1 and 2, respectively. Diagnostic quality was obtained in 90% of examinations, increasing to 94% if contrast-enhanced. The left anterior descending, circumflex, and right coronary arteries were visualized in 93%, 87% and 98% of cases, respectively. Younger age, higher heart rate, lower ejection fraction, and lack of contrast medium were independently associated with reduced image quality. However, a similar rate of diagnostic image quality was obtained in children and adults. CONCLUSION: In patients with CHD, self-navigated free-breathing CMR provides high-resolution 3D visualization of the heart and great vessels with excellent robustness.
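The factors associated with insufficient image quality were identified by multivariate logistic regression. A hedged sketch of that kind of analysis on synthetic patient-level covariates, reporting adjusted odds ratios per SD, is given below; the variable coding is assumed, not taken from the study.

```python
# Multivariate logistic regression of "insufficient image quality" on synthetic covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 105
age = rng.uniform(2, 60, n); hr = rng.normal(80, 15, n)
ef = rng.normal(55, 10, n); contrast = rng.integers(0, 2, n)
logit = -2 + 0.04 * (hr - 80) - 0.03 * (age - 25) - 0.05 * (ef - 55) - 0.8 * contrast
poor_quality = rng.binomial(1, 1 / (1 + np.exp(-logit)))      # invented outcome

X = np.column_stack([age, hr, ef, contrast])
Xz = StandardScaler().fit_transform(X)                        # odds ratios per SD of each factor
model = LogisticRegression().fit(Xz, poor_quality)
for name, beta in zip(["age", "heart rate", "EF", "contrast"], model.coef_[0]):
    print(f"{name:10s} adjusted OR per SD = {np.exp(beta):.2f}")
```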
Abstract:
It is axiomatic that our planet is extensively inhabited by diverse micro-organisms such as bacteria, yet the absolute diversity of bacterial species is widely held to be unknown. Different bacteria can be found from the depths of the oceans to the tops of mountains; even the air is more or less colonized by bacteria. Most bacteria are either harmless or even advantageous to human beings, but there are also bacteria that can cause severe infectious diseases or spoil supplies intended for human consumption. Therefore, it is vitally important not only to be able to detect and enumerate bacteria but also to assess their viability and possible harmfulness. Whilst the growth of bacteria is remarkably fast under optimum conditions and easy to detect by culture-based methods, most bacteria are believed to lie in the stationary phase of growth, in which actual growth has ceased and bacteria may thus be undetectable by culture-based techniques. Additionally, several injurious factors such as low or high temperature or nutrient deficiency can turn bacteria into a viable but non-culturable (VBNC) state that cannot be detected by culture-based methods. Consequently, various non-culture-based techniques developed for assessing bacterial viability and killing have been widely exploited in modern microbiology. However, only a few methods are suitable for kinetic measurements, which enable the real-time detection of bacterial growth and viability. The present study describes alternative methods for measuring bacterial viability and killing, as well as for detecting the effects of various antimicrobial agents on bacteria, in real time. The suitability of bacterial (lux) and beetle (luc) luciferases as well as green fluorescent protein (GFP) to act as markers of bacterial viability and cell growth was tested. In particular, a multiparameter microplate assay based on a GFP-luciferase combination and a flow cytometric measurement based on a GFP-PI combination were developed to perform divergent viability analyses. The results obtained suggest that the antimicrobial activities of various drugs against bacteria can be successfully measured using both of these methods. Specifically, the reliability of the flow cytometric viability analysis was notably improved when GFP was utilized in the assay. The fluoro-luminometric microplate assay enabled kinetic measurements, which significantly improved and accelerated the assessment of bacterial viability compared with more conventional viability assays such as plate counting. Moreover, the multiparameter assay made the simultaneous detection of GFP fluorescence and luciferase bioluminescence possible and provided extensive information about multiple cellular parameters in a single assay, thereby increasing the accuracy of the assessment of the kinetics of antimicrobial activities on target bacteria.
Abstract:
Characterizing geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Indeed, detailed 3D data, such as LiDAR point clouds, make it possible to study accurately the hazard processes and the structure of geological features, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential to be applied to a wide range of geological investigations, both in research and in applied geology projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral/hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies as well as failure mechanisms and stability conditions by integrating detailed remote data. During the past ten years, several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain residential settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even when inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to increase the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability. Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute the failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale. Similar procedures were already available to evaluate the susceptibility to rockfalls based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The results of the computation of the most probable rockfall source areas in the granitic cliffs of Yosemite Valley and of the Mont-Blanc massif were then compared with the inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly intense rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its relevant rockfall activity and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods.
Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling. In particular, I conducted a back-analysis of the large rockfall event of 2005 (265,000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique to obtain vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of the unstable rock compartments. Finally, the following points summarize the main outputs of my research: The new model to compute failure mechanisms and rockfall susceptibility with 3D point clouds makes it possible to accurately define the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed measurements of the fractures into geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geological structures. The integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
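The model computing failure mechanisms directly on terrestrial point clouds is the core technical contribution described above. A much-simplified sketch of one ingredient, a kinematic planar-sliding test applied point by point to LiDAR normals, is given below; the joint-set orientation, friction angle and tolerance are illustrative, and the thesis model also handles wedge and toppling mechanisms on the measured fracture sets.

```python
# Point-wise kinematic planar-sliding test on LiDAR-derived surface normals (toy values).
import numpy as np

def slope_dip_and_aspect(normals: np.ndarray):
    """Dip and dip direction (degrees) of the local cliff surface from unit normals (n, 3),
    with x = east, y = north, z = up and dip direction measured clockwise from north."""
    nz = np.clip(np.abs(normals[:, 2]), 0.0, 1.0)
    dip = np.degrees(np.arccos(nz))
    aspect = np.degrees(np.arctan2(normals[:, 0], normals[:, 1])) % 360.0
    return dip, aspect

def planar_sliding_feasible(normals, set_dip, set_dipdir, friction_deg=30.0, tol_deg=20.0):
    """Boolean mask of points where sliding on the given joint set is kinematically possible."""
    slope_dip, slope_aspect = slope_dip_and_aspect(normals)
    daylights = set_dip < slope_dip                        # joint dips less steeply than the face
    frictional = set_dip > friction_deg                    # and more steeply than the friction angle
    ang = np.abs((set_dipdir - slope_aspect + 180.0) % 360.0 - 180.0)
    aligned = ang < tol_deg                                # joint dips roughly out of the face
    return daylights & frictional & aligned

normals = np.array([[0.6, 0.2, 0.77], [0.9, 0.1, 0.42], [0.0, 0.0, 1.0]])
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
print(planar_sliding_feasible(normals, set_dip=45.0, set_dipdir=70.0))  # only the steep face qualifies
```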