67 results for intellectual capital measurement benchmark IT IC indicator

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

BACKGROUND: The objective measurement of dominant/nondominant arm use proportion in daily life may provide relevant information on healthy and pathologic arm behavior. This prospective case-control study explored the potential of such measurements as indicators of upper limb functional recovery after rotator cuff surgery. METHODS: Data on dominant/nondominant arm usage were acquired with body-worn sensors for 7 hours. The postsurgical arm usage of 21 patients was collected at 3, 6, and 12 months after rotator cuff surgery in the sitting, walking, and standing postures and compared with a reference established with 41 healthy subjects. The results were calculated for the dominant and nondominant surgical side subgroups at all stages. The correlations with clinical scores were calculated. RESULTS: Healthy right-handed and left-handed dominant arm usage was 60.2% (±6.3%) and 53.4% (±6.6%), respectively. Differences in use of the dominant side were significant between the right- and left-handed subgroups for sitting (P = .014) and standing (P = .009) but not for walking (P = .328). The patient group showed a significant underuse of 10.7% (±8.9%) at 3 months after surgery (P < .001). The patients recovered normal arm usage within 12 months, regardless of surgical side. The arm underuse measurement was weakly related to function and pain scores. CONCLUSION: This study provided new information on arm recovery after rotator cuff surgery using an innovative measurement method. It highlighted that objective arm underuse measurement is a valuable indicator of upper limb postsurgical outcome that captures a complementary feature to clinical scores.
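The study's central quantity, the proportion of dominant-arm use, reduces to a simple ratio over the epochs in which each arm is classified as active. A minimal sketch, where the function and the epoch-count representation are invented for illustration (the abstract does not describe the actual sensor signal processing):

```python
def dominant_arm_use(dominant_active, nondominant_active):
    """Percentage of active epochs attributed to the dominant arm.

    dominant_active, nondominant_active: counts of epochs in which each
    arm was classified as moving by the body-worn sensors (hypothetical
    representation; the study's real epoch definition is not given here).
    """
    total = dominant_active + nondominant_active
    if total == 0:
        raise ValueError("no active epochs recorded")
    return 100.0 * dominant_active / total

# A healthy right-handed profile close to the reported 60.2% mean:
print(round(dominant_arm_use(602, 398), 1))  # 60.2
```

The reported 10.7% postsurgical underuse would then be the difference between a patient's ratio and the healthy reference for the same handedness.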

Relevance: 100.00%

Abstract:

The globalization of markets, changes in the economic context, and the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is now widely accepted that both play a particularly strategic role in the organization. A company wishing to manage these assets faces several difficulties: a long capitalization process is required, involving steps such as the identification, extraction and representation of knowledge and competences. Several knowledge management methods exist for this purpose (MASK, CommonKADS, KOD, among others), but they are cumbersome to implement, restricted to certain types of knowledge, and consequently limited in the functionality they can offer. Moreover, competence management and knowledge management are treated as separate domains, even though it would be worthwhile to unify the two approaches: competences are very close to knowledge, as the following definition underlines: "a set of knowledge in action in a given context."
We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial forms of corporate knowledge, in particular for avoiding the loss of know-how and for anticipating the company's future needs, since behind employees' competences lies the efficiency of the organization. In addition, many other organizational concepts (jobs, missions, projects, training programs, and so on) can be described in terms of competences. Unfortunately, there is no real consensus on the definition of competence, and the existing definitions, however satisfactory to experts, do not lend themselves to an operational system. In our approach, we address competence management with a knowledge management method: since knowledge and competence are intimately linked by nature, such a method is well suited to managing competences. To exploit this knowledge and these competences, we first defined the organizational concepts in a clear and computational form. On this basis, we propose a methodology for building the company's various repositories (competences, missions, jobs, and so on). We model these repositories with ontologies, which yield coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training programs, missions, jobs, and so on) onto these ontologies so that it can be exploited and disseminated. This approach to knowledge and competence management led to a tool offering numerous functions, such as the management of mobility areas, strategic analysis, directories and CV management.
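The "mobility area" functionality mentioned at the end of the abstract can be pictured as a query over a competence repository. Everything below (job names, competences, and the gap-based matching rule) is a hypothetical illustration, not the thesis's actual model:

```python
# Hypothetical competence repository: each job is described by the set of
# competences it requires. A "mobility area" query returns the jobs an
# employee could move to, given their current competences.
job_competences = {
    "data analyst": {"statistics", "sql", "reporting"},
    "ml engineer":  {"statistics", "python", "ml"},
    "bi developer": {"sql", "reporting", "etl"},
}

def mobility_area(employee_competences, max_gap=1):
    """Jobs whose requirements the employee misses by at most `max_gap`
    competences (the gap threshold is an invented matching rule)."""
    return sorted(
        job for job, required in job_competences.items()
        if len(required - employee_competences) <= max_gap
    )

print(mobility_area({"statistics", "sql", "reporting"}))
# ['bi developer', 'data analyst']
```

An ontology-backed version would additionally recognize that, say, two differently named competences denote the same concept, which is what the thesis's linguistic-diversity support refers to.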

Relevance: 100.00%

Abstract:

The choice to adopt risk-sensitive measurement approaches for operational risks: the case of the Advanced Measurement Approach under the Basel II New Capital Accord. This paper investigates the choice of operational risk approach under Basel II requirements and whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches to operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated and requires the use of historical loss data, the application of statistical tools, and the engagement of highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and prior experience in risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage over banks that opt for less sophisticated approaches.
Internal Risk Controls and their Impact on Bank Solvency. Recent cases in the financial sector have shown the importance of risk management controls for risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period from 2004 to 2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by BHC growth strategies and external restrictions on bank activities, while higher regulatory requirements for bank capital positively moderate this relationship.
The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Companies' Value. Previous research has shown the importance of external regulation for banks' behavior. Inefficient standards may accentuate risk taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period from 2008 to 2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of the risk measurement approaches that banks apply. Capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.
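The capital-saving effect attributed to the advanced approaches can be illustrated with the basic prudential ratio, capital divided by risk-weighted assets (RWA). The risk weights below are invented for illustration only; they are not Basel II figures:

```python
# Sketch of a capital ratio computation. All numbers are illustrative.
def risk_weighted_assets(exposures):
    """exposures: list of (amount, risk_weight) pairs."""
    return sum(amount * weight for amount, weight in exposures)

def capital_ratio(capital, exposures):
    return capital / risk_weighted_assets(exposures)

# Two banks holding the same assets and the same capital: the one whose
# (advanced-approach) models justify lower risk weights reports a higher
# capital ratio, i.e. it "saves" capital relative to the requirement.
standard = [(100.0, 1.0), (200.0, 0.5)]    # RWA = 200
advanced = [(100.0, 0.8), (200.0, 0.35)]   # RWA = 150
print(round(capital_ratio(16.0, standard), 3))  # 0.08
print(round(capital_ratio(16.0, advanced), 3))
```

This is the mechanism behind the first paper's finding that AMA adopters exhibit lower capital requirements than banks on simpler approaches.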

Relevance: 100.00%

Abstract:

The three essays constituting this thesis focus on financing and cash management policy. The first essay aims to shed light on why firms issue debt so conservatively. In particular, it examines the effects of shareholder and creditor protection on capital structure choices. It starts by building a contingent claims model where financing policy results from a trade-off between tax benefits, contracting costs and agency costs. In this setup, controlling shareholders can divert part of the firm's cash flows as private benefits at the expense of minority shareholders. In addition, shareholders as a class can behave strategically at the time of default, leading to deviations from the absolute priority rule. The analysis demonstrates that investor protection is a first-order determinant of firms' financing choices and that conflicts of interest between firm claimholders may help explain the level and cross-sectional variation of observed leverage ratios.
The second essay focuses on the practical relevance of agency conflicts. Despite the theoretical development of the literature on agency conflicts and firm policy choices, the magnitude of manager-shareholder conflicts is still an open question. This essay proposes a methodology for quantifying these agency conflicts. To do so, it examines the impact of managerial entrenchment on corporate financing decisions. It builds a dynamic contingent claims model in which managers do not act in the best interest of shareholders, but rather pursue private benefits at the expense of shareholders. Managers have discretion over financing and dividend policies. However, shareholders can remove the manager at a cost. The analysis demonstrates that entrenched managers restructure less frequently and issue less debt than is optimal for shareholders. I take the model to the data and use observed financing choices to provide firm-specific estimates of the degree of managerial entrenchment. Using structural econometrics, I find costs of control challenges of 2-7% on average (0.8-5% at the median). The estimates of the agency costs vary with variables that one expects to determine managerial incentives. In addition, these costs are sufficient to resolve the low- and zero-leverage puzzles and explain the time series of observed leverage ratios. Finally, the analysis shows that governance mechanisms significantly affect the value of control and firms' financing decisions.
The third essay is concerned with the time trend in corporate cash holdings documented by Bates, Kahle and Stulz (BKS, 2003). BKS find that firms' cash holdings doubled from 10% to 20% over the 1980 to 2005 period. This essay provides an explanation of this phenomenon by examining the effects of product market competition on firms' cash holdings in the presence of financial constraints. It develops a real options model in which cash holdings may be used to cover unexpected operating losses and avoid inefficient closure. The model generates new predictions relating cash holdings to firm and industry characteristics such as the intensity of competition, cash flow volatility, or financing constraints. The empirical examination of the model shows strong support for the model's predictions. In addition, it shows that the time trend in cash holdings documented by BKS can be at least partly attributed to a competition effect.

Relevance: 40.00%

Abstract:

When researchers introduce a new test, they have to demonstrate that it is valid, using unbiased designs and suitable statistical procedures. In this article we use Monte Carlo analyses to highlight how incorrect statistical procedures (i.e., stepwise regression, extreme scores analyses) or ignoring regression assumptions (e.g., heteroscedasticity) contribute to wrong validity estimates. Beyond these demonstrations, and as an example, we re-examined the results reported by Warwick, Nettelbeck, and Ward (2010) concerning the validity of the Ability Emotional Intelligence Measure (AEIM). Warwick et al. used the wrong statistical procedures to conclude that the AEIM was incrementally valid beyond intelligence and personality traits in predicting various outcomes. In our re-analysis, we found that the reliability-corrected multiple correlation of their measures with personality and intelligence was up to .69. Using robust statistical procedures and appropriate controls, we also found that the AEIM did not predict incremental variance in GPA, stress, loneliness, or well-being, demonstrating the importance of testing validity rather than merely looking for it.
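The selection bias the abstract warns about, e.g. stepwise entry of the best-looking predictor, can be reproduced with a small Monte Carlo simulation (not taken from the article): even when every candidate predictor is pure noise, picking the one most correlated with the outcome yields an inflated apparent validity that largely vanishes in held-out data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, reps = 100, 50, 200        # sample size, candidate predictors, replications
apparent, holdout = [], []

for _ in range(reps):
    X = rng.standard_normal((n, k))   # candidate predictors: pure noise
    y = rng.standard_normal(n)        # outcome unrelated to every predictor
    half = n // 2
    # "Stepwise"-style entry: on the first half, keep the predictor most
    # correlated with the outcome
    r_train = np.array([np.corrcoef(X[:half, j], y[:half])[0, 1] for j in range(k)])
    best = int(np.argmax(np.abs(r_train)))
    apparent.append(abs(r_train[best]))
    # Unbiased check: the same predictor's validity in held-out data
    holdout.append(abs(np.corrcoef(X[half:, best], y[half:])[0, 1]))

print(f"apparent validity after selection: {np.mean(apparent):.2f}")
print(f"cross-validated validity:          {np.mean(holdout):.2f}")
```

The apparent correlation after selection is several times larger than the cross-validated one, which is the kind of artifact the authors' re-analysis controls for.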

Relevance: 30.00%

Abstract:

BACKGROUND: While Switzerland invests a lot of money in its healthcare system, little is known about the quality of care delivered. The objective of this study was to assess the quality of care provided to patients with diabetes in the Canton of Vaud, Switzerland. METHODS: Cross-sectional study of 406 non-institutionalized adults with type 1 or 2 diabetes. Patients' characteristics, diabetes and process-of-care indicators were collected using a self-administered questionnaire. Process indicators (past 12 months) included HbA1C check among HbA1C-aware patients, eye assessment by an ophthalmologist, microalbuminuria check, feet examination, lipid test, blood pressure and weight measurement, influenza immunization, physical activity recommendations, and dietary recommendations. Item-by-item (each process-of-care indicator: percentage of patients having received it), composite (mean percentage of recommended care: sum of received processes of care / sum of possible recommended care), and all-or-none (percentage of patients receiving all specified recommended care) measures were computed. RESULTS: Mean age was 64.4 years; 59% were men. Type 1 and type 2 diabetes were reported by 18.2% and 68.5% of patients, respectively, but diabetes type remained undetermined for almost 20% of patients. Patients were treated with oral anti-diabetic drugs (50%), insulin (23%) or both (27%). Of 219 HbA1C-aware patients, 98% reported at least one HbA1C check during the last year. Likewise, at least 94% reported at least one blood pressure measurement, weight measurement or lipid test, while 68%, 64% and 56% had a feet examination, microalbuminuria check and eye assessment, respectively. Influenza immunization was reported by 62% of the patients. The percentage of patients receiving all processes of care ranged from 14.2% to 16.9% when considering ten indicators, and from 46.6% to 50.7% when considering four.
Ambulatory care utilization showed little use of multidisciplinary care and low levels of participation in diabetes-education classes. CONCLUSIONS: While routine processes of care were performed annually in most patients, diabetes-specific risk screenings, influenza immunization, and physical activity and dietary recommendations were less often reported; this was also the case for multidisciplinary care and participation in education classes. There is room for improvement in diabetes care in Switzerland. These results should help define priorities and further develop country-specific chronic disease management initiatives for diabetes.
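The three measures defined in the METHODS section (item-by-item, composite, and all-or-none) can be sketched on a toy patient-by-indicator matrix; the data below are invented for illustration:

```python
# Each row is a patient, each column a recommended process of care
# (True = received). Toy data, not the study's.
received = [
    [True,  True,  False, True],
    [True,  False, False, True],
    [True,  True,  True,  True],
]

n_patients = len(received)
n_items = len(received[0])

# Item-by-item: share of patients who received each individual process
item_by_item = [
    100.0 * sum(row[j] for row in received) / n_patients
    for j in range(n_items)
]

# Composite: received processes / possible recommended processes, pooled
composite = 100.0 * sum(sum(row) for row in received) / (n_patients * n_items)

# All-or-none: share of patients who received every recommended process
all_or_none = 100.0 * sum(all(row) for row in received) / n_patients

print([round(v, 1) for v in item_by_item])  # [100.0, 66.7, 33.3, 100.0]
print(round(composite, 1))                  # 75.0
print(round(all_or_none, 1))                # 33.3
```

The example shows why the all-or-none figure (14.2%-16.9% on ten indicators in the study) is always the strictest of the three: a single missed process removes the patient from the numerator.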

Relevance: 30.00%

Abstract:

BACKGROUND: The dissemination of palliative care for patients presenting complex chronic diseases at various stages has become an important matter of public health. A death census in Swiss long-term care (LTC) facilities was set up with the aim of monitoring the frequency of selected indicators of palliative care. METHODS: The survey covered 150 LTC facilities (105 nursing homes and 45 home health services), each of which was asked to complete a questionnaire for every non-accidental death over a period of six months. The frequency of four selected indicators of palliative care (resort to a specialized palliative care service, administration of opiates, use of any pain measurement scale, use of any other symptom measurement scale) was monitored with respect to the stages of care and analysed by gender, age, medical condition and place of residence. RESULTS: Overall, 1200 deaths were reported, 29.1% of which were related to cancer. The frequency of each indicator varied according to the type of LTC facility, mostly as regards the administration of opiates. Access to palliative care remained associated with cancer, terminal care and partly with age, whereas gender and the presence of mental disorders had no effect on the indicators. In addition, the use of drugs was much more frequent than the other indicators. CONCLUSION: The profile of patients with access to palliative care must become more diversified. Among other recommendations, equal access to opiates in nursing homes and in home health services, palliative care at an earlier stage, and the systematic use of symptom management scales when resorting to opiates must become matters of prime concern.

Relevance: 30.00%

Abstract:

The purpose of this study was to determine the prognostic accuracy of perfusion computed tomography (CT), performed at the time of emergency room admission, in acute stroke patients. Accuracy was determined by comparison of perfusion CT with delayed magnetic resonance (MR) imaging and by monitoring the evolution of each patient's clinical condition. Twenty-two acute stroke patients underwent perfusion CT covering four contiguous 10-mm slices on admission, as well as delayed MR, performed after a median interval of 3 days after emergency room admission. Eight were treated with thrombolytic agents. Infarct size on the admission perfusion CT was compared with that on the delayed diffusion-weighted (DWI) MR, chosen as the gold standard. Delayed magnetic resonance angiography and perfusion-weighted MR were used to detect recanalization. A potential recuperation ratio, defined as PRR = penumbra size / (penumbra size + infarct size) on the admission perfusion CT, was compared with the evolution of each patient's clinical condition, defined by the National Institutes of Health Stroke Scale (NIHSS). In the 8 cases with arterial recanalization, the size of the cerebral infarct on the delayed DWI-MR was larger than or equal to that of the infarct on the admission perfusion CT, but smaller than or equal to that of the ischemic lesion on the admission perfusion CT; the observed improvement in the NIHSS correlated with the PRR (r = 0.833). In the 14 cases with persistent arterial occlusion, infarct size on the delayed DWI-MR correlated with ischemic lesion size on the admission perfusion CT (r = 0.958). In all 22 patients, the admission NIHSS correlated with the size of the ischemic area on the admission perfusion CT (r = 0.627). Based on these findings, we conclude that perfusion CT allows the accurate prediction of the final infarct size and the evaluation of clinical prognosis for acute stroke patients at the time of emergency evaluation. It may also provide information about the extent of the penumbra. Perfusion CT could therefore be a valuable tool in the early management of acute stroke patients.
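The potential recuperation ratio defined in the abstract is a direct computation from the admission perfusion CT measurements. A minimal sketch (the lesion sizes in the example are invented):

```python
def potential_recuperation_ratio(penumbra_size, infarct_size):
    """PRR = penumbra / (penumbra + infarct), both measured on the
    admission perfusion CT. Sizes may be in any common unit; only the
    ratio matters."""
    total = penumbra_size + infarct_size
    if total == 0:
        raise ValueError("no ischemic lesion measured")
    return penumbra_size / total

# A large penumbra relative to the infarct core gives a PRR near 1,
# i.e. more potentially salvageable tissue:
print(round(potential_recuperation_ratio(30.0, 10.0), 2))  # 0.75
```

In the recanalized subgroup, it is this ratio that correlated (r = 0.833) with the observed NIHSS improvement.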

Relevance: 30.00%

Abstract:

AIM: To discuss the use of new ultrasonic techniques that make it possible to visualize elastic (carotid) and muscular (radial) capacitance arteries non-invasively. RESULTS OF DATA REVIEW: Measurements of carotid wall thickness and the detection of atheromas are related to arterial pressure, to other risk factors and to the risk of subsequent complications. The use of high-frequency ultrasound (7.5-10 MHz), measurements of far wall thicknesses in areas free of atheromas at end-diastole (by ECG gating or pressure waveform recording) and descriptions of the size and characteristics of atherosclerotic plaques allow a non-invasive assessment of vascular hypertrophy and atherosclerosis in hypertensive patients. CONCLUSIONS: Careful attention to methodologic and physiologic factors is needed to provide accurate information about the anatomy of the dynamically pulsating arterial tree.

Relevance: 30.00%

Abstract:

The main objective of WP1 of the ORAMED (Optimization of RAdiation protection for MEDical staff) project is to obtain a set of standardised data on extremity and eye lens doses for staff in interventional radiology (IR) and interventional cardiology (IC) and to optimise staff protection. A coordinated measurement programme across different hospitals in Europe serves this goal. This study analyses the first results of the measurement campaign performed on IR and IC procedures in 34 European hospitals. The highest doses were found for pacemakers, renal angioplasties and embolisations. The left finger and wrist appear to receive the highest extremity doses, while the highest eye lens doses were measured during embolisations. Finally, it proved difficult to find a general correlation between kerma-area product and extremity or eye lens doses.

Relevance: 30.00%

Abstract:

Cloud computing has recently become very popular, and several bioinformatics applications already exist in that domain. The aim of this article is to analyse a current cloud system with respect to usability, benchmark its performance and compare its user friendliness with that of a conventional cluster job submission system. Given the current hype around the topic, user expectations are rather high, but our results show that neither the price/performance ratio nor the usage model is very satisfactory for large-scale embarrassingly parallel applications. However, for small- to medium-scale applications that require CPU time at certain peak times, the cloud is a suitable alternative.

Relevance: 30.00%

Abstract:

Numerous recent reports by non-governmental organisations (NGOs), academics and international organisations have focused on so-called 'climate refugees'. This article examines the turn from a discourse of 'climate refugees', in which organisations perceive migration as a failure of both mitigation and adaptation to climate change, to one of 'climate migration', in which organisations promote migration as a strategy of adaptation. Its focus is the promotion of climate migration management, and it traces these discourses in two parts. The first provides an empirical account of the two discourses, emphasising the differentiation between them, and then examines the discourse of climate migration, its origins, extent and content, and the associated practices of 'migration management'. The second part argues that the turn to the promotion of 'climate migration' should be understood as a way to manage the insecurity created by climate change. However, international organisations enact this management within the forms of neoliberal capitalism, including the framework of governance. Therefore, the promotion of 'climate migration' as a strategy of adaptation to climate change is located within the tendencies of neoliberalism and the reconfiguration of southern states' sovereignty through governance.

Relevance: 30.00%

Abstract:

We report on advanced dual-wavelength digital holographic microscopy (DHM) methods, enabling single-acquisition real-time micron-range measurements while maintaining single-wavelength interferometric resolution in the nanometer regime. On top of the unique real-time capability of our technique, it is shown that axial resolution can be further increased compared to single-wavelength operation thanks to the uncorrelated nature of the two recorded wavefronts. It is experimentally demonstrated that DHM topographic investigation over a measurement range spanning three decades can be achieved with our arrangement, opening new application possibilities for this interferometric technique. © 2008 SPIE
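The micron-range capability of dual-wavelength operation rests on the synthetic (beat) wavelength, Λ = λ₁λ₂ / |λ₁ − λ₂|, which is much longer than either optical wavelength and so extends the unambiguous phase-measurement range. A quick computation, with wavelength values chosen for illustration rather than taken from the paper:

```python
# Synthetic (beat) wavelength of a dual-wavelength interferometer:
# Lambda = l1 * l2 / |l1 - l2|. The closer the two wavelengths, the
# longer the synthetic wavelength and the larger the unambiguous range.
def synthetic_wavelength(l1_nm, l2_nm):
    return l1_nm * l2_nm / abs(l1_nm - l2_nm)

lam = synthetic_wavelength(680.0, 760.0)  # illustrative source wavelengths
print(f"{lam / 1000:.2f} um")  # micron-range, vs. sub-wavelength single-shot range
```

This is why the abstract can claim micron-range measurements while each individual wavefront still delivers nanometer-scale interferometric resolution.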