906 results for management method
Abstract:
Nowadays the variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where faster analysis of the boiler mass and heat balance is needed in order to identify and make the right decisions already during the test run. The barrier to determining the boiler balance during test runs is the long process of chemically analysing the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to achieve the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy, located in Messukylä, Tampere. The calculation was created on the data management computer of the pilot plant's automation system. The calculation is made in the Microsoft Excel environment, which provides a good basis and ready functions for handling large databases and calculations without any delicate programming. The automation system of the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new MetsoDNA system has good data management properties, which are necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance of the boiler is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were constructed in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system. A sensitivity analysis showed that the most essential values for accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the working environment.
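As a minimal numerical sketch (not the thesis calculation itself, which runs in the plant's Excel/MetsoDNA environment), the heat-balance route mentioned above can be illustrated by estimating the fuel mass flow from the measured heat output, an assumed overall efficiency and the fuel's lower heating value, and the excess-air ratio from the flue-gas oxygen content. All numbers and names below are hypothetical.

```java
/**
 * Illustrative simplification of a boiler heat-balance estimate:
 * fuel flow from heat output, assumed efficiency and fuel LHV,
 * plus excess-air ratio from dry flue-gas O2 content.
 */
public class BoilerBalanceSketch {

    /** Fuel mass flow [kg/s] = heat output [kW] / (LHV [kJ/kg] * efficiency [-]). */
    static double fuelMassFlow(double heatOutputKw, double lhvKjPerKg, double efficiency) {
        return heatOutputKw / (lhvKjPerKg * efficiency);
    }

    /** Excess-air ratio estimated from the dry flue-gas O2 content [vol-%] (common simplification). */
    static double excessAirRatio(double flueGasO2Percent) {
        return 21.0 / (21.0 - flueGasO2Percent);
    }

    public static void main(String[] args) {
        double heatOutputKw = 4000.0;   // 4 MW pilot boiler
        double lhvKjPerKg   = 18000.0;  // assumed biomass LHV
        double efficiency   = 0.88;     // assumed boiler efficiency

        System.out.printf("Fuel flow  ~ %.3f kg/s%n", fuelMassFlow(heatOutputKw, lhvKjPerKg, efficiency));
        System.out.printf("Excess air ~ %.2f%n", excessAirRatio(3.5)); // 3.5 vol-% O2 in dry flue gas
    }
}
```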
Abstract:
In the electronics industry, products contain more and more components whose use a company must manage. Recent environmental regulations and legislation have increased the pressure on companies to manage the components they use, and the related data, efficiently. This thesis examines the component management services offered by three service providers, compared with a possible in-house component engineer. To be able to compare the studied alternatives, the special characteristics of component management were identified through expert interviews. These characteristics, together with the company's requirements, formed a set of criteria against which the studied services are compared. The set consists of eight criteria, which in turn can be divided into three groups according to their duration and nature. Four of the criteria were considered more important than the others and were therefore given greater weight when comparing the services. All of the studied services fulfil some of the criteria, but none of them alone offers a sufficiently comprehensive solution to the target company's problems. The company's biggest problems lie in internal information flow and in the maintenance and management of databases and systems. To solve these problems, the company must get its component processes working and its database up to date. These goals can be achieved only if someone inside the company is taking care of the matter. The three services studied do not provide such an internal resource, but concentrate only on information transfer and administration from the outside.
Abstract:
Personal results are presented to illustrate the development of immunoscintigraphy for the detection of cancer over the last 12 years, from the early experimental results in nude mice grafted with human colon carcinoma to the most modern form of immunoscintigraphy applied to patients, using I123 labeled Fab fragments from monoclonal anti-CEA antibodies detected by single photon emission computerized tomography (SPECT). The first generation of immunoscintigraphy used I131 labeled, immunoadsorbent purified, polyclonal anti-CEA antibodies and planar scintigraphy as the detection system. The second generation used I131 labeled monoclonal anti-CEA antibodies and SPECT, while the third generation employed I123 labeled fragments of monoclonal antibodies and SPECT. The improvement in the precision of tumor images with the most recent forms of immunoscintigraphy is obvious. However, we think the usefulness of immunoscintigraphy for routine cancer management has not yet been entirely demonstrated; further prospective trials are still necessary to determine its precise clinical role. A case report is presented on a patient with two liver metastases from a sigmoid carcinoma, who received through the hepatic artery a therapeutic dose (100 mCi) of I131 coupled to 40 mg of a mixture of two high affinity anti-CEA monoclonal antibodies. Excellent localisation of the I131 labeled antibodies in the metastases was demonstrated by SPECT, and the treatment was well tolerated. The irradiation dose to the tumor, however, was too low at 4300 rads (with 1075 rads to the normal liver and 88 rads to the bone marrow), and no evidence of tumor regression was obtained. Different approaches for increasing the irradiation dose delivered to the tumor by the antibodies are considered.
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex) - which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century - was used as the paradigm species. This task was achieved in three steps.
First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach, with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package - named SIM-Ibex - allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and colonisation of new areas, so habitat suitability and dispersal obstacles also had to be modelled. Thus, a software package - named Biomapper - was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and result validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The application domain of ENFA was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability assessment method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species.
Finally, demographic and landscape information were merged into a global model. To cope with landscape realism and the technical constraints of modelling large populations, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, the population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density-dependence and stochasticity. A software package - named HexaSpace - was developed to perform two functions: 1° calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); 2° running simulations. It allows studying the spread of an invading species across a complex landscape made up of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Similarly, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to build a complex, realistic model from raw data, and as they provide an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
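To make the kind of projection SIM-Ibex is built on concrete, here is a toy sketch of a Leslie-matrix time step extended with density-dependence on fecundity, a multiplicative environmental-stochasticity term and a constant culling rate; the age classes, rates and carrying capacity are invented for illustration and are not taken from the thesis.

```java
import java.util.Random;

/** Toy age-structured projection: Leslie step + density-dependence + stochasticity + culling. */
public class LesliePopulationSketch {

    static final double[] FECUNDITY = {0.0, 0.2, 0.45, 0.45}; // offspring per female, by age class
    static final double[] SURVIVAL  = {0.6, 0.85, 0.9};       // survival from class i to i+1
    static final double CARRYING_CAPACITY = 500.0;
    static final double CULL_RATE = 0.05;

    static double[] step(double[] n, Random rng) {
        double total = 0.0;
        for (double x : n) total += x;

        double dd  = Math.max(0.0, 1.0 - total / CARRYING_CAPACITY); // fecundity shrinks near K
        double env = 0.8 + 0.4 * rng.nextDouble();                   // "good/bad year" multiplier

        double[] next = new double[n.length];
        for (int i = 0; i < n.length; i++) {
            next[0] += FECUNDITY[i] * dd * env * n[i];               // births
        }
        for (int i = 0; i < SURVIVAL.length; i++) {
            next[i + 1] = Math.min(1.0, SURVIVAL[i] * env) * n[i];   // ageing and survival
        }
        for (int i = 0; i < next.length; i++) {
            next[i] *= (1.0 - CULL_RATE);                            // regulation by culling
        }
        return next;
    }

    public static void main(String[] args) {
        double[] n = {50, 40, 30, 20};
        Random rng = new Random(42);
        for (int year = 1; year <= 10; year++) {
            n = step(n, rng);
            double total = 0.0;
            for (double x : n) total += x;
            System.out.printf("year %2d: total ~ %.1f%n", year, total);
        }
    }
}
```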
Abstract:
Requirements-related issues have been found to be the third most important risk factor in software projects and the biggest single reason for software project failures. This is not a surprise, since requirements engineering (RE) practices have been reported deficient in more than 75% of all enterprises. A problem analysis of small and low-maturity software organizations revealed two central reasons for not starting process improvement efforts: lack of resources and uncertainty about the paybacks of process improvement efforts. In the constructive part of the study a basic RE method, BaRE, was developed to provide an easy-to-adopt way of introducing basic systematic RE practices in small and low-maturity organizations. Based on the diffusion of innovations literature, thirteen desirable characteristics were identified for the solution, and the method was implemented in five key components: a requirements document template, requirements development practices, requirements management practices, tool support for requirements management, and training. The empirical evaluation of the BaRE method was conducted in three industrial case studies. In this evaluation, two companies established a completely new RE infrastructure following the suggested practices, while the third company continued requirements document template development based on the provided template and used it extensively in practice. The real benefits of adopting the method were visible in the companies within four to six months from the start of the evaluation project, and the two small companies in the project completed their improvement efforts with an input of about one person-month. The data collected on the case studies indicate that the companies implemented the new practices with little adaptation and little effort. Thus it can be concluded that the constructed BaRE method is indeed easy to adopt and can help introduce basic systematic RE practices in small organizations.
Abstract:
The aim of this study is to provide an instrument for measuring service quality in sports enterprises from the point of view of the customers. For this purpose we intend to elaborate a questionnaire starting out from a more general scale called SERVQUAL. We have limited our research project to sports enterprises where the customer participates actively, i.e., we have excluded sports clubs and other organizations which offer sport as entertainment. Our choice is mainly due to the fact that few studies have been carried out in this area and that sport has been attracting an increasing number of participants in Spain over the last decades. The Delphi method has been applied with the collaboration of a panel of experts in order to evaluate the viability and adequacy of the modified SERVQUAL scale.
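For readers unfamiliar with SERVQUAL-type instruments, the sketch below shows the usual gap scoring: each dimension's score is the mean perception rating minus the mean expectation rating on a Likert scale. The dimension names follow the generic SERVQUAL scale and the ratings are made up; the modified scale developed in this study may use different items and weights.

```java
/** Illustrative SERVQUAL-style gap scoring: perception mean minus expectation mean per dimension. */
public class ServqualGapSketch {

    static double mean(double[] v) {
        double s = 0.0;
        for (double x : v) s += x;
        return s / v.length;
    }

    public static void main(String[] args) {
        String[] dimensions = {"tangibles", "reliability", "responsiveness", "assurance", "empathy"};
        double[][] expectations = {{6, 7, 6}, {7, 7, 6}, {6, 6, 5}, {6, 7, 7}, {5, 6, 6}};
        double[][] perceptions  = {{5, 6, 6}, {6, 5, 6}, {6, 6, 6}, {5, 6, 6}, {6, 6, 5}};

        for (int i = 0; i < dimensions.length; i++) {
            double gap = mean(perceptions[i]) - mean(expectations[i]); // negative gap = quality shortfall
            System.out.printf("%-15s gap = %+.2f%n", dimensions[i], gap);
        }
    }
}
```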
Abstract:
The application of statistics to science is not a neutral act. Statistical tools have shaped, and were also shaped by, their objects. In the social sciences, statistical methods fundamentally changed research practice, making statistical inference its centerpiece. At the same time, textbook writers in the social sciences have transformed rivaling statistical systems into an apparently monolithic method that could be used mechanically. The idol of a universal method for scientific inference has been worshipped since the "inference revolution" of the 1950s. Because no such method has ever been found, surrogates have been created, most notably the quest for significant p values. This form of surrogate science fosters delusions and borderline cheating and has done much harm, creating, for one, a flood of irreproducible results. Proponents of the "Bayesian revolution" should be wary of chasing yet another chimera: an apparently universal inference procedure. A better path would be to promote both an understanding of the various devices in the "statistical toolbox" and informed judgment in selecting among them.
Abstract:
BACKGROUND: Infected postpneumonectomy chest cavities may be related to chronic postpneumonectomy empyema or may arise in rare situations of necrotizing pneumonia with complete lung destruction where pneumonectomy and pleural debridement are required. We evaluated the safety and efficacy of an intrathoracic vacuum-assisted closure device (VAC) for the treatment of infected postpneumonectomy chest cavities. METHOD: A retrospective single-institution review of all patients with infected postpneumonectomy chest cavities treated by VAC between 2005 and 2013. Patients underwent surgical debridement of the thoracic cavity, muscle flap closure of the bronchial stump when a fistula was present, and repeated intrathoracic VAC dressings until granulation tissue covered the entire chest cavity. After this, the cavity was obliterated by a Clagett procedure and closed. RESULTS: Twenty-one patients (14 men and 7 women) underwent VAC treatment of their infected postpneumonectomy chest cavity. Twelve patients presented with a chronic postpneumonectomy empyema (10 of them with a bronchopleural fistula) and 9 patients with an empyema occurring in the context of necrotizing pneumonia treated by pneumonectomy. In-hospital mortality was 23%. The median duration of VAC therapy was 23 days (range, 4-61 days) and the median number of VAC changes per patient was 6 (range, 2-14). Infection control and successful chest cavity closure were achieved in all surviving patients. One adverse VAC treatment-related event was identified (5%). CONCLUSIONS: Intrathoracic VAC application is a safe and efficient treatment of infected postpneumonectomy chest cavities and allows the preservation of chest wall integrity.
Abstract:
Key management has a fundamental role in secure communications. Designing and testing key management protocols is tricky: these protocols must work flawlessly despite any abuse. The main objective of this work was to design and implement a tool that helps to specify the protocol and makes it possible to test the protocol while it is still under development. The tool generates compile-ready Java code from a key management protocol model. A modelling method for these protocols, based on the Unified Modeling Language (UML), was also developed. The protocol is modelled, exported as XMI and read by the code generator tool. The code generator produces Java code that, once compiled, is immediately executable with test software.
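A highly simplified sketch of the model-to-code idea described above, assuming the XMI export can be read with a standard XML parser and that message elements carry a name attribute; the tag and attribute names here are guesses rather than the tool's actual format, and the real generator produces complete protocol code instead of empty handler stubs.

```java
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

/** Reads message names from a (hypothetical) XMI export and emits a skeleton Java class. */
public class XmiToJavaSketch {

    public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("protocol.xmi")); // exported from the UML tool

        StringBuilder src = new StringBuilder("public class GeneratedProtocol {\n");
        NodeList messages = doc.getElementsByTagName("UML:Message"); // assumed tag name
        for (int i = 0; i < messages.getLength(); i++) {
            String name = ((Element) messages.item(i)).getAttribute("name");
            src.append("    public void on").append(name).append("(byte[] payload) {\n")
               .append("        // TODO: generated handler body\n")
               .append("    }\n");
        }
        src.append("}\n");

        Files.write(Paths.get("GeneratedProtocol.java"), src.toString().getBytes());
        System.out.println("Wrote GeneratedProtocol.java");
    }
}
```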
Abstract:
To achieve success in a constantly changing environment and with ever-increasing competition, companies must develop their operations continuously. To do this, they must have a clear vision of what they want to be in the future. This vision can be attained through careful planning and strategising. One method of transforming a strategy and vision into an everyday tool used by employees is a balanced performance measurement system. The importance of performance measurement in the implementation of companies' visions and strategies has grown substantially in the last ten years. Measures are derived from the company's critical success factors and from many different perspectives, and they cover three time dimensions: past, present and future. Many such performance measurement systems have been created since the 1990s. This is a case study whose main objective is to provide a recommendation for how the case company could make use of performance measurement to support strategic management. To answer this question, the study combines literature-based research with empirical research at the case company's premises. The theoretical part of the study consists of two sections: introducing the Balanced Scorecard and discussing how it supports strategic management and change management. The empirical part determines the company's present performance measurement situation through interviews in the company. The study resulted in a recommendation that the company start developing a Balanced Scorecard system. By setting up this kind of process, the company would be able to shift its focus more towards the future, begin implementing a more process-based organisation and get its employees to work together towards common goals.
Abstract:
User identification in information systems has been one of the cornerstones of information security for decades. The idea of a username and a password is the most cost-effective and the most widely used way of maintaining trust between an information system and its users. In the early days of information systems, when companies had only a few systems used by a small group of users, this model proved workable. Over the years the number of systems grew, and with it the number and variety of passwords. No one could predict how many password-related problems users would encounter, how much these would congest corporate help desks, and what kinds of security risks passwords would cause in large enterprises. In this thesis we examine the problems caused by passwords in a large, global company. The problems are examined from four perspectives: people, technology, information security and business. They are demonstrated by presenting the results of a survey of the company's employees, carried out as part of this thesis. A solution to these problems is presented in the form of a centralized password management system. The different features of the system are evaluated, and a pilot-type implementation is built to demonstrate the functionality of such a system.
Abstract:
The objective of the thesis was to create a framework that can be used to define a manufacturing strategy taking advantage of the product life cycle method, which enables PQP enhancements. The starting point was to study the simultaneous implementation of cost leadership and differentiation strategies at different stages of the life cycle. It was soon observed that Porter's strategies were too generic for a complex and dynamic environment where customer needs vary by market and by product. Therefore, the strategy formulation process is based on Terry Hill's order-winner and qualifier concepts. The manufacturing strategy formulation starts with the definition of order-winning and qualifying criteria. From these criteria, product-specific proposals for action and production-site-specific key manufacturing tasks can be derived, which must be addressed in order to meet customer and market needs. As future research it is suggested that the process of capturing order-winners and qualifiers be developed further so that it is simple and streamlined at Wallac Oy. In addition, the defined strategy process should be integrated into PerkinElmer's SGS (Strategic Goal Setting) process, which is one of PerkinElmer's core management processes.
Abstract:
Product life cycles in the electronics industry have become ever shorter, and the pressure to launch new, more advanced products has grown. A poorly managed product changeover can cause significant costs to a company if the impact of the ramp-down on the financial result is underestimated. This thesis deals with managing the material flows of a product being withdrawn from the market during the changeover. The starting point was to develop a procedure for how to act during the ramp-down of a product, and to model the supply chain in order to find risks and bottlenecks. The main objective was to minimize the value of the material inventory and to make use of surplus materials. The most significant results of the work were an Excel tool developed for monitoring the material flows of the supply chain and a description of the work phases related to the ramp-down.
Abstract:
The objective of the thesis was to define the general structure of the product model used in a sales configurator. First, the creation of a product model and concept design were studied through the literature and expert interviews. The expert interviews were conducted in a free form, using a question list as support. In addition, the thesis considers the role of e-business and the future prospects of the sales configurator. The product model is discussed at a general level; a second viewpoint treats the product model on the basis of the methods used in software applications. The construction of the product model was started from the part visible to the customer, i.e. the user interface of the sales configurator. The next problem was to standardize the documents describing the product and the quotation globally; this solution was arrived at on the basis of interviews and expert meetings. The final part of the thesis discusses the position of the sales configurator in the target company's e-business and presents one view of integrating the sales configurator with customer relationship management and product data management systems. The objectives set for the thesis were achieved: the sales configurator harmonizes the target company's pricing globally, speeds up the quotation process, facilitates the launch of new products and standardizes the product model globally. Integrating the sales configurator with other information systems makes sales operations more efficient. The remaining challenge is to encourage end users to use the system effectively and to organize its maintenance; without users and their enthusiasm, the project may lose management's confidence.
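Purely to make the notion of a product model concrete, here is a minimal sketch of a configurable product with selectable options and a derived quotation price; all class, product and option names are invented, and the target company's actual model and configurator are certainly richer (compatibility rules, global price lists, CRM/PDM links).

```java
import java.util.List;

/** Minimal sketch of a sales-configurator product model: base product, options, quoted price. */
public class ProductModelSketch {

    record Option(String name, double price) {}

    record ProductModel(String family, double basePrice, List<Option> options) {
        /** Price of a quotation containing the selected options. */
        double quote(List<String> selected) {
            double total = basePrice;
            for (Option o : options) {
                if (selected.contains(o.name())) total += o.price();
            }
            return total;
        }
    }

    public static void main(String[] args) {
        ProductModel pump = new ProductModel("PumpX", 10_000.0,
                List.of(new Option("stainless-steel casing", 1_500.0),
                        new Option("remote monitoring", 800.0)));

        System.out.printf("Quoted price: %.2f%n", pump.quote(List.of("remote monitoring")));
    }
}
```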
Abstract:
OBJECTIVE: To assess dietary management of cardiovascular risk factors (CVRFs) in the general population. METHOD: Cross-sectional study conducted between 2009 and 2012 on 4811 participants (2567 women, 58±11 years) living in Lausanne, Switzerland. RESULTS: Sixteen percent of participants diagnosed with overweight/obesity reported a slimming diet. A slimming diet was associated with a diagnosis of hypertension [odds ratio (95% confidence interval): 0.61 (0.40-0.93)]; older age [0.84 (0.58-1.21), 0.79 (0.53-1.18) and 0.47 (0.27-0.81) for [50-60[, [60-70[ and [70+ years, respectively]; female gender [1.84 (1.36-2.48)] and diagnosis of diabetes [2.16 (1.13-4.12)]. Only 8% of participants diagnosed with hypertension reported a low-salt diet. A low-salt diet was associated with antihypertensive drug treatment [2.17 (1.28-3.68)] and diagnosis of diabetes [2.72 (1.26-5.86)]. One-third of participants diagnosed with dyslipidemia reported a low-fat diet. A low-fat diet was associated with female gender [1.47 (1.17-1.86)]; older age [1.29 (0.89-1.87), 1.71 (1.18-2.48) and 2.01 (1.33-3.03) for [50-60[, [60-70[ and [70+ years, respectively]; hypolipidemic drug treatment [1.68 (1.29-2.18)]; current smoking [0.70 (0.51-0.96)] and obesity [0.67 (0.45-1.00)]. Approximately half of participants diagnosed with diabetes reported an antidiabetic diet. An antidiabetic diet was associated with current smoking [0.44 (0.22-0.88)] and antidiabetic drug treatment [3.26 (1.81-5.86)]. CONCLUSION: Dietary management of CVRFs is seldom implemented in Switzerland.
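As a reading aid for the reported associations, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a simple 2x2 table; the counts are hypothetical, and the ORs in the abstract come from the study's own (adjusted) analyses rather than raw tables.

```java
/** Odds ratio and Woolf 95% confidence interval from a hypothetical 2x2 table. */
public class OddsRatioSketch {

    public static void main(String[] args) {
        // a = exposed with outcome, b = exposed without,
        // c = unexposed with outcome, d = unexposed without (made-up counts)
        double a = 40, b = 160, c = 25, d = 275;

        double or = (a * d) / (b * c);
        double seLogOr = Math.sqrt(1 / a + 1 / b + 1 / c + 1 / d);
        double low  = Math.exp(Math.log(or) - 1.96 * seLogOr);
        double high = Math.exp(Math.log(or) + 1.96 * seLogOr);

        System.out.printf("OR = %.2f (95%% CI %.2f-%.2f)%n", or, low, high);
    }
}
```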