33 results for Building permits
at Université de Lausanne, Switzerland
Abstract:
Information and communication technologies provide public administrations with new ways to meet their users' needs. At the same time, e-Government practices support the public sector in improving the quality of both its service provision and its internal operations. In this paper we discuss the impact of digitization on the management of administrative procedures. The theoretical framework and research model used in this study help us tackle the question of how digitization transforms administrative procedures, for example in terms of time and roles. The multiplicity of institutions involved in issuing building permits makes this administrative procedure a particularly interesting case study. An online survey was first addressed to Swiss civil servants to explore the field, and we present some of its results here. We are currently undertaking an in-depth case study of the building permit procedures in three Swiss cantons, which we also present in this paper. We conclude with a discussion and the future steps of this project.
Abstract:
The globalization of markets, changes in the economic context and the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is now commonly accepted that these assets play a particularly strategic role in the organization. A company wishing to manage these assets faces several problems. Managing knowledge and competences requires a long capitalization process, which passes through stages such as the identification, extraction and representation of knowledge and competences. Several methods exist for this purpose, such as MASK, CommonKADS and KOD. Unfortunately, these methods are cumbersome to implement, are restricted to certain types of knowledge, and are consequently limited in the functionalities they can offer. Finally, competence management and knowledge management are treated as two separate domains, whereas it would be worthwhile to unify the two approaches. Competences are indeed very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context". We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial kinds of organizational knowledge, in particular for avoiding the loss of know-how and for anticipating the company's future needs, because behind employees' competences lies the efficiency of the organization.
Moreover, competence makes it possible to describe many other organizational concepts, such as jobs, missions, projects and training. Unfortunately, there is no real consensus on the definition of competence, and the various existing definitions, even when fully satisfactory to experts, do not allow an operational system to be built. In our approach, we address competence management using a knowledge management method: by their very nature, knowledge and competence are closely linked, so such a method is well suited to managing competences. To exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various company repositories (competence, mission and job repositories, among others). To model these repositories we chose ontologies, because they yield coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training, missions, jobs and so on) onto these ontologies so that it can be exploited and disseminated. This approach to knowledge and competence management has enabled the development of a tool offering numerous functionalities, such as mobility-area management, strategic analysis, corporate directories and CV management.
Abstract:
Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence-environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence-environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building 'underfit' models, with insufficient flexibility to describe observed occurrence-environment relationships, we risk misunderstanding the factors shaping species distributions. By building 'overfit' models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing underfitting with overfitting and, consequently, how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
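The underfitting/overfitting trade-off described in this abstract can be sketched with a toy example (not from the paper; the response curve, noise level and polynomial degrees are invented): a rigid model cannot capture a unimodal niche, while a more flexible model fits the training observations ever more closely.

```python
# Illustrative sketch (hypothetical data, not the study's): fitting a noisy
# 1-D "occurrence vs environment" response curve with polynomials of
# increasing degree to show how added parameters improve training fit.
import numpy as np

rng = np.random.default_rng(42)

x_train = np.linspace(0.0, 1.0, 30)
x_test = np.linspace(0.0, 1.0, 100)

def true_response(x):
    """Hypothetical unimodal niche: occurrence peaks at x = 0.4."""
    return np.exp(-((x - 0.4) ** 2) / 0.02)

y_train = true_response(x_train) + rng.normal(0.0, 0.1, x_train.size)
y_test = true_response(x_test)

def fit_error(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return mse_train, mse_test

train1, test1 = fit_error(1)   # 'underfit': a straight line, 2 parameters
train9, test9 = fit_error(9)   # flexible: 10 parameters

# Because the degree-1 model is nested in the degree-9 model, the more
# complex model always fits the training data at least as well...
assert train9 <= train1
# ...but training fit alone says nothing about generalization, which is
# why the abstract argues complexity must match the study objective.
```

Held-out error (here `test1` vs `test9`) is one simple way to compare models of different complexity, though the abstract notes that comparing across different modeling approaches is harder than this sketch suggests.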
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission-funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and the free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue. Specific questions were put to different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussions of this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics is needed for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have proved to be unsafe. The question of safety is also one of public confidence, and new legislation such as REACH could help here. Hazards do not materialise if exposure can be avoided or at least significantly reduced, so information is needed on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, with an impact on nanotechnology as a whole.
4. Do we need more or different regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed, even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue with an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
Introduction: Exposure to environmental tobacco smoke (ETS) is a major environmental risk factor. Indoor contaminants come from a variety of sources, which can include inadequate ventilation, volatile organic compounds (VOCs), biological agents, combustion products, and ETS. Because ETS is one of the most frequent causes of IAQ complaints, and given the high mortality associated with passive smoking, in June 2004 the University of Geneva decided to ban smoking inside the "Uni-Mail" building, the biggest recently built human sciences university building in Switzerland; the ordinance was applied from October 2004. This report presents findings on the IAQ of the "Uni-Mail" building before and after the smoking ban, using nicotine, suspended dust, condensate and PAH levels in air as tracers to assess passive tobacco exposure for non-smokers inside the building. Methods: Respirable particles (RSP): a real-time aerosol monitor (model DataRAM) was placed at sampling post 1 on the ground floor. Condensate: organic matter collected on the glass-fibre filters was extracted with MeOH, and the total absorbance of the MeOH extract was measured at a UV wavelength of 447 nm. Nicotine: nicotine was sampled on cartridges containing XAD-4 at a fixed flow of 0.5 L/min, and determined by gas chromatography with a nitrogen-selective detector (GC-NPD). Results: Figure 1 shows the box plot density display of the 3 parameters before and after the smoking ban for all 7 sampling posts: dust, condensate and nicotine in air in μg/m3. Conclusion: Before the smoking ban, concentrations of respirable particles (RSP) were elevated (daily average 320 μg/m3, with peaks of more than 1000 μg/m3) compared with outdoor air values of between 22 and 30 μg/m3. The nicotine level was markedly higher (average 5.53 μg/m3, range 1.5 to 17.9 μg/m3).
Once the smoking ban inside the building was applied, a clear improvement in pollutant concentrations was observed. The dust concentration fell by a factor of about 3 (average 130 μg/m3, range 40 to 160 μg/m3) and nicotine by a factor of about 10 (average 0.53 μg/m3, range 0 to 1.69 μg/m3) compared with the levels found before the ban; the outdoor air RSP concentration was 22 μg/m3, still several times lower than the post-ban indoor level. Nicotine appears to be the best ETS tracer: it is free of interference and independent of location or season.
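The reduction factors reported above follow directly from the before/after mean concentrations given in the abstract; a quick arithmetic sketch:

```python
# Sketch: reduction factors from the mean concentrations in the abstract
# (all values in ug/m3).
rsp_before, rsp_after = 320.0, 130.0             # respirable particles
nicotine_before, nicotine_after = 5.53, 0.53     # nicotine

rsp_reduction = rsp_before / rsp_after           # ~2.5x, reported as ~3x
nicotine_reduction = nicotine_before / nicotine_after  # ~10x, as reported

print(f"RSP fell by a factor of {rsp_reduction:.1f}")
print(f"Nicotine fell by a factor of {nicotine_reduction:.1f}")
```

The nicotine figure matches the reported tenfold drop almost exactly, which supports the abstract's conclusion that nicotine is the most specific ETS tracer of the set.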
Abstract:
Excessive exposure to solar UV light is the main cause of skin cancers in humans. UV exposure depends on environmental as well as individual factors related to activity. Although outdoor occupational activities contribute significantly to the individual dose received, data on effective exposure are scarce and limited to a few occupations. A study was undertaken to assess effective short-term exposure among building workers and to characterize the influence of individual and local factors on exposure. The effective exposure of construction workers in a mountainous area in the southern part of Switzerland was investigated through short-term dosimetry (97 dosimeters). Three altitudes, of about 500, 1500 and 2500 m, were considered. Individual measurements over 20 working periods were performed using Spore film dosimeters at five body locations. The postural activity of the workers was recorded concomitantly, and static UV measurements were also performed. Effective exposure among building workers was high and exceeded occupational recommendations for all individuals at at least one body location. The mean daily UV dose was 11.9 SED (0.0-31.3 SED) on the plain, 21.4 SED (6.6-46.8 SED) at mid-mountain altitude and 28.6 SED (0.0-91.1 SED) at high-mountain altitude. Measured doses exhibited high variability between workers and anatomical locations, stressing the role of local exposure conditions and individual factors. Short-term effective exposure ranged between 0 and 200% of ambient irradiation, indicating the occurrence of intense, subacute exposures. A predictive irradiation model was developed to investigate the role of individual factors. Posture and orientation were found to account for at least 38% of the total variance of relative individual exposure, and to account for more of the total variance of effective daily exposures than altitude. Targeted sensitization actions through professional information channels and specific prevention messages are recommended.
Outdoor workers at altitude should also benefit from preventive medical examinations.
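The "0 to 200% of ambient irradiation" figure above expresses each body-site dose relative to the static (ambient) dose measured over the same period. A minimal sketch of that metric, using invented numbers (the body sites and doses below are hypothetical, not the study's data):

```python
# Sketch of relative exposure: body-site dose divided by ambient dose,
# expressed as a percentage. All values are hypothetical, in SED.
ambient_dose_sed = 20.0  # assumed ambient (static dosimeter) daily dose

body_doses_sed = {
    "forehead": 24.0,
    "shoulder": 18.0,
    "chest": 9.0,
    "forearm": 30.0,   # oriented toward the sun much of the day
    "back": 2.0,       # mostly shaded by posture
}

relative_exposure = {site: dose / ambient_dose_sed * 100.0
                     for site, dose in body_doses_sed.items()}

for site, pct in sorted(relative_exposure.items()):
    print(f"{site}: {pct:.0f}% of ambient")
```

Values above 100% (as for the forearm here) are possible because a tilted surface facing the sun can intercept more UV than the horizontal reference sensor, which is how short-term exposures in the study could reach 200% of ambient.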
Abstract:
In European countries and North America, people spend 80 to 90% of their time inside buildings and thus breathe indoor air. In Switzerland, special attention has been devoted to the 16 stations of the national network for the observation of atmospheric pollutants (NABEL). The results indicate a reduction in outdoor pollution over the last ten years. Given such a decrease in pollution, how can we explain an increase in disease? Indoor pollution may be the cause. Indoor contaminants that may create indoor air quality (IAQ) problems come from a variety of sources, including inadequate ventilation, temperature and humidity dysfunction, and volatile organic compounds (VOCs). The health effects of these contaminants are varied and can range from discomfort, irritation and respiratory diseases to cancer. Among such contaminants, environmental tobacco smoke (ETS) can be considered the most important in terms of both health effects and the engineering control of ventilation. To monitor indoor pollution, several selected ETS tracers can be used, including carbon monoxide (CO), carbon dioxide (CO2), respirable particles (RSP), condensate, nicotine, polycyclic aromatic hydrocarbons (PAHs) and nitrosamines. In this paper, some examples are presented of IAQ problems that have occurred following building renovation and energy-saving measures. Using industrial hygiene sampling techniques and focussing on selected priority pollutants used as tracers, various problems have been identified and solutions proposed.
Abstract:
The development of forensic intelligence relies on the expression of suitable models that better represent the contribution of forensic intelligence to the criminal justice system, policing and security. Such models assist in comparing and evaluating methods and new technologies, provide transparency and foster the development of new applications. Interestingly, strong similarities between two separate projects focusing on specific forensic science areas were recently observed. These observations led to the induction of a general model (Part I) that could guide the use of any forensic science case data from an intelligence perspective. The present article builds upon this general approach by focusing on decisional and organisational issues. It investigates the comparison process and evaluation system that lie at the heart of the forensic intelligence framework, advocating scientific decision criteria and a structured but flexible and dynamic architecture. These building blocks are crucial and clearly lie within the expertise of forensic scientists. However, they are only part of the problem. Forensic intelligence includes other blocks with their respective interactions, decision points and tensions (e.g. regarding how to guide detection and how to integrate forensic information with other information). Formalising these blocks identifies many questions and potential answers, and addressing these questions is essential for the progress of the discipline. Such a process requires clarifying the role and place of the forensic scientist within the whole process, and their relationship to other stakeholders.