845 results for declarative, procedural, and reflective (DPR) model
Abstract:
BACKGROUND: Workers with persistent disabilities after orthopaedic trauma may need occupational rehabilitation. Despite various risk profiles for non-return-to-work (non-RTW), no predictive model is available. Moreover, injured workers may have various origins (immigrant workers), which may affect either their return to work or their eligibility for research purposes. The aim of this study was to develop and validate a predictive model that estimates the likelihood of non-RTW after occupational rehabilitation using predictors that do not rely on the worker's background. METHODS: Prospective cohort study (3177 participants, native (51%) and immigrant workers (49%)) with two samples: a) a development sample with patients from 2004 to 2007, used for the Full and Reduced Models, and b) an external validation of the Reduced Model with patients from 2008 to March 2010. We collected patients' data and biopsychosocial complexity with an observer-rated interview (INTERMED). Non-RTW was assessed two years after discharge from rehabilitation. Discrimination was assessed by the area under the receiver operating characteristic curve (AUC) and calibration was evaluated with a calibration plot. The model was reduced with random forests. RESULTS: At 2 years, the non-RTW status was known for 2462 patients (77.5% of the total sample). The prevalence of non-RTW was 50%. The full model (36 items) and the reduced model (19 items) had acceptable discrimination performance (AUC 0.75, 95% CI 0.72 to 0.78 and 0.74, 95% CI 0.71 to 0.76, respectively) and good calibration. For the validation model, the discrimination performance was acceptable (AUC 0.73; 95% CI 0.70 to 0.77) and calibration was also adequate. CONCLUSIONS: Non-RTW may be predicted with a simple model constructed with variables independent of the patient's education and language fluency. This model is useful across all kinds of trauma to adjust for case mix, and it is applicable to vulnerable populations such as immigrant workers.
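The workflow sketched in this abstract (fit a full prognostic model, judge discrimination by the AUC, and reduce the item set with random forests) can be illustrated in a few lines of Python. This is a minimal sketch under assumptions, not the authors' code: the synthetic data, the 36-item design matrix and the logistic-regression form are stand-ins for the actual INTERMED items and modelling choices.

```python
# Hedged illustration of a full-model / reduced-model workflow with AUC checks.
# Synthetic data and a logistic model are assumptions, not the study's data or method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_items = 36                                   # hypothetical full item set
X = rng.normal(size=(2462, n_items))           # stand-in for INTERMED items and covariates
y = (X[:, :5].sum(axis=1) + rng.normal(size=2462) > 0).astype(int)   # non-RTW indicator

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Full model: all items
full = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
print("full-model AUC:", round(roc_auc_score(y_val, full.predict_proba(X_val)[:, 1]), 3))

# Reduced model: keep the 19 items ranked highest by random-forest importance
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_dev, y_dev)
keep = np.argsort(rf.feature_importances_)[::-1][:19]
reduced = LogisticRegression(max_iter=1000).fit(X_dev[:, keep], y_dev)
print("reduced-model AUC:", round(roc_auc_score(y_val, reduced.predict_proba(X_val[:, keep])[:, 1]), 3))
```

In a real validation exercise, calibration would be inspected alongside discrimination, for example with a calibration plot as described in the abstract.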
Abstract:
Background: The first Swiss federal Transplant Law finally came into force in July 2007, with the obligation to promote quality and efficiency in transplant procedures. The LODP was created to develop organ and tissue donation in the Latin area of Switzerland, covering seventeen hospitals (29% of the population). Methods: Each of the partner hospitals designated at least one Local Donor Coordinator (LDC), a member of the intensive care team trained in the organ donation (OD) process. The principal tasks of the LDCs are the introduction of OD procedures, the organisation of educational sessions for hospital staff and the execution of the Donor Action programme. The LODP has been operational since July 2009, when training of the LDCs was completed, the website and hotline were activated, and the attendance of Transplant Procurement Coordinators (TPCs) during the OD process was organised. Results: National and regional guidelines are accessible on the LODP website. The Hospital Attitude Survey obtained a 57% return rate. Many staff requested training, and sessions are now running in the partner hospitals. The Medical Record Review revealed an increase in the conversion rate from 3.5% to 4.5%. During the 5 years before the creation of the LODP, the average annual number of utilised donors was 31; an increase of 70% has since been observed. Conclusion: This clear progression in utilised donors over the past two years can be attributed to the fact that partner hospitals benefit from the various forms of support provided (hotline, website and attendance of TPCs). Despite the increase in OD within the LODP, Swiss donation rates remain low, on average 11.9 donors per million population. This successful model should be applied throughout Switzerland, but the crucial point is to obtain financial support.
Abstract:
The aim of the study was to examine the stages of building a company's web operations and the measurement of their success. The building process was studied with the help of a five-step model. The steps of the model are: assessment, strategy formulation, plan, blueprint and implementation. To complement the assessment and implementation phases, and in particular to support the measurement of the success of internet operations, the benefits of internet operations (CRM, communication, sales and distribution channel benefits from a marketing perspective) were discussed. To assist in evaluating the success of the operations, a stage model for internet operations was also presented. The stage model defines the storefront, dynamic, transaction and e-business stages. The study identified factors underlying the success of internet operations: high-quality content, attractiveness, entertainment value, informativeness, timeliness, personalisation, trust, interactivity, usability, convenience, loyalty, performance, responsiveness and the collection of user data. The metrics were divided into activity, behaviour and conversion metrics, and other metrics and success indicators were also presented. These elements of success and metrics were brought together in a new model for evaluating the success of internet operations. In the empirical part of the thesis, the presented theories were reflected against the web operations of ABB (within ABB, in particular ABB Stotz-Kontakt), with the help of document analysis and interviews. The empirical part illustrated the theories in practice and revealed the possibility of extending them. The model for building internet operations can also be used for developing web operations, and the stage model is also suitable for evaluating existing internet operations. Applying the metrics in practice, however, revealed a need for their further development and for additional research on the topic. They should also be linked more closely than before to the measurement of overall business success.
Abstract:
The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure and its profit model. Such an ontology is relevant because, until now, no model has expressed a company's global business logic from a purely business point of view. Previous models essentially take an organizational or process perspective or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and by enterprise and process modeling, are product, customer interface, infrastructure and finance. The ontology is validated by case studies and by a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis consists of a demonstration of the value of the ontology for business strategy and Information Systems (IS) alignment. Structure of this thesis: The dissertation is structured in nine parts: Chapter 1 presents the motivations of this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term and the concept of business models. It defines what is meant by business models in this dissertation and how they are situated in the context of the firm. In addition, this chapter outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology. In effect, it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology in a prototype tool: the Business Model Modelling Language BM2L. This is an XML-based description language that makes it possible to capture and describe the business model of a firm and has a large potential for further applications. Chapter 7 is about the evaluation of the business model ontology. The evaluation builds on a literature review, a set of interviews with practitioners and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology. The main areas of interest are the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, chapter 9 presents some conclusions.
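As a purely hypothetical illustration of capturing the four pillars in an XML document (the abstract does not give the actual BM2L schema), the short Python sketch below builds a minimal skeleton; every element name is invented for illustration.

```python
# Hypothetical XML skeleton for a business model built around the four pillars.
# The element names are illustrative assumptions, not the BM2L schema.
import xml.etree.ElementTree as ET

bm = ET.Element("businessModel", firm="ExampleCo")
for pillar in ("product", "customerInterface", "infrastructure", "finance"):
    ET.SubElement(bm, pillar)

ET.SubElement(bm.find("product"), "valueProposition").text = "Example value proposition"
ET.SubElement(bm.find("finance"), "profitModel").text = "Revenue streams minus cost structure"

print(ET.tostring(bm, encoding="unicode"))
```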
Abstract:
Conservation biology is commonly associated with the protection of small and endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as a paradigm species. This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas. Modelling habitat suitability and dispersal obstacles therefore also had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and results validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The application domain of the ENFA was then explored by means of a simulated species distribution.
It was compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited to spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software package named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper), and (2) running simulations. It allows the spread of an invading species to be studied across a complex landscape made up of areas of varying suitability and of dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to build complex, realistic models from raw data and benefit from intuitive user interfaces, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
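The kind of cellular automaton described above (cells with a carrying capacity, density-dependent local growth, and dispersal limited by cell-edge permeability) can be illustrated with a short sketch. This is a minimal illustration under simplifying assumptions, not the HexaSpace implementation: a square four-neighbour grid with wrap-around boundaries stands in for the hexagonal lattice, and every parameter value is invented.

```python
# Minimal density-dependent growth-and-dispersal automaton on a square grid
# (a stand-in for the hexagonal lattice); all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
shape = (20, 20)
K = rng.uniform(5.0, 50.0, size=shape)   # carrying capacity, e.g. from a habitat suitability map
perm = 0.1                               # fraction of each cell's density dispersing per step
r = 0.3                                  # intrinsic growth rate
N = np.zeros(shape)
N[10, 10] = 10.0                         # reintroduction point

def step(N):
    # density-dependent local growth with environmental noise
    growth = r * N * (1.0 - N / K) * rng.normal(1.0, 0.1, size=N.shape)
    N = np.clip(N + growth, 0.0, None)
    # dispersal: each cell sends an equal share to its four neighbours (wrap-around boundary)
    out = perm * N
    inflow = np.zeros_like(N)
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        inflow += np.roll(out / 4.0, shift, axis=axis)
    return N - out + inflow

for _ in range(50):
    N = step(N)
print("occupied cells after 50 steps:", int((N > 1.0).sum()))
```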
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher the past. Since it became known that continents move, geologists have been trying to retrieve their distribution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are part of a larger ensemble comprising, at once, oceanic and continental crust: the tectonic plates. Unfortunately, mainly due to technical and historical issues, this idea still does not receive a sufficient echo within the reconstruction community. However, we are convinced that, by applying specific methods and principles, we can move beyond the traditional "Wegenerian" point of view and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, in all necessary detail, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue.
Using assemblies of continents (referred to as "key assemblies") as anchors distributed across the whole scope of our study (ranging from the Eocene to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding oceanic material (symbolised by synthetic isochrons) to the major continents or removing it from them. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time, and always form an interconnected network in space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, and stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become an obstacle that was difficult to surmount. GIS (Geographic Information Systems) and geodatabases are modern computing tools specifically devoted to storing, analysing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in the ArcGIS software, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model (by greatly increasing the kinematic control of plate motions through plate velocity models). Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset we are now able to tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, throughout the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band going from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Essentially, this means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpret this as a potential secular influence of the Moon on plate motions. The oceanic realms are the cornerstone of our model and we paid particular attention to reconstructing them in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolised by synthetic isochrons whose ages are known. We also reconstruct the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
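The elementary operation behind such reconstructions, a finite rotation of a point on the sphere about an Euler pole, can be written compactly with Rodrigues' rotation formula. The sketch below is illustrative only: the pole, angle and point are made-up values and the code is not part of the PaleoDyn workflow.

```python
# Finite rotation of a surface point about an Euler pole (Rodrigues' formula).
# Pole, angle and point are hypothetical values for illustration.
import numpy as np

def to_cartesian(lat_deg, lon_deg):
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

def rotate(point, pole, angle_deg):
    k = pole / np.linalg.norm(pole)
    a = np.radians(angle_deg)
    return point * np.cos(a) + np.cross(k, point) * np.sin(a) + k * np.dot(k, point) * (1.0 - np.cos(a))

p = to_cartesian(10.0, -30.0)      # a point on a plate
pole = to_cartesian(60.0, 120.0)   # hypothetical Euler pole
q = rotate(p, pole, 25.0)          # finite rotation of 25 degrees
lat, lon = np.degrees(np.arcsin(q[2])), np.degrees(np.arctan2(q[1], q[0]))
print(f"rotated position: lat {lat:.2f}, lon {lon:.2f}")
```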
Abstract:
Environmentally harmful consequences of fossil fuel utilisation and the landfilling of wastes have increased energy producers' interest in alternative fuels such as wood fuels and refuse-derived fuels (RDFs). Fluidised bed technology, which allows the flexible use of a variety of different fuels, is commonly used at small- and medium-sized power plants of municipalities and industry in Finland. Since there is only one mass-burn plant currently in operation in the country and no intention to build new ones, the co-firing of pre-processed wastes in fluidised bed boilers has become the most widely applied waste-to-energy concept in Finland. The recently adopted EU Directive on the Incineration of Waste aims to mitigate environmentally harmful pollutants from waste incineration and from the co-incineration of wastes with conventional fuels. Apart from gaseous flue gas pollutants and dust, the emissions of toxic trace metals are limited. The implementation of the Directive's restrictions in Finnish legislation is expected to limit the co-firing of waste fuels, owing to the insufficient reduction of the regulated air pollutants in existing flue gas cleaning devices. Trace metal emission formation and reduction in the electrostatic precipitator (ESP), the condensing wet scrubber, the fabric filter and the humidification reactor were studied experimentally in full- and pilot-scale combustors utilising bubbling fluidised bed technology, and theoretically by means of reactor model calculations. The core of the model is a thermodynamic equilibrium analysis. The experiments were carried out with wood chips, sawdust and peat, and with their RDF blends. In all, ten different fuels or fuel blends were tested. Relatively high concentrations of trace metals in RDFs compared with those in wood fuels increased the trace metal concentrations in the flue gas after the boiler ten- to hundred-fold when RDF was co-fired with sawdust in a full-scale bubbling fluidised bed (BFB) boiler. In the case of peat, a smaller increase in trace metal concentrations was observed, owing to the higher initial trace metal concentrations of peat compared with sawdust. Despite the high removal rate of most of the trace metals in the ESP, the Directive emission limits for trace metals were exceeded in each of the RDF co-firing tests. The dominant trace metals in the flue gas after the ESP were Cu, Pb and Mn. In the condensing wet scrubber, the flue gas trace metal emissions were reduced below the Directive emission limits when RDF pellets were used as a co-firing fuel together with sawdust and peat. The high chlorine content of the RDFs enhanced mercuric chloride formation and hence mercury removal in the ESP and scrubber. Mercury emissions were already below the Directive emission limit for total Hg, 0.05 mg/Nm3, in the flue gas after the ESP in all full-scale co-firing tests. The pilot-scale experiments with a BFB combustor equipped with a fabric filter revealed that the fabric filter alone is able to reduce the trace metal concentrations, including mercury, in the flue gas during RDF co-firing to approximately the same level as during wood chip firing. Trace metal emissions below the Directive limits were easily reached even with a 40% thermal share of RDF co-fired with sawdust. Enrichment of trace metals in the submicron fly ash particle fraction due to RDF co-firing was not observed in the test runs where sawdust was used as the main fuel. The combustion of RDF pellets with peat caused an enrichment of As, Cd, Co, Pb, Sb and V in the submicron particle mode. Accumulation and release of trace metals in the bed material were examined by means of a bed material analysis, mass balance calculations and a reactor model. Lead, zinc and copper were found to accumulate in the bed material but also to be released from it into the combustion gases if the combustion conditions changed. The concentration of a trace metal in the combustion gases of the bubbling fluidised bed boiler was found to be the sum of trace metal fluxes from three main sources: (1) the flux from the burning fuel particle, (2) the flux from the ash in the bed, and (3) the flux from the active alkali metal layer on the sand (and ash) particles in the bed. The amount of chlorine in the system, the combustion temperature, the fuel ash composition and the saturation state of the bed material with regard to trace metals were found to be key factors affecting the release process. During the co-firing of waste fuels with variable amounts of, for example, ash and chlorine, it is extremely important to consider possible ongoing accumulation and/or release of trace metals in the bed when determining the flue gas trace metal emissions. If the state of the combustion process with regard to trace metal accumulation and/or release in the bed material is not known, emissions originating from the bed material rather than from the combustion of the fuel in question may be measured and reported.
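The three-source picture described above can be condensed into a trivial mass balance: the flue-gas concentration of a trace metal is the sum of the fluxes from the burning fuel, the bed ash and the alkali layer on the bed sand, divided by the flue-gas flow. The numbers in the sketch below are hypothetical and serve only to show the arithmetic.

```python
# Hypothetical mass-balance sketch of the three trace-metal sources named above.
flux_fuel_mg_h   = 120.0   # release from burning fuel particles
flux_bedash_mg_h = 30.0    # release from ash accumulated in the bed
flux_alkali_mg_h = 15.0    # release from the active alkali layer on bed sand (and ash) particles
flue_gas_Nm3_h   = 2500.0  # dry flue-gas flow

conc_mg_Nm3 = (flux_fuel_mg_h + flux_bedash_mg_h + flux_alkali_mg_h) / flue_gas_Nm3_h
print(f"flue-gas trace-metal concentration: {conc_mg_Nm3:.3f} mg/Nm3")
# A negative net bed flux would instead indicate ongoing accumulation in the bed material.
```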
Abstract:
Poly (ADP-ribose) polymerase 1 (PARP-1) is a constitutive enzyme, the major isoform of the PARP family, which is involved in the regulation of DNA repair, cell death, metabolism, and inflammatory responses. Pharmacological inhibitors of PARP provide significant therapeutic benefits in various preclinical disease models associated with tissue injury and inflammation. However, our understanding of the role of PARP activation in the pathophysiology of liver inflammation and fibrosis is limited. In this study we investigated the role of PARP-1 in liver inflammation and fibrosis using acute and chronic models of carbon tetrachloride (CCl4)-induced liver injury and fibrosis, a model of bile duct ligation (BDL)-induced hepatic fibrosis in vivo, and isolated liver-derived cells ex vivo. Pharmacological inhibition of PARP with structurally distinct inhibitors or genetic deletion of PARP-1 markedly attenuated CCl4-induced hepatocyte death, inflammation, and fibrosis. Interestingly, the chronic CCl4-induced liver injury was also characterized by mitochondrial dysfunction and dysregulation of numerous genes involved in metabolism. Most of these pathological changes were attenuated by PARP inhibitors. PARP inhibition not only prevented CCl4-induced chronic liver inflammation and fibrosis, but was also able to reverse these pathological processes. PARP inhibitors also attenuated the development of BDL-induced hepatic fibrosis in mice. In liver biopsies of subjects with alcoholic or hepatitis B-induced cirrhosis, increased nitrative stress and PARP activation were noted. CONCLUSION: The reactive oxygen/nitrogen species-PARP pathway plays a pathogenetic role in the development of liver inflammation, metabolic dysregulation, and fibrosis. PARP inhibitors are currently in clinical trials for oncological indications, and the current results indicate that liver inflammation and liver fibrosis may be additional clinical indications where PARP inhibition may be of translational potential.
Abstract:
The impact of personality and job characteristics on parental rearing styles was compared in 353 employees. Hypotheses concerning the relationships between personality and job variables were formulated in accordance with findings from past research and with Belsky's (1984) model. Structural equation nested models showed that Aggression-Hostility, Sociability and job Demand were predictive of the Rejection and Emotional Warmth parenting styles, providing support for some of the hypothesized relationships. The findings suggest a well-balanced association of personality variables with both parenting styles: Aggression-Hostility was positively related to Rejection and negatively to Emotional Warmth, whereas Sociability was positively related to Emotional Warmth and negatively related to Rejection. Personality dimensions explained a larger share of the variance in observed parenting styles. However, a model that considered both personality and job dimensions as antecedent variables of parenting was the best representation of the observed data, as both systems play a role in the prediction of parenting behavior.
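As a rough illustration of the reported pattern (not the authors' nested structural equation models), the sketch below fits two ordinary regressions on synthetic data whose generating coefficients simply encode the signs described above; all names and values are assumptions.

```python
# Two OLS regressions standing in for the personality/job -> parenting paths.
# Synthetic data; the coefficients only mimic the signs reported in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 353
agg, soc, demand = rng.normal(size=(3, n))
rejection = 0.5 * agg - 0.3 * soc + 0.2 * demand + rng.normal(scale=0.8, size=n)
warmth = -0.3 * agg + 0.5 * soc - 0.1 * demand + rng.normal(scale=0.8, size=n)

X = sm.add_constant(np.column_stack([agg, soc, demand]))
for name, y in [("Rejection", rejection), ("Emotional Warmth", warmth)]:
    fit = sm.OLS(y, X).fit()
    print(name, "coefficients (const, Agg-Host, Sociability, Demand):", np.round(fit.params, 2))
```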
Abstract:
This paper describes the main features and present results of MPRO-Spanish, a parser for morphological and syntactic analysis of unrestricted Spanish text developed at the IAI. The parser makes direct use of X-phrase structure rules to handle a variety of patterns from derivational morphology and syntactic structure. Both analyses, morphological and syntactic, are realised by two successive modules. One module analyses and disambiguates the source words at the morphological level, while the other consists of a series of programs and a deterministic, procedural and explicit grammar. The article explains the main features of MPRO and summarises experiments on some of its applications, some of which, such as monolingual and bilingual term extraction, are still being implemented, while others, such as indexing, need further work. The results and applications obtained so far with simple and relatively complex sentences give us grounds to believe in its reliability.
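As a toy illustration of the two-module architecture described above (in no way the MPRO implementation), the sketch below runs a tiny lexicon-plus-suffix morphological stage followed by a single phrase-structure rule; the lexicon, suffix rules and rule are invented for illustration.

```python
# Toy two-stage pipeline: morphological tagging, then a simple phrase-structure rule.
# Lexicon, suffix rules and the DET+N -> NP rule are illustrative assumptions.
SUFFIX_RULES = [("ción", "N"), ("mente", "ADV"), ("ar", "V")]
LEXICON = {"el": "DET", "la": "DET", "analiza": "V", "texto": "N"}

def morphological_module(tokens):
    """Assign one part-of-speech tag per token: lexicon first, then suffix rules, default N."""
    tagged = []
    for tok in tokens:
        tag = LEXICON.get(tok)
        if tag is None:
            tag = next((t for suffix, t in SUFFIX_RULES if tok.endswith(suffix)), "N")
        tagged.append((tok, tag))
    return tagged

def syntactic_module(tagged):
    """Group determiner + noun into NP; other tags pass through unchanged."""
    tags = [t for _, t in tagged]
    phrases, i = [], 0
    while i < len(tags):
        if tags[i] == "DET" and i + 1 < len(tags) and tags[i + 1] == "N":
            phrases.append("NP")
            i += 2
        else:
            phrases.append(tags[i])
            i += 1
    return phrases

tagged = morphological_module("el analizador analiza el texto".split())
print(tagged)
print(syntactic_module(tagged))
```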
Abstract:
The use of body percussion through the BAPNE method in neurorehabilitation offers the possibility of studying the development of motor skills, attention, coordination, memory and social interaction in patients with neurological diseases. The experimental protocol was carried out on 52 patients with severe acquired brain injury. Patients were selected on the basis of cut-off scores in standard neuropsychological tests of sustained, divided and alert attention; at least one hemisoma intact; cut-off scores in standard tests of procedural and semantic memory; and intact eyesight, hearing and speech. The first group of patients followed the BAPNE protocol together with traditional rehabilitation activities. The control group continued to perform exclusively cognitive and neuromotor rehabilitation according to traditional protocols. At 6 months after administration of the protocol, a re-test is planned to assess whether the rehabilitation effects obtained, if present, have been maintained. The experimentation is carried out for 10 weeks following the BAPNE method protocol at the Roboris Foundation in Rome.
Abstract:
Formation of nanosized droplets/bubbles from a metastable bulk phase is connected to many unresolved scientific questions. We analyze the properties and stability of multicomponent droplets and bubbles in the canonical ensemble, and compare with single-component systems. The bubbles/droplets are described on the mesoscopic level by square gradient theory. Furthermore, we compare the results to a capillary model which gives a macroscopic description. Remarkably, the solutions of the square gradient model, representing bubbles and droplets, are accurately reproduced by the capillary model except in the vicinity of the spinodals. The solutions of the square gradient model form closed loops, which shows the inherent symmetry and connected nature of bubbles and droplets. A thermodynamic stability analysis is carried out, where the second variation of the square gradient description is compared to the eigenvalues of the Hessian matrix in the capillary description. The analysis shows that it is impossible to stabilize arbitrarily small bubbles or droplets in closed systems and gives insight into metastable regions close to the minimum bubble/droplet radii. Despite the large difference in complexity, the square gradient and the capillary model predict the same finite threshold sizes and very similar stability limits for bubbles and droplets, both for single-component and two-component systems.
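For mechanical equilibrium, the macroscopic capillary description referred to above reduces to the Young-Laplace relation, which directly gives the critical bubble/droplet radius. The values of surface tension and pressure difference in the sketch below are hypothetical, not results of the square gradient calculations.

```python
# Young-Laplace estimate of the critical radius: dp = 2*gamma/r, so r* = 2*gamma/dp.
# gamma and dp below are hypothetical illustration values.
gamma = 0.020   # surface tension, N/m
dp = 2.0e5      # pressure difference across the curved interface, Pa

r_critical = 2.0 * gamma / dp
print(f"critical radius: {r_critical * 1e9:.1f} nm")
```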
Abstract:
When trade is conducted between different continents, one encounters foreign cultures and different ways of doing business. This study focuses on the work of Finnish businessmen and businesswomen in the United Arab Emirates from the perspective of cultural differences. The aim is to describe the effects of cultural differences on the trading process and to identify the problems encountered during this process, and thereby to produce knowledge that can be used to reduce problems caused by cultural differences in the future. The theoretical background of the study is based on Hofstede's dimensions of cultural variation and Ting-Toomey's model of the cultural identity negotiation process. Based on these models, a model of intercultural encounter was created for this study. The research questions are: 1) How do Finnish businesspeople describe Arab business culture, and how does the other party describe its own business culture? 2) What is the trading process between Finns and Arabs like? 3) What kinds of problems are encountered when Finns and Arabs trade with each other? The study is an ethnographic, qualitative interview study (n=12). The interviews were conducted mostly in the United Arab Emirates and partly in Finland. Different cultural backgrounds are visible in trade. A culture that is collectivist and masculine, grounded in the Islamic religion and only to a small extent in an explicit linguistic code, is reflected in the trading process. The importance of friendship, relatives and other networks is emphasised. Communication involving body language, facial expressions and gestures differs and can cause misunderstandings. The conception of time and contract practices also differ from the Finnish ones. All of these can cause problems in the trading process. In addition to professional skill, patience and flexibility emerged as the most important factors in dealing with problems. Problems can be reduced by careful preparation before going abroad; another way is to learn the local culture and its characteristics through ongoing work experience.
Abstract:
This article examines the mainstream categorical definition of coreference as "identity of reference." It argues that coreference is best handled when identity is treated as a continuum, ranging from full identity to non-identity, with room for near-identity relations to explain currently problematic cases. This middle ground is needed to account for those linguistic expressions in real text that stand in relations that are neither full coreference nor non-coreference, a situation that has led to contradictory treatment of cases in previous coreference annotation efforts. We discuss key issues for coreference such as conceptual categorization, individuation, criteria of identity, and the discourse model construct. We redefine coreference as a scalar relation between two (or more) linguistic expressions that refer to discourse entities considered to be at the same granularity level relevant to the linguistic and pragmatic context. We view coreference relations in terms of mental space theory and discuss a large number of real life examples that show near-identity at different degrees.
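A minimal data-structure sketch (not taken from the article) of treating coreference as a scalar relation: each link between two mentions carries a degree of identity in [0, 1] rather than a binary coreferent/non-coreferent label. The mentions and the score below are invented examples.

```python
# Scalar coreference link: 1.0 = full identity, 0.0 = non-identity, values in between = near-identity.
from dataclasses import dataclass

@dataclass
class Mention:
    text: str
    span: tuple  # (start, end) character offsets in the document

@dataclass
class CoreferenceLink:
    left: Mention
    right: Mention
    identity: float  # degree of identity on the full-identity .. non-identity continuum

m1 = Mention("the company", (0, 11))
m2 = Mention("the restructured company", (57, 81))
link = CoreferenceLink(m1, m2, identity=0.7)   # near-identity rather than strict identity
print(f"{link.left.text!r} ~ {link.right.text!r}: identity {link.identity}")
```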
Abstract:
This study explored ethnic identity among 410 mestizo students who were attending one of three universities, which varied in their ethnic composition and their educational model. One of these universities was private and had mostly mestizo students, as did the public one. The third educational context, also public, had an intercultural model of education, and its students were a mix of mestizo and indigenous students. The Multigroup Ethnic Identity Measure (MEIM) was administered to the students in order to compare their scores on ethnic identity and its components: affirmation, belonging or commitment, and exploration. Principal components factor analysis with varimax rotation and tests of mean group differences were performed. The results showed significant differences between the studied groups. Scores on ethnic identity and its components were significantly higher among the mestizo group from the university with the intercultural model of education than among mestizos from the public and private universities of the same region. Implications of these findings for education are considered, as are the strengths and limitations of this research.
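In the spirit of the analysis described above (a varimax-rotated factor analysis of the item scores followed by tests of mean group differences across the three university contexts), the sketch below runs on synthetic data; the item structure, group coding and scores are assumptions rather than the MEIM data.

```python
# Varimax-rotated factor analysis plus a one-way ANOVA across three groups (synthetic data).
import numpy as np
from scipy.stats import f_oneway
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n, n_items = 410, 12
items = rng.normal(size=(n, n_items)) + rng.normal(size=(n, 1))   # shared latent factor
group = rng.integers(0, 3, size=n)   # 0 = private, 1 = public, 2 = intercultural (hypothetical coding)

fa = FactorAnalysis(n_components=2, rotation="varimax")
scores = fa.fit_transform(items)
factor1 = scores[:, 0]   # treat the first rotated factor as the ethnic-identity score

print("ANOVA across the three universities:",
      f_oneway(factor1[group == 0], factor1[group == 1], factor1[group == 2]))
```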