784 results for Green ITC management factors
Abstract:
In atherosclerotic renal artery disease, outcome depends less on the degree of the stenosis itself than on the degree of renal parenchymal disease beyond it. The determining factors are the control of the patient's blood pressure, the improvement of renal function and the benefit to the cardiovascular system. Besides indispensable medical treatment, revascularisation by angioplasty may be indicated. This procedure, with or without a vascular stent, often yields satisfactory angiographic results. Surgical revascularisation is recommended only in the case of extensive or complex atherosclerotic lesions of the aorta, or of an abdominal aortic aneurysm. Although the frequency of restenosis after stent angioplasty remains extremely low, the risk of cholesterol emboli due to diffuse atherosclerotic lesions of the abdominal aorta must be considered at each aortic catheterisation. The therapeutic approach to atherosclerotic renal artery disease must be dictated by the patient's overall cardiovascular risk factors and by the threat to target organs. Blood pressure control and the preservation of renal function must be integrated into the decision algorithm, together with the possible risks of carrying out a revascularisation procedure. Finally, in many situations renal angioplasty should be integrated into the overall management of atherosclerotic vascular disease and should be part of the medical treatment. Several questions remain open: at what point should an atherosclerotic renal artery stenosis be considered critical, and which procedure should be considered for which patient? The purpose of this review is to propose a decision tool for individualised treatment in the light of results from randomised controlled studies.
Abstract:
The purpose of this bachelor's thesis was to chart scientific research articles in order to present the factors contributing to medication errors made by nurses in a hospital setting, and to introduce methods to prevent medication errors. Additionally, international and Finnish research was combined and the findings were reflected in relation to the Finnish health care system. A literature review of 23 scientific articles was conducted. Data were searched systematically in the CINAHL, MEDIC and MEDLINE databases, as well as manually. The literature was analysed and the findings combined using inductive content analysis. The findings revealed that both organisational and individual factors contributed to medication errors. High workload, communication breakdowns, an unsuitable working environment, distractions and interruptions, and similar medication products were identified as organisational factors. Individual factors included nurses' inability to follow protocol, inadequate knowledge of medications and the personal qualities of the nurse. Developing and improving the physical environment, error reporting and medication management protocols were emphasised as methods to prevent medication errors. Investing in the staff's competence and well-being was also identified as a prevention method. The number of Finnish articles was small, and therefore the applicability of the findings to Finland is difficult to assess. However, the findings seem to fit the Finnish health care system relatively well. Further research is needed to identify the factors that contribute to medication errors in Finland; this is a necessity for developing prevention methods that fit into the Finnish health care system.
Beyond EA Frameworks: Towards an Understanding of the Adoption of Enterprise Architecture Management
Abstract:
Enterprise architectures (EA) are considered promising approaches to reduce the complexities of growing information technology (IT) environments while keeping pace with an ever-changing business environment. However, the implementation of enterprise architecture management (EAM) has proven difficult in practice. Many EAM initiatives face severe challenges, as demonstrated by the low usage level of enterprise architecture documentation and enterprise architects' lack of authority to enforce EAM standards and principles. These challenges motivate our research. Based on three field studies, we first analyze EAM implementation issues that arise when EAM is started as a dedicated and isolated initiative. Following a design-oriented paradigm, we then suggest a design theory for architecture-driven IT management (ADRIMA) that may guide organizations in successfully implementing EAM. This theory summarizes prescriptive knowledge related to embedding EAM practices, artefacts and roles in the existing IT management processes and organization.
Abstract:
The study herein discusses research aimed at elucidating the factors that contribute to a business' ability to maintain high growth. The database from the Iberian Balance Sheet Analysis System (SABI, from its initials in Spanish) was used to identify 250 industrial Catalonian businesses with high growth during 2004-2007. These companies participated in a survey on strategies and management practices; in 2013, they were re-analyzed to investigate the factors that contributed to continued growth for certain companies. Through diverse statistical techniques, business policies related to quality, innovation, internationalization and finance were shown to influence business growth and sustainability over time. High-growth businesses have been studied throughout the world, but this is the first study to investigate the evolution of businesses after a high-growth phase.
Abstract:
The objective of this master's thesis is to examine how item data management can improve cost efficiency in a project-driven supply chain. The case company is Konecranes Heavy Lifting Oy, a subsidiary of Konecranes Oyj. Item data management is closely related to product data management. The theoretical part discusses the factors of the supply chain environment, the challenges of modularity and customer specificity, and the effects of the information flow on different functions. The company part compares two group-level business areas in terms of their strategic choices, product modularity, and the item information flowing through the order-delivery process. As its outcome, the thesis provides guidelines for consolidating the item base, identifying and defining strategic items, managing items, and placing master data within the information system environment.
Abstract:
This thesis examines the motives, challenges and success factors that affect a value-adding business network. The work investigates how partner networks are formed and which factors determine whether the cooperation continues or not. Motives for partnership were studied from the literature and by analysing the case presented in the thesis. The thesis also discusses the partnership life cycle, which is not brought up in the reviewed literature. The case presented in the thesis was evaluated by sending a questionnaire to the people involved in it; after the questionnaires were sent, interviews were arranged with the respondents. The conclusions are largely based on the discussions held with the interviewees. It turned out that one of the most important goals of a value-creating partner network is to achieve continuity in its business. Only a long-term partnership can achieve significant advantages in the market. It is therefore important, already when selecting a partner, to pay attention to the long-term continuity of the partnership. In a business network, the profits generated by the partnership and their distribution form the single most important area. Essential for long-term continuity is the ability, already at partner selection, to assess how the profits generated by the partnership will be shared fairly and whether the business created by the partnership has continuity. In order to reach the goals set for the partner network, it is important to plan the governance of the network at the operational level as well. It is also important to share the common goals set for the network within the organisations. If cooperation between senior and operational management is insufficient, it substantially hinders the achievement of the set goals. Sharing information at an early stage commits the different stakeholders better to the common goals.
Abstract:
BACKGROUND AND OBJECTIVE: The Lausanne Stroke Registry has included, since 1979, all patients admitted to the Department of Neurology of the Lausanne University Hospital with a diagnosis of first clinical stroke. Using the Lausanne Stroke Registry, we aimed to determine trends in risk factors, causes, localization and in-hospital mortality over 25 years in hospitalized stroke patients. METHODS: We assessed temporal trends in stroke patient characteristics across the following consecutive periods: 1979-1987, 1988-1995 and 1996-2003. Age-adjusted cardiovascular risk factors, etiologies, stroke localizations and mortality were compared between the three periods. RESULTS: Overall, 5,759 patients were included. Age differed significantly among the analyzed periods (p < 0.001), with an increasing proportion of older patients over time. After adjustment for age, hypercholesterolemia increased (p < 0.001), in contrast to cigarette smoking (p < 0.001), hypertension (p < 0.001) and diabetes and hyperglycemia (p < 0.001). In patients with ischemic strokes, there were significant changes in the distribution of causes, with an increase in cardioembolic strokes (p < 0.001), and in the localization of strokes, with an increase in entire middle cerebral artery (MCA) and posterior circulation strokes together with a decrease in superficial MCA strokes (p < 0.001). In patients with hemorrhagic strokes, thalamic localizations increased, whereas the proportion of striatocapsular hemorrhages decreased (p = 0.022). Except in the older patient group, the mortality rate decreased. CONCLUSIONS: This study shows major trends in the characteristics of stroke patients admitted to a department of neurology over a 25-year span, which may result from referral biases, the development of acute stroke management and possibly the evolution of cerebrovascular risk factors.
Abstract:
OBJECTIVES: Resuscitation in severe head injury may be detrimental when given with hypotonic fluids. We evaluated the effects of lactated Ringer's solution (sodium 131 mmol/L, 277 mOsm/L) compared with hypertonic saline (sodium 268 mmol/L, 598 mOsm/L) in severely head-injured children over the first 3 days after injury. DESIGN: An open, randomized, prospective study. SETTING: A 16-bed pediatric intensive care unit (ICU) (level III) at a university children's hospital. PATIENTS: A total of 35 consecutive children with head injury. INTERVENTIONS: Thirty-two children with Glasgow Coma Scores of <8 were randomly assigned to receive either lactated Ringer's solution (group 1) or hypertonic saline (group 2). Routine care was standardized and included the following: head positioning at 30 degrees; normothermia (96.8 to 98.6 degrees F [36 to 37 degrees C]); analgesia and sedation with morphine (10 to 30 microg/kg/hr), midazolam (0.2 to 0.3 mg/kg/hr), and phenobarbital; volume-controlled ventilation (PaCO2 of 26.3 to 30 torr [3.5 to 4 kPa]); and optimal oxygenation (PaO2 of 90 to 105 torr [12 to 14 kPa], oxygen saturation of >92%, and hematocrit of >0.30). MEASUREMENTS AND MAIN RESULTS: Mean arterial pressure and intracranial pressure (ICP) were monitored continuously and documented hourly and at every intervention. The means of every 4-hr period were calculated, and serum sodium concentrations were measured at the same time. An ICP of >15 mm Hg was treated with a predefined sequence of interventions, and complications were documented. There was no difference between the groups with respect to age, male/female ratio, or initial Glasgow Coma Score. In both groups, there was an inverse correlation between serum sodium concentration and ICP (group 1: r = -.13, r2 = .02, p < .03; group 2: r = -.29, r2 = .08, p < .001) that disappeared in group 1 and increased in group 2 (group 1: r = -.08, r2 = .01, NS; group 2: r = -.35, r2 = .12, p < .001). The correlation between serum sodium concentration and cerebral perfusion pressure (CPP) became significant in group 2 after 8 hrs of treatment (r = .2, r2 = .04, p = .002). Over time, ICP and CPP did not significantly differ between the groups. However, to keep ICP at <15 mm Hg, group 2 patients required significantly fewer interventions (p < .02). Group 1 patients received less sodium (8.0 +/- 4.5 vs. 11.5 +/- 5.0 mmol/kg/day, p = .05) and more fluid on day 1 (2850 +/- 1480 vs. 2180 +/- 770 mL/m2, p = .05). They also had a higher frequency of acute respiratory distress syndrome (four vs. 0 patients, p = .1) and of more than two complications (six vs. 1 patient, p = .09). Group 2 patients had significantly shorter ICU stays (11.6 +/- 6.1 vs. 8.0 +/- 2.4 days; p = .04) and shorter mechanical ventilation times (9.5 +/- 6.0 vs. 6.9 +/- 2.2 days; p = .1). The survival rate and duration of hospital stay were similar in both groups. CONCLUSIONS: Treatment of severe head injury with hypertonic saline is superior to treatment with lactated Ringer's solution. An increase in serum sodium concentration significantly correlates with lower ICP and higher CPP. Children treated with hypertonic saline require fewer interventions, have fewer complications, and stay a shorter time in the ICU.
Abstract:
This paper reviews the literature on managerially actionable new product development (NPD) success factors and summarises the field in a classic managerial framework. Because of the varying quality, breadth and scope of the field, the review contains only post-1980 studies of tangible product development that are of a rigorous scientific standard. Success is interpreted as commercial success. The field has gained insight into a broad set of factors that vary in scope, abstraction and context. The main areas contributing to NPD success are top management support, exhibited through resource allocation and through communicating the strategic importance of NPD within the organisation. The right projects need to be selected for investment at the beginning of the process and should be aligned with the organisation's internal competencies and the external environment. The NPD process should use cross-functional teams and a competent project champion. Marketing research competency is crucial, as an understanding of the market, customers and competitors is repeatedly highlighted. Product launch competency was also consistently shown to be important. In terms of controlling the NPD process, strict project gates are required to maintain control.
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software, named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and colonisation of new areas, so habitat suitability and obstacles to dispersal had to be modelled as well. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, results validation and further processing; a module also allows mapping of dispersal barriers and corridors. The application domain of ENFA was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information were finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed to achieve two functions: 1° calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); 2° running simulations. It allows studying the spread of an invading species across a complex landscape made of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to proceed from low-level data to a complex, realistic model, and as they benefit from an intuitive user interface, they may have many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
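To make the modelling approach behind SIM-Ibex more concrete, the following is a minimal sketch of an age-structured Leslie-matrix projection with density dependence, environmental stochasticity and a culling quota. It is not the authors' implementation, it omits the sex structure described above, and all parameter values (fertility, survival, carrying capacity K, culling quota) are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters only, not values from the study.
fertility = np.array([0.0, 0.2, 0.4, 0.4])  # assumed per-age-class fecundities
survival = np.array([0.70, 0.85, 0.90])     # assumed survival from class i to i+1
K = 500.0                                   # assumed carrying capacity

def leslie_matrix(fert, surv):
    """Build a Leslie matrix: fecundities on the top row, survival on the sub-diagonal."""
    n = len(fert)
    L = np.zeros((n, n))
    L[0, :] = fert
    L[np.arange(1, n), np.arange(n - 1)] = surv
    return L

def project(pop, years, cull=20.0, rng=np.random.default_rng(0)):
    """Project the age-class abundance vector over `years` annual time steps."""
    for _ in range(years):
        noise = rng.normal(1.0, 0.1)             # environmental stochasticity
        density = max(0.0, 1.0 - pop.sum() / K)  # crude density dependence on births
        L = leslie_matrix(fertility * density * noise,
                          np.clip(survival * noise, 0.0, 1.0))
        pop = L @ pop
        total = pop.sum()
        if total > 0:
            # Remove a culling quota, spread proportionally across age classes.
            pop = np.maximum(pop - cull * pop / total, 0.0)
    return pop

print(project(np.array([100.0, 80.0, 60.0, 40.0]), years=10))
```

Running the projection repeatedly with different culling quotas is, in spirit, what a management-support tool such as SIM-Ibex does when comparing regulation strategies.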
Abstract:
BACKGROUND: Epidemiological studies focusing on first-ever seizures have been carried out mainly in community-based populations. However, since hospital populations may display different clinical features, we prospectively analysed patients with a first-ever seizure in a hospital-based population to evaluate prognosis and the role of complementary investigations (neurological examination, brain imaging, blood tests, EEG) in the decision to administer antiepileptic drugs (AED). METHODS: Over one year, we recruited 177 consecutive adult patients with a first seizure acutely evaluated in our hospital. During six months of follow-up, data relating to AED treatment, seizure recurrence and death were collected for each patient. RESULTS: Neurological examination was abnormal in 72.3% of cases, neuroimaging in 54.8% and biochemical tests in 57.1%. The electroencephalogram (EEG) showed epileptiform features in 33.9%. Toxicity represented the most common aetiology. AED treatment was prescribed in 51% of patients. Seizure recurrence at six months involved 31.6% of the patients completing the follow-up; mortality was 17.8%. Statistical analysis showed that brain CT, EEG and neurological examination were independent predictive factors for AED administration, but only the CT scan was associated with outcome. CONCLUSIONS: Patients evaluated acutely for a first-ever seizure in a hospital setting have severe underlying clinical conditions apparently related to their relatively poor prognosis. Neuroimaging represents the most important paraclinical test in predicting both treatment administration and outcome. Keywords: first-ever seizure, aetiology, prognosis, recurrence, antiepileptic medication, hospital population.
Abstract:
The main research problem of this thesis is to find out the means of promoting the recovery of packaging waste generated in the fast food industry. The recovery of packaging waste generated in the fast food industry is demanded by packaging waste legislation and expected by the public. The means are revealed by the general factors influencing the recovery of packaging waste, analysed through a multidisciplinary literature review and a case study focusing on the packaging waste management of McDonald's Oy operating in Finland. The existing solid waste infrastructure does not promote the recovery of packaging waste generated in the fast food industry. The theoretical recovery rate of the packaging waste is high, 93%, while the actual recovery rate is only 29%, consisting of secondary packaging manufactured from cardboard. The total recovery potential of packaging waste is 64%, amounting to 1,230 tonnes of recoverable packaging waste. The achievable recovery potential of 33%, equalling 647 tonnes of packaging waste, could be recovered but is not, mainly because of non-working waste management practices. The theoretical recovery potential of 31%, equalling 583 tonnes of packaging waste, cannot be recovered within the existing solid waste infrastructure because of the unclear status of commercial waste, the improper operation of producer organisations, and municipal autonomy. The sorting experiment indicated that it is possible to reach the achievable recovery potential within the existing solid waste infrastructure; this is promoted by waste-producer-oriented waste management practices. The theoretical recovery potential can be reached by increasing the consistency of the solid waste infrastructure through governmental action.
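For readers who want to retrace the figures above, here is a minimal sketch of the recovery-rate arithmetic. The tonnages and percentages are taken from the abstract; the total quantity of packaging waste is back-calculated from them and is therefore only an approximation.

```python
# Figures reported in the abstract.
theoretical_rate = 0.93    # share that could be recovered in principle
actual_rate = 0.29         # share actually recovered (cardboard secondary packaging)
achievable_t = 647.0       # tonnes recoverable with better waste management practices
infrastructural_t = 583.0  # tonnes blocked by the existing solid waste infrastructure

# Unrealised recovery potential and its split into the two components.
total_potential_t = achievable_t + infrastructural_t                  # 1,230 tonnes
total_waste_t = total_potential_t / (theoretical_rate - actual_rate)  # back-calculated estimate

print(f"Unrealised recovery potential: {theoretical_rate - actual_rate:.0%}")
print(f"Total recovery potential: {total_potential_t:.0f} tonnes")
print(f"Estimated total packaging waste: {total_waste_t:.0f} tonnes")
```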
Abstract:
In Spain, pests in urban green spaces require a considerable control effort every year, and the application of pesticides is almost exclusively the control strategy employed, with the attendant risks to people, animals and the environment. Integrated pest management is an alternative, but applying it requires in-depth knowledge of the pest species involved, their biology, population dynamics, damage, sampling methodologies and possible control systems. This information is difficult to find in Spain because few published studies on pests of urban green areas have been carried out in a systematic way over the medium or long term. This article analyses the conditions for integrated pest management in urban green spaces and presents an example, based on studies carried out in the city of Lleida during the period 2001-2003, of how the basic information needed to implement possible integrated pest management programmes can be obtained.
Abstract:
BACKGROUND: The impact of the Integrated Management of Childhood Illness (IMCI) strategy has been less than anticipated because of poor uptake. Electronic algorithms have the potential to improve the quality of health care in children. However, feasibility studies about the use of electronic protocols on mobile devices over time are limited. This study investigated the constraining and facilitating factors that influence the uptake of a new electronic Algorithm for Management of Childhood Illness (ALMANACH) among primary health workers in Dar es Salaam, Tanzania. METHODS: A qualitative approach was applied using in-depth interviews and focus group discussions with a total of 40 primary health care workers from 6 public primary health facilities in the three municipalities of Dar es Salaam, Tanzania. Health workers' perceptions of factors facilitating or constraining the uptake of the electronic ALMANACH were identified. RESULTS: In general, the ALMANACH was assessed positively. The majority of the respondents felt comfortable using the devices and stated that patients' trust was not affected. Most health workers said that the ALMANACH simplified their work, reduced antibiotic prescription and gave correct classifications and treatments for common causes of childhood illness. Few health workers reported technical challenges in using the devices or complained about difficulties in typing. The majority of the respondents stated that the devices increased the consultation duration compared with routine practice. In addition, health system barriers such as lack of staff, lack of medicines and lack of financial motivation were identified as key reasons for the low uptake of the devices. CONCLUSIONS: The ALMANACH, built on electronic devices, was perceived to be a powerful and useful tool. However, health system challenges influenced the uptake of the devices in the selected health facilities.