Abstract:
Background: Beryllium (Be) is increasingly used worldwide for numerous industrial applications. Occupational exposure to Be may lead to Be sensitization (BeS), a CD4-mediated immune response. BeS may progress to chronic beryllium disease (CBD), a granulomatous lung disorder closely resembling sarcoidosis. The recognition of CBD requires detection of Be exposure in the occupational history and detection of BeS in blood or BAL lymphocytes. Since methods for CBD detection are not routinely available in Switzerland, we hypothesized that CBD cases are not recognized but misdiagnosed as sarcoidosis. Objective: To present an ongoing Swiss study screening patients with sarcoidosis for Be exposure, BeS, and CBD. Methods: Both a prospective and a retrospective cohort are being studied. In the prospective cohort, the main steps include: 1) recruitment of 100 consecutive patients with newly diagnosed pulmonary sarcoidosis at 2 centers (Lausanne, Bern); 2) screening for possible occupational Be exposure by self-administered patient questionnaire; 3) standardized detailed occupational interview and clinical visit by an occupational health specialist. If step 3 is positive, then 4) blood and BAL sampling for detection of BeS by a specifically developed Elispot assay and CFSE flow cytometry, with subsequent comparison to the classical Be lymphocyte proliferation test. If step 4 is positive, then 5) review of medical records and diagnostic revision from sarcoidosis to CBD; and 6) appropriate measures for exposure cessation and case reporting to SUVA as an occupational disease. The retrospective cohort will include 400 patients with previously diagnosed pulmonary sarcoidosis, either treated or untreated, recruited through the SIOLD Registries. Steps 2 to 5 will be performed as above, except that a) the study ends after step 2 if the screening questionnaire does not reveal Be exposure, and b) step 4 is done on a blood sample only (BAL not needed). Current status: The self-administered screening questionnaire and tools for the standardized occupational interview have been developed. BeS testing has been implemented and is undergoing validation. Inclusions in the prospective phase have started at both study sites. The retrospective phase is in preparation. Conclusion: The current study status demonstrates the technical feasibility of the project. The prospective phase of this study is funded by the SUVA. The SIOLD Registries are supported by the Swiss Pulmonary League.
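The conditional, stepwise logic of the screening protocol can be summarised as a small decision flow. The sketch below is purely illustrative: the patient-record fields are hypothetical placeholders and do not correspond to any actual study software.

```python
# Illustrative sketch of the stepwise screening logic described above.
# Patient-record field names are hypothetical placeholders, not study software.

def screen_patient(patient: dict, prospective: bool = True) -> str:
    """Walk one sarcoidosis patient through screening steps 2-6."""
    # Step 2: self-administered questionnaire for possible occupational Be exposure.
    if not patient["questionnaire_positive"]:
        return "no suspected Be exposure"  # retrospective cohort stops here
    # Step 3: standardized occupational interview and clinical visit.
    if not patient["interview_confirms_exposure"]:
        return "exposure not confirmed"
    # Step 4: BeS testing (blood + BAL prospectively, blood only retrospectively).
    sample = "blood+BAL" if prospective else "blood"
    if not patient["bes_test_positive"]:
        return f"no Be sensitization ({sample} sample)"
    # Steps 5-6: diagnostic revision and reporting to SUVA as occupational disease.
    return "diagnosis revised to CBD; case reported to SUVA"

# Example: exposure confirmed at interview, but the BeS test is negative.
example = {"questionnaire_positive": True,
           "interview_confirms_exposure": True,
           "bes_test_positive": False}
print(screen_patient(example))  # -> no Be sensitization (blood+BAL sample)
```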
Abstract:
Workers preparing and administering radiopharmaceuticals in nuclear medicine (NM) departments are likely to receive high local skin doses to the hands, which may even surpass the dose limit of 500 mSv whenever radiation protection standards are insufficient. A large measurement campaign was organised within the framework of the ORAMED project to determine the dose distribution across the hands received during the preparation and administration of 18F- and 99mTc-labelled radiopharmaceuticals. The final data, collected over almost 3 years, include 641 measurements from 96 workers in 30 NM departments from 6 European countries. The results provide reference dose levels for the standard NM diagnostic procedures considered (mean maximum normalised skin doses of 230 μSv/GBq, 430 μSv/GBq, 930 μSv/GBq and 1200 μSv/GBq for the administration of 99mTc, preparation of 99mTc, administration of 18F and preparation of 18F, respectively). Finger dose was analysed as a function of the potential parameters of influence, showing that shielding is the most efficient means of radiation protection for reducing skin dose. An appropriate method for routine monitoring of the extremities is also proposed: the base of the index finger of the non-dominant hand is a suitable position for the ring dosemeter, with its sensitive part oriented towards the palm side; its reading may be multiplied by a factor of 6 to estimate the maximum local skin dose. Finally, the results were compared to earlier published data, which correspond mostly to individual studies with a small number of workers and measurements.
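The normalisation and monitoring rule above is simple arithmetic; the short sketch below (with example values of my own) shows how a ring dosemeter reading would be converted into a normalised skin dose and, using the proposed factor of 6, into an estimate of the maximum local skin dose.

```python
# Illustrative arithmetic for the monitoring method described above.
# Input values are examples only; the factor of 6 is the multiplier proposed in the text.

MAX_TO_RING_FACTOR = 6  # ring dosemeter at the base of the index finger of the
                        # non-dominant hand, sensitive part towards the palm

def normalised_dose(dose_usv: float, activity_gbq: float) -> float:
    """Skin dose normalised to the handled activity, in microSv per GBq."""
    return dose_usv / activity_gbq

def estimated_max_skin_dose(ring_reading_usv: float) -> float:
    """Estimate of the maximum local skin dose from the ring dosemeter reading."""
    return MAX_TO_RING_FACTOR * ring_reading_usv

# Example: 150 microSv read on the ring dosemeter while preparing 4 GBq of 99mTc.
print(normalised_dose(150, 4))        # 37.5 microSv/GBq at the monitoring position
print(estimated_max_skin_dose(150))   # 900 microSv estimated at the most exposed spot
```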
Abstract:
I develop a model of endogenous bounded rationality due to search costs, arising implicitly from the problem's complexity. The decision maker is not required to know the entire structure of the problem when making choices but can think ahead, through costly search, to reveal more of it. However, the costs of search are not assumed exogenously; they are inferred from the preferences revealed through her choices. Thus, bounded rationality and its extent emerge endogenously: as problems become simpler, or as the benefits of deeper search become larger relative to its costs, the choices more closely resemble those of a rational agent. For a fixed decision problem, the costs of search will vary across agents. For a given decision maker, they will vary across problems. The model therefore explains why the disparity between observed choices and those prescribed under rationality varies across agents and problems. It also suggests, under reasonable assumptions, an identifying prediction: a relation between the benefits of deeper search and the depth of the search. As long as calibration of the search costs is possible, this can be tested on any agent-problem pair. My approach provides a common framework for depicting the underlying limitations that force departures from rationality in different and unrelated decision-making situations. Specifically, I show that it is consistent with violations of timing independence in temporal framing problems, with dynamic inconsistency and diversification bias in sequential versus simultaneous choice problems, and with plausible but contrasting risk attitudes across small- and large-stakes gambles.
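The comparative statics behind the identifying prediction can be illustrated with a toy calculation of my own; the benefit function and linear cost below are assumptions for illustration, not the paper's formal model.

```python
# Toy illustration of endogenous search depth (not the paper's formal model).
# Assumes a diminishing-returns benefit of search, B(d) = b * (1 - 0.5**d),
# and a linear cost c per step of look-ahead.

def chosen_depth(b: float, c: float, max_depth: int = 50) -> int:
    """Depth d in 0..max_depth that maximises B(d) - c*d."""
    def net(d: int) -> float:
        return b * (1 - 0.5 ** d) - c * d
    return max(range(max_depth + 1), key=net)

# For a fixed problem, agents with lower search costs look further ahead...
print([chosen_depth(b=10, c=c) for c in (5.0, 2.0, 0.5, 0.1)])   # [0, 2, 4, 6]
# ...and for a fixed agent, larger benefits of deeper search induce deeper search,
# so choices approach those of a rational agent.
print([chosen_depth(b=b, c=0.5) for b in (1, 10, 100, 1000)])    # [0, 4, 7, 10]
```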
Abstract:
This paper critically examines a number of issues relating to the measurement of tax complexity. It starts with an analysis of the concept of tax complexity, distinguishing tax design complexity from operational complexity. It considers the consequences and costs of complexity, and then examines the rationale for measuring complexity. Finally, it applies the analysis to an examination of an index of complexity developed by the UK Office of Tax Simplification (OTS).
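Indices of this kind are typically built as weighted composites of separately scored attributes. The sketch below is a generic illustration of that construction, with invented attributes, scores and weights; it is not the OTS index, whose components and weighting differ.

```python
# Generic illustration of a weighted composite complexity index.
# Attributes, scores and weights are invented; this is NOT the OTS index.

def composite_index(scores: dict, weights: dict) -> float:
    """Weighted average of normalised (0-10) attribute scores."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Hypothetical scores for one tax provision.
scores = {"legislative_length": 7, "number_of_reliefs": 5,
          "guidance_volume": 6, "compliance_burden": 8}
weights = {"legislative_length": 1, "number_of_reliefs": 1,
           "guidance_volume": 2, "compliance_burden": 2}
print(round(composite_index(scores, weights), 2))  # 6.67
```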
Abstract:
This technical report was prepared as a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entail the need to deal with the existence of multiple scales and non-equivalent narratives (multiple dimensions/attributes) used to define sustainability issues. There is an unavoidable tension between a "steady-state view" (the perception of what is going on now, reflecting a PAST → PRESENT view of reality) and an "evolutionary view" (the unknown transformation to be expected in the process of becoming of both the observed reality and the observer, reflecting a PRESENT → FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints on sustainability (economic versus social aspects); the narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then explores external constraints on sustainability (economic development versus the environment); the narrative chosen here focuses on the dark side of the current strategy of economic development based on externalization and the "bubbles disease"; (4) the last section presents a critical appraisal of the quality of energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting and then introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the accounting mechanism needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the current system of accounting in energy statistics; (ii) a critical appraisal of the current system of accounting in BP statistics; (iii) a critical appraisal of the current system of accounting in Eurostat statistics. The section ends by proposing an innovative method for representing energy statistics which can prove more useful for those wishing to develop sustainability indicators.
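The "bifurcation" in energy accounting can be made concrete with a toy example of my own (all numbers invented): the same physical system yields different energy totals depending on whether non-thermal electricity is counted at its physical energy content or converted into a fossil primary-energy equivalent.

```python
# Toy illustration (invented numbers) of how the accounting convention changes an
# energy total: 10 TWh of hydroelectricity counted directly versus counted as the
# fossil primary energy a thermal plant would have needed to generate it.

TWH_TO_PJ = 3.6              # 1 TWh = 3.6 PJ
hydro_output_twh = 10        # invented
fossil_heat_pj = 500         # primary energy of fossil fuels burned (invented)

physical_content_method = fossil_heat_pj + hydro_output_twh * TWH_TO_PJ
# Substitution method: value the electricity at the fuel input of an equivalent
# thermal plant, assuming a 38% conversion efficiency.
substitution_method = fossil_heat_pj + hydro_output_twh * TWH_TO_PJ / 0.38

print(round(physical_content_method))  # 536 PJ
print(round(substitution_method))      # 595 PJ -- same system, different total
```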
Abstract:
OBJECTIVES: To document biopsychosocial profiles of patients with rheumatoid arthritis (RA) by means of the INTERMED and to correlate the results with conventional methods of disease assessment and health care utilization. METHODS: Patients with RA (n = 75) were evaluated with the INTERMED, an instrument for assessing case complexity and care needs. Based on their INTERMED scores, patients were compared with regard to severity of illness, functional status, and health care utilization. RESULTS: In cluster analysis, a 2-cluster solution emerged, with about half of the patients characterized as complex. Complex patients scoring especially high in the psychosocial domain of the INTERMED were disabled significantly more often and took more psychotropic drugs. Although the 2 patient groups did not differ in severity of illness and functional status, complex patients rated their illness as more severe on subjective measures and on most items of the Medical Outcomes Study Short Form 36. Complex patients showed increased health care utilization despite a similar biologic profile. CONCLUSIONS: The INTERMED identified complex patients with increased health care utilization, provided meaningful and comprehensive patient information, and proved to be easy to implement and advantageous compared with conventional methods of disease assessment. Intervention studies will have to demonstrate whether management strategies based on INTERMED profiles can improve treatment response and outcome of complex patients.
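The two-cluster grouping reported here is the kind of result a standard cluster analysis of INTERMED domain scores produces; a minimal sketch on synthetic scores (not the study data) is shown below.

```python
# Minimal sketch of a 2-cluster analysis on INTERMED-like domain scores.
# The scores are synthetic, not the study's patient data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 75 patients x 4 domains (biological, psychological, social, health care),
# with roughly half showing elevated psychosocial scores.
low = rng.normal(loc=[8, 3, 3, 4], scale=1.5, size=(38, 4))
high = rng.normal(loc=[9, 8, 7, 8], scale=1.5, size=(37, 4))
scores = np.vstack([low, high])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
for k in (0, 1):
    members = scores[labels == k]
    print(f"cluster {k}: n={len(members)}, "
          f"mean domain scores={members.mean(axis=0).round(1)}")
```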
Abstract:
Elevated plasma cholesterol, high blood pressure and cigarette smoking are three major risk factors for coronary heart disease. Within the framework of Switzerland's participation in the multicenter study MONICA (MONItoring of trends and determinants in CArdiovascular disease), proposed by the WHO, a first risk factor survey was conducted in a representative sample of the population (25-74 years) of two reporting units (cantons of Vaud and Fribourg, canton of Tessin). A high blood cholesterol level (>6.7 mmol/l) is the most common risk factor for coronary heart disease in the studied population. Among men, about 13% have elevated blood pressure, while the proportion is about one in ten among women; these proportions increase with age and are slightly higher in Tessin. Cigarette smoking is still a common behavior; between 25 and 45 years of age, one third of the population (male and female) regularly smokes cigarettes.
Abstract:
This project was developed in the field of computer security and aims to create an application that allows the digital certificates of different applications and technologies to be managed together, at the same time, sparing the user from having to manage them individually. At the same time, the project aims to reduce the complexity of some security aspects with which not all users of digital certificates are familiar.
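As an illustration of the kind of consolidated view such a tool aims to provide, the sketch below uses Python's cryptography package to list the subject and expiry date of certificates gathered from several applications in one place; the file paths and overall design are assumptions, not the project's actual implementation.

```python
# Illustrative sketch only: a consolidated listing of certificates collected from
# different applications. File names are hypothetical; this is not the project's code.
from pathlib import Path
from cryptography import x509

cert_files = ["web_server.pem", "mail_client.pem", "vpn_gateway.pem"]  # hypothetical

for name in cert_files:
    path = Path(name)
    if not path.exists():
        print(f"{name}: not found")
        continue
    cert = x509.load_pem_x509_certificate(path.read_bytes())
    print(f"{name}: subject={cert.subject.rfc4514_string()}, "
          f"expires={cert.not_valid_after:%Y-%m-%d}")
```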
Abstract:
Research project carried out by a secondary school student and awarded a CIRIT Prize to foster the scientific spirit of young people in 2009. It is an experimental study in which eight in vitro culture techniques were tested with carnation. The plant material was sterilized by immersion in a diluted bleach solution and handled under sterile conditions. In all cases the culture medium used was MS, with sucrose and growth regulator concentrations varying according to the experiment. The cultures were incubated for 4 weeks in a chamber with photoperiod control. The different growth regulators showed a clear effect on the stem sections. Explants cultured on hormone-free medium grew less than those exposed to various concentrations of NAA and BA. These hormone treatments produced abnormal growth symptoms (thickening at the base and vitrification). The presence of 2,4-D favoured the formation of callus and of roots by indirect adventitious organogenesis. Obtaining plantlets by in vitro germination of seeds made it possible to considerably reduce losses due to contamination, while subculturing them gave micropropagation rates of 7.2 sections per plantlet. It was possible to acclimatize these vitroplants in order to adapt them to field conditions. We were not able to obtain adventitious organogenesis or somatic embryogenesis from anthers, nor could we initiate a cell culture from the calluses. Despite the complexity of these techniques, it is possible to carry them out in a school laboratory.
Abstract:
Research project carried out by a secondary school student and awarded a CIRIT Prize to foster the scientific spirit of young people in 2009. The work focuses on understanding the complexity of a 19th-century photographic studio: the Napoleón studio. To understand everything involved in taking a photograph in this studio, it begins by explaining how the different photographic techniques were developed and discovered, and then presents the state of photography in 19th-century Catalonia. The core of the work covers several aspects: on the one hand, it investigates the history of the founders of one of the most important studios in 19th-century Barcelona; on the other, it presents what the rooms, sets, clients, typography, cameras and so on were like; and finally, it puts into practice everything needed to turn a blank sheet of paper into a photograph using the methods of the period. The work can be said to develop in three areas: the first covers the technical and historical foundations of photography, for which the sources used were mainly bibliographic; the second concerns the Napoleón photographic studio, for which, apart from bibliographic sources, the information provided by a descendant of the family was also of vital importance; and the third explains the procedures used to obtain images during the 19th century and the chemical reactions on which they are based. It also includes an experimental part that gives the work an artistic and novel character.
Abstract:
The Kilombero Malaria Project (KMP) attempts to define operationally useful indicators of levels of transmission and disease, and health-system-relevant monitoring indicators, to evaluate the impact of disease control at the community or health facility level. The KMP is a longitudinal community-based study (N = 1024) in rural Southern Tanzania, investigating risk factors for malarial morbidity and developing household-based malaria control strategies. Biweekly morbidity and bimonthly serological, parasitological and drug consumption surveys are carried out in all study households. Mosquito densities are measured biweekly in 50 sentinel houses by timed light traps. Determinants of transmission and indicators of exposure were not strongly aggregated within households. Subjective morbidity (recalled fever), objective morbidity (elevated body temperature and high parasitaemia) and chloroquine consumption were strongly aggregated within a few households. Nested analysis of anti-NANP40 antibody titers suggests that only approximately 30% of the titer variance can be explained by household clustering, and that the largest proportion of antibody titer variability must be explained by non-measured behavioral determinants relating to an individual's level of exposure within a household. Indicators for evaluation and monitoring, and outcome measures, are described within the context of health service management to describe control measure output in terms of community effectiveness.
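The figure of roughly 30% of titer variance explained by household clustering corresponds to an intraclass correlation from a random-intercept model; the sketch below shows how such a figure would be computed on synthetic data (the column names are assumptions, not the KMP dataset).

```python
# Sketch of estimating the share of antibody-titer variance explained by household
# clustering (intraclass correlation) with a random-intercept model.
# The data are synthetic; column names are assumptions, not the KMP dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
household = np.repeat(np.arange(100), 10)                # 100 households of 10 people
household_effect = rng.normal(0, 0.7, 100)[household]    # between-household variation
log_titer = 5 + household_effect + rng.normal(0, 1.0, household.size)
df = pd.DataFrame({"log_titer": log_titer, "household": household})

fit = smf.mixedlm("log_titer ~ 1", df, groups=df["household"]).fit()
between = float(fit.cov_re.iloc[0, 0])   # household-level variance component
within = fit.scale                       # residual (within-household) variance
print(f"ICC = {between / (between + within):.2f}")  # share attributable to household
```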
Abstract:
Eusociality is taxonomically rare, yet associated with great ecological success. Surprisingly, studies of environmental conditions favouring eusociality are often contradictory. Harsh conditions associated with increasing altitude and latitude seem to favour increased sociality in bumblebees and ants, but the reverse pattern is found in halictid bees and polistine wasps. Here, we compare the life histories and distributions of populations of 176 species of Hymenoptera from the Swiss Alps. We show that differences in altitudinal distributions and development times among social forms can explain these contrasting patterns: highly social taxa develop more quickly than intermediate social taxa, and are thus able to complete the reproductive cycle in shorter seasons at higher elevations. This dual impact of altitude and development time on sociality illustrates that ecological constraints can elicit dynamic shifts in behaviour, and helps explain the complex distribution of sociality across ecological gradients.
Abstract:
ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last 6 months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as the best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results gained from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach makes it possible to reduce the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
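The combination described above amounts to a two-step tiered decision rule: a baseline cytotoxicity estimate refined by neurotoxicity alerts and kinetic information. The sketch below illustrates the general shape of such a rule with invented thresholds and field names; it is not the project's validated strategy.

```python
# Illustrative two-step tiered classification with invented thresholds and field
# names; this is NOT the ACuteTox project's validated strategy.

def classify(chemical: dict) -> str:
    """Assign a coarse acute oral toxicity category from in vitro / in silico inputs."""
    # Step 1: baseline estimate from 3T3 neutral red uptake cytotoxicity (IC50, mg/L).
    estimate = "not classified" if chemical["nru_3t3_ic50"] > 1000 else "classified (toxic)"

    # Step 2: refine with neurotoxicity alerts from rat brain aggregates and in silico
    # predictions of intestinal absorption and blood-brain barrier passage.
    reaches_brain = chemical["intestinal_absorption"] and chemical["bbb_passage"]
    if estimate == "not classified" and chemical["neurotox_alert"] and reaches_brain:
        estimate = "classified (toxic)"  # avoids wrongly predicting "not classified"
    return estimate

example = {"nru_3t3_ic50": 2500, "neurotox_alert": True,
           "intestinal_absorption": True, "bbb_passage": True}
print(classify(example))  # -> classified (toxic)
```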
Abstract:
We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.