562 results for Workflow

Relevance: 10.00%

Abstract:

Cone beam computed tomography (CBCT) is a valuable imaging modality for improving diagnosis and treatment planning and for achieving true guidance in several craniofacial surgical interventions. A prominent topic of discussion in medical informatics is the new interactive imaging workflow. The aim of this article is to present, in a short literature review, the usefulness of CBCT technology as an important alternative imaging modality, highlighting current practices and near-term future applications for craniofacial surgical assessment. The article describes the state of the art of CBCT improvements, medical workstations, and the prospects for dedicated hardware and software that can work from the CBCT source. In conclusion, CBCT technology is developing rapidly, and many advances are on the horizon. Further progress in medical workstations, engineering capabilities, and independent software (some of it open source) should be pursued with this new imaging method. The perspectives, challenges, and pitfalls of CBCT are delineated and evaluated alongside these technological developments.

Relevance: 10.00%

Abstract:

Management systems developed for the web, based on metadata, allow efficient maintenance of large amounts of information. A controlled vocabulary such as the one used by the Integrated Library System of the University of São Paulo (SIBi/USP) requires continuous updating, carried out through a collaborative network with the participation of indexing librarians from all areas of knowledge. This paper presents the results obtained with the management system developed by the Management Group for maintaining the SIBi/USP Controlled Vocabulary. The workflow of this system consists of validation filters applied by the members of the Vocabulary Management Group. In addition to this system, the Vocabulary management methodology includes a governance policy. The results obtained in the six years since the activation of the management system through the Suggestions Database were: 1192 descriptor additions, 240 changes, and 61 deletions, totaling 1493 operations. The management and quality control of the Vocabulary have improved information processing and retrieval in the USP Bibliographic Database (DEDALUS).
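
As an illustration of the kind of validation flow the abstract describes, here is a minimal Python sketch of a suggestion workflow, assuming a simple two-stage process (submission, then validation by the management group). All class names and validation rules are hypothetical, not taken from the SIBi/USP system.

```python
from dataclasses import dataclass
from enum import Enum

class Operation(Enum):
    ADD = "add"
    CHANGE = "change"
    DELETE = "delete"

@dataclass
class Suggestion:
    descriptor: str
    operation: Operation
    justification: str

class VocabularyWorkflow:
    """Toy model: indexing librarians submit suggestions, the
    management group's validation filter accepts or rejects them,
    and accepted operations are applied to the vocabulary."""

    def __init__(self):
        self.vocabulary: set[str] = set()
        self.log: list[Suggestion] = []

    def validate(self, s: Suggestion) -> bool:
        # Placeholder filter: require a justification and reject
        # additions of descriptors that already exist.
        if not s.justification:
            return False
        return not (s.operation is Operation.ADD
                    and s.descriptor in self.vocabulary)

    def apply(self, s: Suggestion) -> bool:
        if not self.validate(s):
            return False
        if s.operation is Operation.ADD:
            self.vocabulary.add(s.descriptor)
        elif s.operation is Operation.DELETE:
            self.vocabulary.discard(s.descriptor)
        self.log.append(s)  # CHANGE operations are only logged here
        return True

wf = VocabularyWorkflow()
wf.apply(Suggestion("Bioinformatics", Operation.ADD, "new research area"))
print(wf.vocabulary)  # {'Bioinformatics'}
```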

Relevance: 10.00%

Abstract:

Aspects related to users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his or her workplace environment or group, and the individual model is generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues of cooperative work in information systems in which coordination of the users' actions is distributed and communication among users occurs indirectly, through the data entered while using the software. To achieve this goal, the research draws on ergonomics, the 3C cooperation model, awareness, and software engineering concepts. Action research is used as the research methodology, applied in three cycles during the development of a corporate workflow system at a technological research company. This article discusses the third cycle, which corresponds to the refinement of the cooperative work requirements with the software in actual use in the workplace, where the introduction of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of user awareness of their own activities and of other system users contributes to a decrease in errors and in inappropriate use of the system.
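
The idea of indirect, data-mediated communication with awareness cues can be made concrete with a small sketch. The following Python toy model (all names hypothetical, not from the system studied) records who changed which shared data, so that other users can later perceive those actions:

```python
from datetime import datetime

class SharedWorkspace:
    """Toy model of indirect communication in the 3C sense: users
    'talk' only through the data they enter, and awareness events
    let others see what changed while they were away."""

    def __init__(self):
        self.data: dict[str, str] = {}
        self.events: list[tuple[datetime, str, str]] = []

    def write(self, user: str, key: str, value: str) -> None:
        self.data[key] = value
        self.events.append((datetime.now(), user, f"updated '{key}'"))

    def awareness(self, since: datetime) -> list[str]:
        # Basis for the awareness cues shown to a returning user.
        return [f"{user} {action}" for t, user, action in self.events
                if t > since]
```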

Relevance: 10.00%

Abstract:

This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original one, in a simple, clear, and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns that are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language that basically adopts an XHTML syntax and is able to capture, a posteriori, only the content of a digital document. It is compared with other languages and proposals in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of user advantages, workflow improvements, and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), from e-learning (IsaLearning) to professional printing (IsaPress).
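
To make the segmentation idea concrete, here is a toy Python sketch that projects a plain-text document onto pattern-labelled components. The pattern names used here are illustrative only; the actual seven patterns of the thesis are not enumerated in this abstract.

```python
import re

def segment(text: str) -> list[tuple[str, str]]:
    """Naively split a plain-text document into pattern-labelled
    components: blank-line-separated chunks become blocks, and
    short single-line chunks become atoms (hypothetical labels)."""
    components = []
    for chunk in re.split(r"\n\s*\n", text.strip()):
        label = "atom" if ("\n" not in chunk and len(chunk) < 40) else "block"
        components.append((label, chunk))
    return components

doc = ("Title\n\nA first paragraph of running text that goes on a bit.\n\n"
       "A second paragraph, also long enough to count as a block here.")
for label, chunk in segment(doc):
    print(label, "->", chunk[:30])
```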

Relevance: 10.00%

Abstract:

Coordinating activities in a distributed system is an open research topic. Several models have been proposed to achieve this purpose, such as message passing, publish/subscribe, workflows, and tuple spaces. We have focused on the latter model, trying to overcome some of its disadvantages. In particular, we have applied spatial database techniques to tuple spaces in order to increase their performance when handling a large number of tuples. Moreover, we have studied how structured peer-to-peer approaches can be applied to better distribute tuples over large networks. Using some of these results, we have developed a tuple space implementation for the Globus Toolkit that can be used by Grid applications as a coordination service. The development of such a service has been quite challenging due to the limitations imposed by XML serialization, which have heavily influenced its design. Nevertheless, we were able to complete the implementation and use it in two different test applications: one that is completely parallelizable and a plasma simulation that is not. Using the latter application, we compared the performance of our service against MPI. Finally, we developed and tested a simple workflow to show the versatility of our service.
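
For readers unfamiliar with the model, below is a minimal, generic Linda-style tuple space in Python. It shows the write/read/take operations with wildcard matching, but deliberately omits the spatial indexing and peer-to-peer distribution that the thesis adds, and it is not the Globus Toolkit service itself.

```python
import threading

class TupleSpace:
    """Minimal in-memory tuple space: write adds a tuple, read
    returns a matching tuple without removing it, take removes it.
    None in a template acts as a wildcard."""

    def __init__(self):
        self._tuples: list[tuple] = []
        self._lock = threading.Lock()

    @staticmethod
    def _matches(template: tuple, candidate: tuple) -> bool:
        return len(template) == len(candidate) and all(
            t is None or t == c for t, c in zip(template, candidate))

    def write(self, t: tuple) -> None:
        with self._lock:
            self._tuples.append(t)

    def read(self, template: tuple):
        with self._lock:
            return next((t for t in self._tuples
                         if self._matches(template, t)), None)

    def take(self, template: tuple):
        with self._lock:
            for i, t in enumerate(self._tuples):
                if self._matches(template, t):
                    return self._tuples.pop(i)
        return None

space = TupleSpace()
space.write(("task", 42, "pending"))
print(space.take(("task", None, "pending")))  # ('task', 42, 'pending')
```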

Relevance: 10.00%

Abstract:

[EN] In this paper we present a video streaming solution. It describes the functionality of the streaming server at our university and the workflow for embedding videos into the learning platform Moodle. Through server-side role management, teachers are granted access to upload their recorded lectures to the server. Videos are converted into different streaming formats and embedded into the learning platform using the provided embed code. The number of teachers using this approach is continuously growing at our university. Besides the lecture scripts, the videos are frequently watched by students, as the number of video accesses in the Moodle courses and the streaming server traffic clearly show.
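
The conversion step might look like the following Python sketch. The paper does not name the tools used; this example assumes the ffmpeg binary is available, the codec choices are illustrative, and the embed markup is a hypothetical snippet, not Moodle's actual embed code.

```python
import subprocess
from pathlib import Path

def transcode(source: Path, out_dir: Path) -> list[Path]:
    """Convert an uploaded lecture into two streaming-friendly
    formats; assumes ffmpeg is installed on the server."""
    outputs = []
    for suffix, vcodec, acodec in ((".mp4", "libx264", "aac"),
                                   (".webm", "libvpx-vp9", "libopus")):
        target = out_dir / (source.stem + suffix)
        subprocess.run(["ffmpeg", "-y", "-i", str(source),
                        "-c:v", vcodec, "-c:a", acodec, str(target)],
                       check=True)
        outputs.append(target)
    return outputs

def embed_snippet(url: str) -> str:
    # Markup to paste into a course page (hypothetical snippet).
    return f'<video controls width="640"><source src="{url}"></video>'
```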

Relevance: 10.00%

Abstract:

Background. One of the phenomena observed in human aging is the progressive increase of a systemic inflammatory state, a condition referred to as “inflammaging”, which is negatively correlated with longevity. A prominent mediator of inflammation is the transcription factor NF-kB, which acts as a key transcriptional regulator of many genes coding for pro-inflammatory cytokines. Many different signaling pathways activated by very diverse stimuli converge on NF-kB, resulting in a regulatory network of high complexity. NF-kB signaling has been proposed to be responsible for inflammaging. The scope of this analysis is to provide a wider, systemic picture of this intricate signaling and interaction network: the NF-kB pathway interactome. Methods. The study was carried out following a workflow for gathering information from the literature as well as from several pathway and protein interaction databases, and for integrating and analyzing the existing data and the reconstructed representations using available computational tools. Substantial manual intervention was necessary to integrate data from multiple sources into mathematically analyzable networks. The reconstruction of the NF-kB interactome pursued with this approach provides a starting point for a general view of the architecture and for a deeper analysis and understanding of this complex regulatory system. Results. A “core” and a “wider” NF-kB pathway interactome, consisting of 140 and 3146 proteins respectively, were reconstructed and analyzed through a mathematical, graph-theoretical approach. Among other interesting features, the topological characterization of the interactomes shows that a relevant number of interacting proteins are in turn products of genes whose expression is itself controlled and regulated by NF-kB transcription factors. These “feedback loops”, not always well known, deserve deeper investigation, since they may have a role in tuning the response and the output consequent to NF-kB pathway initiation, in regulating the intensity of the response, or in maintaining its homeostasis and balance so as to make the functioning of such a critical system more robust and reliable. This integrated view sheds light on the functional structure and on some of the crucial nodes of the NF-kB transcription factor interactome. Conclusion. Framing the structure and dynamics of the NF-kB interactome in a wider, systemic picture would be a significant step toward a better understanding of how NF-kB globally regulates diverse gene programs and phenotypes. This study represents a step toward a more complete and integrated view of the NF-kB signaling system.
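
The feedback-loop analysis can be illustrated in a few lines of Python with networkx: a feedback loop is a directed cycle passing through the transcription factor. The toy graph below is purely illustrative and is not the reconstructed 140-protein core interactome.

```python
import networkx as nx

# Toy interactome: directed edges represent interaction/regulation.
G = nx.DiGraph()
G.add_edges_from([
    ("NFKB1", "TNF"),    # NF-kB induces a cytokine gene ...
    ("TNF", "TNFR1"),
    ("TNFR1", "IKK"),
    ("IKK", "NFKB1"),    # ... whose product re-activates NF-kB
    ("NFKB1", "IKBA"),
    ("IKBA", "NFKB1"),   # classic negative feedback via IkB-alpha
])

# Feedback loops = directed cycles through the transcription factor.
loops = [c for c in nx.simple_cycles(G) if "NFKB1" in c]
print(loops)  # e.g. [['NFKB1', 'IKBA'], ['NFKB1', 'TNF', 'TNFR1', 'IKK']]
```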

Relevance: 10.00%

Abstract:

“Cartographic heritage” is different from “cartographic history”. The second term refers to the study of the development of surveying and drawing techniques related to maps through time, i.e. through the different types of cultural environment that formed the background for the creation of maps. The first term concerns the whole body of ancient maps, together with these different cultural environments, which history has brought to us and which we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve the map heritage. Moreover, modern geomatic techniques give us new ways of using historical information that would be unachievable on analog supports. In this PhD thesis, the whole digital process of recovery and elaboration of ancient cartography is reported, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage. The workflow can be divided into three main steps, which reflect the chapter structure of the thesis itself:
• map acquisition: conversion of the ancient map support from analog to digital, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning techniques); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible;
• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; in this way it is possible to understand the projection features of the historical map, as well as to evaluate and represent the degree of deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors, or by support deformation, errors that are usually too large with respect to current standards (a numerical sketch of this step is given after the list of maps below);
• data elaboration and management in a digital environment, by means of modern software tools: vectorization, giving the map a new and more attractive graphic look (for instance, by creating a 3D model), superimposing it on current base maps, comparing it to other maps, and finally inserting it into a GIS or WebGIS environment as a specific layer.
The study is supported by several case histories, each of them of interest from the point of view of at least one digital cartographic elaboration step.
The ancient maps taken into account are the following:
• three maps of the Po river delta, made at the end of the XVI century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook in which he explains a new topographical instrument, the squadra mobile (mobile square), invented and used by himself; today all three maps are preserved in the State Archive of Venice;
• the Ichnoscenografia of Bologna by Filippo de’ Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic view of the city, captured in a bird’s-eye flight, but also with an ichnographic value, as the author himself declares;
• the map of Bologna by the periti Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years later (1711–1712) than the map by de’ Gnudi; in this map the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna;
• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure. Moreover, the delta maps were analyzed from the georeferencing point of view, and the Cadastre was analyzed with respect to possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling. Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques opens new research opportunities in a rich and modern multidisciplinary context.
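
The georeferencing step mentioned above can be sketched numerically: given ground control points identified both on the scanned map and in a modern reference frame, a least-squares affine transform is the simplest model, and its residuals quantify the map's deformation. The coordinates below are illustrative, not from the maps studied.

```python
import numpy as np

# Ground control points: pixel positions on the scanned historical
# map and the same features in a modern reference system (both sets
# of values are invented for illustration).
map_pts = np.array([[120, 340], [980, 310], [500, 900], [150, 880]], float)
ref_pts = np.array([[728510, 4929430], [729890, 4929460],
                    [729120, 4928520], [728560, 4928560]], float)

# Solve ref = [x, y, 1] @ A for the 3x2 affine matrix A in the
# least-squares sense; the residual RMSE quantifies deformation.
X = np.hstack([map_pts, np.ones((len(map_pts), 1))])
A, *_ = np.linalg.lstsq(X, ref_pts, rcond=None)
pred = X @ A
rmse = np.sqrt(np.mean(np.sum((pred - ref_pts) ** 2, axis=1)))
print(f"RMSE of the affine fit: {rmse:.2f} m")
```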

Relevance: 10.00%

Abstract:

To date, the hospital radiological workflow is completing a transition from analog to digital technology. Since digital X-ray detection technologies have become mature, hospitals are taking advantage of the natural turnover of devices to replace conventional screen-film systems with digital ones. The transition process is complex and involves not just equipment replacement but also new arrangements for image transmission, display (and reporting), and storage. This work focuses on the characterization of 2D digital detectors with attention to specific clinical applications; the system features linked to image quality are analyzed to assess clinical performance, conversion efficiency, and the minimum dose necessary to obtain an acceptable image. The first section overviews digital detector technologies, focusing on recent and promising technological developments. The second section describes the characterization methods considered in this thesis, categorized as physical, psychophysical, and clinical; theory, models, and procedures are described as well. The third section contains a set of characterizations performed on new equipment representing some of the most advanced technologies available to date. The fourth section deals with procedures and schemes employed for quality assurance programs.
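
As one example of the physical characterization of conversion efficiency, the detective quantum efficiency can be computed from the MTF and the normalized noise power spectrum. The sketch below uses the commonly cited definition (as in IEC 62220-1) with synthetic, illustrative curves; it is not the procedure of the thesis itself.

```python
import numpy as np

def dqe(mtf: np.ndarray, nnps: np.ndarray, q: float) -> np.ndarray:
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with NNPS the noise power
    spectrum normalized by the squared mean signal and q the input
    photon fluence (photons per mm^2)."""
    return mtf ** 2 / (q * nnps)

f = np.linspace(0.05, 3.0, 60)      # spatial frequency, cycles/mm
mtf = np.exp(-f / 2.0)              # illustrative MTF curve
nnps = np.full_like(f, 8.0e-6)      # illustrative flat NNPS, mm^2
print(dqe(mtf, nnps, q=2.5e5)[:3])  # low-frequency DQE around 0.5
```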

Relevance: 10.00%

Abstract:

This study focuses on the various applications of thermal remote sensing in urban areas. It first describes infrared radiation and its interactions with the Earth's atmosphere, the main laws governing radiative heat exchange, the characteristics of the sensors, and the different applications of thermography. It then treats in detail the specific aspects of thermography from satellite platforms, aimed mainly at assessing the Urban Heat Island phenomenon; the available sensors are described, together with the correction methodologies for atmospheric effects, for estimating surface emissivity, and for computing the surface temperature of the pixels. The experiment carried out on the Bologna area using ASTER multispectral images is then illustrated: the results show that an Urban Heat Island is detectable over the urban area, even though its quantification is complex. The potential and limits of airborne thermography are then described, together with its different uses, the operational survey procedures, and the algorithms used to compute the surface temperature of building roofs. Through the analysis of some previous experiences, the influence of the atmosphere, the modeling of its effects on the measured radiance, and the different methods for estimating emissivity are discussed. The European project Energycity is then introduced, aimed at creating a GeoWeb spatial decision support system for reducing energy consumption and greenhouse gas production in seven Central European cities. The survey procedures and the processing of the digital datasets for creating surface temperature maps to be implemented in the SDSS are illustrated. Finally, the experiment carried out on the thermal images acquired over the city of Treviso in February 2010 is described: the images were transformed into a georeferenced mosaic of radiometric temperature through geometric and radiometric corrections; after correction for emissivity, this mosaic was transformed into a surface temperature mosaic.
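
The pixel-temperature computation can be sketched as follows: invert Planck's law to obtain brightness temperature from at-sensor radiance, then apply a crude emissivity correction. This is a simplified single-band illustration that ignores the atmospheric and reflected downwelling terms handled by the full correction methodologies; the band and radiance values are illustrative.

```python
import numpy as np

C1 = 1.191042e-16   # first radiation constant 2hc^2, W m^2 sr^-1
C2 = 1.438777e-2    # second radiation constant hc/k, m K

def brightness_temperature(radiance, wavelength):
    """Inverse Planck. radiance: spectral radiance in
    W m^-2 sr^-1 m^-1 (per unit wavelength); wavelength in metres."""
    return C2 / (wavelength * np.log(C1 / (wavelength**5 * radiance) + 1.0))

def surface_temperature(radiance, wavelength, emissivity):
    # Crude correction: divide radiance by emissivity before the
    # inversion (no atmospheric or reflected-radiance terms).
    return brightness_temperature(radiance / emissivity, wavelength)

lam = 11.0e-6   # a band around 11 micrometres, typical of thermal sensors
L = 8.5e6       # illustrative at-sensor spectral radiance
print(brightness_temperature(L, lam))       # ~292 K
print(surface_temperature(L, lam, 0.95))    # slightly warmer
```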

Relevance: 10.00%

Abstract:

This work describes all the workflow steps that characterize a departmental ophthalmology system.

Relevance: 10.00%

Abstract:

The research aims at developing a framework for semantic-based digital survey of architectural heritage. Rooted in knowledge-based modeling, which extracts mathematical constraints on geometry from architectural treatises, as-built information about the architecture obtained from image-based modeling is integrated with the ideal model in a BIM platform. Knowledge-based modeling transforms the geometry and parametric relations of architectural components from 2D prints into 3D digital models and, thanks to parametric modeling, creates a large number of variations in real time based on a shape grammar. It also provides prior knowledge for semantically segmenting unorganized survey data. The emergence of SfM (Structure from Motion) provides access to the reconstruction of large, complex architectural scenes with high flexibility, low cost, and full automation, but low reliability of metric accuracy. We address this problem by combining photogrammetric approaches covering camera configuration, image enhancement, bundle adjustment, and related steps. Experiments show that the accuracy of image-based modeling following our workflow is comparable to that of range-based modeling. We also demonstrate positive results of our optimized approach in the digital reconstruction of a portico, where low-texture vaults and dramatic transitions of illumination cause major difficulties for the unoptimized workflow. Once the as-built model is obtained, it is integrated with the ideal model in a BIM platform that allows multiple forms of data enrichment. In spite of its promising prospects in the AEC industry, BIM has been developed with limited consideration of reverse engineering from survey data. Besides representing the architectural heritage in parallel ways (ideal model and as-built model) and comparing their differences, we address how to create the as-built model in BIM software, which is still an open problem. The research is intended to be fundamental for research on architectural history, for the documentation and conservation of architectural heritage, and for the renovation of existing buildings.
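
The treatise-driven parametric idea can be illustrated with a toy shape-grammar sketch in Python: one proportional rule generates many instances. The class names and proportions are illustrative, not taken from an actual treatise or from the framework described here.

```python
from dataclasses import dataclass
import itertools

@dataclass
class Column:
    """Toy parametric model of a classical column: proportions are
    expressed as multiples of a 'module' (invented values)."""
    module: float          # base radius, the module of the order, metres
    shaft_ratio: float     # shaft height as a multiple of the module
    capital_ratio: float   # capital height as a multiple of the module

    @property
    def total_height(self) -> float:
        return self.module * (self.shaft_ratio + self.capital_ratio)

def variations(modules, shaft_ratios, capital_ratio=1.0):
    """Shape-grammar style generation: one rule, many instances."""
    return [Column(m, s, capital_ratio)
            for m, s in itertools.product(modules, shaft_ratios)]

for c in variations([0.30, 0.45], [14, 16, 18]):
    print(f"module={c.module:.2f} m -> height={c.total_height:.2f} m")
```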

Relevance: 10.00%

Abstract:

For years, discussions about performance monitoring (Erfolgskontrolle) have been conducted in municipal economic development. The focus is on the search for indicators and procedures intended to enable municipal economic development agencies to measure their success.

Since economic development is one of a municipality's voluntary services, the pressure to justify it to the public and to politicians is rising, particularly in times of tight public budgets. Company settlements, a positive economic development, or a low unemployment rate are essential criteria of successful economic development, both in the public consciousness and in politics. Constantly changing framework conditions in the economic structure have meant that this classic evidence of success can be presented less and less often. Successes should nevertheless be measured, in order to review the measures and instruments of municipal economic development and, if necessary, adapt them to the changed conditions.

It is more than 30 years since the search for methods and procedures of performance monitoring in public administration began in the 1970s. The success of municipal economic development cannot be measured simply and exclusively by the municipality's most prominent economic figures, e.g. the number of jobs subject to social insurance contributions. For years, a practicable way of carrying out performance monitoring in municipal economic development has been sought, but no fully satisfactory approach has yet been found. Against this background, it must be asked to what extent the four elements of performance monitoring, namely the monitoring of goal attainment, implementation, conditions, and impact, can actually and sufficiently be applied.

The present empirical studies examine the topic from the perspective of municipal economic development officers and deliver results that should lead to a changed attitude toward performance monitoring in municipal economic development. Regardless of the organizational form and the size of a municipal economic development agency, it can be demonstrated empirically that the claim raised by the term performance monitoring in its common definition cannot be adequately fulfilled by a municipal economic development agency.

With the help of the newly developed process of a modified performance monitoring, this thesis presents an ideal-typical procedure for municipal economic development. The new approach consistently reduces the requirements to what is practicable, making it possible to present the successes of municipal economic development without the procedure leaving more questions open than it answers. Through modified performance monitoring, the specific successes of a municipal economic development agency can be presented and documented.

A modified performance monitoring accomplishes two things: it supports political decision-makers in recognizing the need for concrete goal formulations adapted to the actual situation and its boundary conditions, and it offers municipal economic development institutions a way to meet the demand for performance monitoring voiced in public debate with a high degree of practicability. So before many municipal economic development agencies give up in the face of the questionable demands placed on performance monitoring, with their excessive requirements in terms of methodology, time, and staff, a focus on the practicable should encourage them to undertake a modified performance monitoring following the new process scheme.

Relevance: 10.00%

Abstract:

Glaucoma is, after cataract, the second most common cause of blindness worldwide, with millions of people affected by this initially largely symptom-free neurodegenerative disease. Diagnostic options have so far been largely limited to measuring intraocular pressure and assessing the fundus of the eye by an experienced ophthalmologist. Laboratory-based screening is not yet available, and the number of undetected cases is correspondingly high. Valuable time is thereby lost that could be used for effective therapy.

Regarding the pathogenesis of glaucoma, several interacting pathomechanisms are currently assumed, including mechanical effects of elevated IOP, hypoxia, reduced neurotrophin supply, excitotoxicity, oxidative stress, and the involvement of autoimmune processes. Regardless of the pathomechanism, extensive degenerative processes are established in the optic nerve head, the retinal ganglion cells, and the axons of the optic nerve, ultimately leading to the irreversible loss of these neurons. These pathological processes in the CNS leave traces at the proteome level that can be detected with modern mass spectrometric methods in combination with multivariate statistical methods and represented as so-called biomarker candidates with a defined molecular weight. In this work, a workflow was developed that makes it possible to identify and characterize these biomarker candidates in blood serum and tear fluid in simple, reproducible steps. Departing from the established methodology of bottom-up proteomics, a method following a top-down philosophy had to be developed that allows the traces of glaucoma in the proteome to be detected and characterized.

This was achieved using mass spectrometric methods such as SELDI-TOF® and MALDI-TOF/TOF as well as bead-, gel-, and liquid-chromatography-based separation and fractionation techniques. The successful combination of these methods led to the identification of a whole series of biomarker candidates. Among the identified proteins whose corresponding SELDI peaks lie in the mass range of biomarker candidates are cytokines and effector molecules of innate immunity, stress-inducible kinases, factors that protect telomeres, proliferation markers, neuronal antigens, and transport proteins. In addition, components involved in neuronal neurotrophin supply, neuronal receptors and antigens, and components of the complement system and the MHC-I complex were identified. All of these identified proteins are described and characterized in detail with respect to their function and possible role in the pathogenesis of glaucoma. This permits a comprehensive view of all pathomechanisms that, according to current knowledge, are assumed to play a role in the pathogenesis of glaucoma.
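
One single step of such a workflow, picking candidate peaks from a spectrum, can be sketched with scipy. The spectrum below is synthetic, and real SELDI/MALDI pipelines add baseline subtraction, calibration, and peak alignment across samples before any peak is treated as a biomarker candidate.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic stand-in for a SELDI-TOF spectrum: intensity vs. m/z.
rng = np.random.default_rng(0)
mz = np.linspace(2000, 20000, 5000)                 # m/z axis (Da)
signal = np.exp(-((mz - 6600) / 40) ** 2) * 12      # one synthetic peak
signal += np.exp(-((mz - 11700) / 60) ** 2) * 7     # another one
signal += rng.normal(0, 0.4, mz.size) + 1.0         # noise + flat baseline

# Simple peak picking: minimum height and minimum separation.
peaks, _ = find_peaks(signal, height=3.0, distance=50)
for i in peaks:
    print(f"candidate at m/z = {mz[i]:.0f} Da, intensity {signal[i]:.1f}")
```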

Relevance: 10.00%

Abstract:

This dissertation is based on one theoretical article and two empirical studies.

The theoretical article: A theoretical framework is postulated that examines the accumulation of work interruptions and their effects. Most previous studies have treated interruptions as an isolated phenomenon, leaving unconsidered the fact that several interruptions occur simultaneously (or in succession) during a typical working day. This dissertation fills that gap by examining the process of accumulating interruptions. It describes the extent to which the accumulation of interruptions leads to a new quality of (negative) effects. The interplay and mutual reinforcement of individual effects are presented, and moderating and mediating factors are identified. In this way it is possible to establish a link between the short-term effects of individual interruptions and the health impairments caused by the working condition 'interruptions'.

Study 1: This study examined the extent to which interruptions affect a person's performance and well-being within a working day. It was postulated that the occurrence of interruptions reduces satisfaction with one's own performance and increases the forgetting of intentions and the experience of irritation, with mental demands and time pressure acting as mediators. To test this, 133 nurses were surveyed via smartphones over five days. Multilevel analyses confirmed the main effects. The assumed mediation effects were confirmed for irritation and (partially) for satisfaction with performance, but not for the forgetting of intentions. Interruptions therefore lead to negative effects (among other reasons) because they are cognitively demanding and take up time.

Study 2: This study measured relationships between cognitive stressors (work interruptions and multitasking) and strain outcomes (mood and irritation) within a working day. It was assumed that these relationships are moderated by chronological age and by indicators of functional age (working memory capacity and attention). Older workers with poorer attention and working memory performance were expected to be impaired most strongly by the stressors examined. A diary study (see Study 1) and computer-based cognitive performance tests were conducted. Multilevel analyses confirmed the main effects for the dependent variables mood (valence and wakefulness) and irritation, but not for arousal (mood). Three-way interactions were not found in the postulated direction. Younger, not older, workers benefited from high basal cognitive capacity. Older workers appear to possess coping strategies that compensate for possible cognitive losses.

In general, the (tested) assumptions of the theoretical framework were confirmed. In principle, it seems possible to transfer laboratory findings to field research, but the particularities of the field must be taken into account. The postulated mediation effects (Study 1) were (partially) confirmed. However, the results indicate that the full working day must be examined and that very specific dependent variables also require more specific mediators. Furthermore, Study 2 confirmed that cognitive capacity is an important resource for coping with interruptions, although other resources also operate in the work context.