967 results for Portlet-based application
Abstract:
The importance of service level management (SLM) for enterprise applications grows with the increasing criticality of IT-supported processes for the success of individual companies. Traditionally, an effective SLM is implemented by establishing monitoring processes in hierarchical management environments that support an administrator in the necessary reconfiguration of systems. However, these hierarchical approaches are only applicable to a very limited extent to current, highly dynamic software architectures. One example is service-oriented architectures (SOA), in which business functionality is modeled through the interplay of individual, mutually independent services on the basis of descriptive workflow descriptions. This results in a high runtime dynamism of the overall architecture. For SLM, the decentralized structure of a SOA with different administrative responsibilities for individual subsystems is particularly problematic, since regulating interventions are possible only to a very limited extent, on the one hand because the implementation of individual services is encapsulated, and on the other hand because a central control instance is missing. This thesis defines the architecture of an SLM system for SOA environments in which autonomous management components cooperate to meet overarching service level objectives: using self-management technologies, service level management is first automated at the level of individual services. The autonomous management components of these services can then pursue overarching goals for optimizing service quality and resource utilization by means of self-organization mechanisms. For SLM at the level of SOA workflows, temporary cross-service cooperations must be established to satisfy service level requirements; these may thus also span multiple administrative domains.
Such a temporally limited cooperation of autonomous subsystems can reasonably be established only in a decentralized way, since the respective cooperation partners are not known in advance and, depending on the lifetime of individual workflows, participating components may be exchanged at runtime. The thesis develops a method for coordinating autonomous management components with the goal of optimizing response times at the workflow level: by transferring response-time shares among each other, management components can tighten or loosen their individual targets without changing the overall response-time objective. The transfer of response-time shares is realized by means of an auction mechanism. A group communication mechanism forms the technical basis of the cooperation. Furthermore, with respect to the use of shared, virtualized resources, competing services are prioritized according to business objectives. As part of the practical implementation, the realization of central architectural elements and of the developed self-organization methods is presented exemplarily for the SLM of concrete components. A hybrid simulation approach is used to study management cooperation in larger scenarios. The evaluation includes scalability studies of the approach, focusing on a system of cooperating management components, in particular with regard to communication overhead. The evaluation shows that cross-service, autonomous performance management in SOA environments is possible. The results suggest that the developed approach can also be applied successfully in large environments.
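The core invariant of this response-time trading (individual targets may shift, the workflow-level sum may not) can be sketched in a few lines. This is an illustrative toy, not the auction protocol developed in the thesis; the component names, numbers, and the "largest slack sells" rule are invented:

```python
# Hypothetical sketch of response-time share trading between autonomous
# management components; the winner-selection rule is illustrative.

class ManagementComponent:
    def __init__(self, name, target_ms, observed_ms):
        self.name = name
        self.target_ms = target_ms      # individual response-time target
        self.observed_ms = observed_ms  # currently observed response time

    def slack(self):
        """Spare response time this component could offer to others."""
        return self.target_ms - self.observed_ms

def auction_transfer(components, buyer, amount_ms):
    """Move `amount_ms` of response-time budget to `buyer` from the
    component bidding the largest slack; the workflow-level sum of
    targets stays constant."""
    sellers = [c for c in components if c is not buyer and c.slack() > amount_ms]
    if not sellers:
        return False
    seller = max(sellers, key=lambda c: c.slack())  # highest bid wins
    seller.target_ms -= amount_ms
    buyer.target_ms += amount_ms
    return True

a = ManagementComponent("credit-check", target_ms=200, observed_ms=150)
b = ManagementComponent("scoring", target_ms=300, observed_ms=320)  # violating
total_before = a.target_ms + b.target_ms
auction_transfer([a, b], buyer=b, amount_ms=30)
assert a.target_ms + b.target_ms == total_before  # overall goal unchanged
```

A real implementation would run the bidding over the group communication mechanism mentioned above rather than over an in-memory list.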
Abstract:
A novel method for adapting human-machine interfaces to individual operators is presented. By applying abstractions of evolutionary mechanisms such as selection, recombination, and mutation in the EOGUI methodology (Evolutionary Optimization of Graphical User Interfaces), a computer-supported implementation of the method is provided for graphical user interfaces, in particular for industrial processes. The evolutionary optimization incorporates both objective, i.e. measurable, quantities such as selection frequencies and times, and the operators' subjective impressions captured via online questionnaires. In this way, the visualization of systems is adapted to the needs and preferences of individual operators. In this work, the operator can choose, from four user interfaces of different levels of abstraction for the example process MIPS (MIschungsProzess-Simulation, a mixing-process simulation), the objects that best support him in process control. The EOGUI algorithm selects these objects, modifies them if necessary, and combines them into a new graphical user interface adapted to the operator. Using the MIPS process, experiments with the EOGUI methodology were conducted to examine the applicability, acceptance, and effectiveness of the method for controlling industrial processes. The investigations largely show that the developed methodology for evolutionary optimization of human-machine interfaces does in fact adapt industrial process visualizations to the individual operator and improves process control.
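A minimal evolutionary loop in the spirit of such a methodology might look as follows. Everything here (widget names, weighting, and both scoring functions) is invented for illustration; in the actual method the subjective score comes from online questionnaires and the objective score from measured selection frequencies and times, not from fixed functions:

```python
# Toy evolutionary optimization of a user interface: an individual is a
# set of display objects, fitness mixes an "objective" usage score with
# a "subjective" preference score. All values and weights are invented.
import random

random.seed(1)
WIDGETS = ["trend", "gauge", "table", "schematic", "alarm-list", "bar-chart"]

def fitness(ui, objective_score, subjective_score, w_obj=0.6):
    # weighted mix of measurable usage data and operator preference
    return w_obj * objective_score(ui) + (1 - w_obj) * subjective_score(ui)

def recombine(p1, p2):
    # child inherits a random subset of its parents' display objects
    pool = sorted(set(p1 + p2))
    return random.sample(pool, min(4, len(pool)))

def mutate(ui, rate=0.2):
    # occasionally introduce a new display object
    if random.random() < rate:
        ui = ui + [random.choice(WIDGETS)]
    return sorted(set(ui))

# stand-in scores: favour small interfaces containing a trend display,
# and assume the operator subjectively prefers a schematic
obj = lambda ui: (1.0 if "trend" in ui else 0.0) - 0.05 * len(ui)
subj = lambda ui: 1.0 if "schematic" in ui else 0.5

population = [sorted(random.sample(WIDGETS, 3)) for _ in range(8)]
for _ in range(20):
    population.sort(key=lambda ui: fitness(ui, obj, subj), reverse=True)
    parents = population[:4]                           # selection
    children = [mutate(recombine(random.choice(parents), random.choice(parents)))
                for _ in range(4)]                     # recombination + mutation
    population = parents + children

best = max(population, key=lambda ui: fitness(ui, obj, subj))
print(best)
```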
Abstract:
The present study introduces a multi-agent architecture designed to automate the process of data integration and intelligent data analysis. Unlike other approaches, the multi-agent architecture was designed using an agent-based methodology: Tropos. Based on the proposed architecture, we describe a Web-based application in which agents are responsible for analysing petroleum well drilling data to identify possible occurrences of abnormalities. The intelligent data analysis method used was a neural network.
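As a rough illustration of the analysis agent's job, the single-neuron classifier below learns to flag abnormal drilling readings. The feature names, training data, and hyperparameters are all invented; the abstract does not specify the network the study actually used, so this is only a minimal stand-in for the idea:

```python
# Minimal single-neuron (logistic) classifier flagging abnormal
# drilling readings; features and data are hypothetical.
import math, random

random.seed(0)
# (standpipe_pressure, hook_load) normalized to [0, 1]; label 1 = abnormal
data = [((0.2, 0.8), 0), ((0.3, 0.7), 0), ((0.9, 0.2), 1),
        ((0.8, 0.1), 1), ((0.25, 0.75), 0), ((0.85, 0.15), 1)]

w = [0.0, 0.0]
b = 0.0
sigmoid = lambda z: 1 / (1 + math.exp(-z))

for _ in range(2000):                      # plain stochastic gradient descent
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        g = p - y                          # dLoss/dz for log loss
        w[0] -= 0.5 * g * x1
        w[1] -= 0.5 * g * x2
        b -= 0.5 * g

def is_abnormal(x1, x2):
    """Agent-side check: does this reading look like an anomaly?"""
    return sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5

print(is_abnormal(0.9, 0.1))   # high pressure, low hook load
```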
Abstract:
The aim of this Final Degree Project is to offer a solution that helps people manage both their personal and business tasks more productively. Applications of this kind are currently very successful. It was decided to develop the application following the Getting Things Done (GTD) methodology, since it increases productivity and reduces work-related stress. To date, few applications use this methodology, and those that do use it only in a very basic way. Alongside GTD, and guided by the tutor's experience, we sought to combine the methodology with time-tracking controls to further improve the productivity of the people using the software. The result of this final degree project is the foundation of a web application for task management. The software created is fully functional, very easy to use, very intuitive, and follows the Getting Things Done philosophy. The main objectives achieved in this project were: user management; task and project management; application of the GTD methodology; and control of productive and unproductive time, interruptions, and timers. The application was developed as a Final Degree Project in Computer Engineering, covering all phases of software development in order to obtain a functional product approved by the tutor, who played the role of a potential client. The project followed the RUP methodology: use-case driven, iterative, and incremental. To complete the process, a feature list was elaborated, the use cases were specified, and analysis, design, implementation, and testing phases were carried out. The technologies used were mainly Ruby on Rails, HTML5, CSS, AJAX, and JavaScript.
The long-term goal is for this solution to serve as a basis for further development so that, with the necessary improvements, a strong task management product following the GTD methodology can be brought to market.
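The GTD workflow such an application implements can be hinted at with a sketch of the standard "clarify" decision for a captured inbox item. The bucket names and function signature are illustrative, not the project's actual code (which was written in Ruby on Rails); the two-minute threshold is the usual GTD rule:

```python
# Hedged sketch of the GTD "clarify" step: route a captured item to a
# bucket. Bucket names and the signature are invented for illustration.

def clarify(item, actionable, minutes_needed=None, delegate_to=None, has_date=False):
    """Route a captured item (the item text itself is not used in the
    routing; it is carried along for illustration only)."""
    if not actionable:
        return "reference"                  # or trash / someday-maybe
    if minutes_needed is not None and minutes_needed <= 2:
        return "do-now"                     # the two-minute rule
    if delegate_to:
        return "waiting-for"
    return "calendar" if has_date else "next-actions"

print(clarify("reply to mail", actionable=True, minutes_needed=1))  # → do-now
```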
Abstract:
This thesis deals with Context Aware Services, Smart Environments, Context Management and solutions for Devices and Service Interoperability. Multi-vendor devices offer an increasing number of services and end-user applications that base their value on the ability to exploit the information originating from the surrounding environment by means of an increasing number of embedded sensors, e.g. GPS, compass, RFID readers, cameras and so on. However, such devices are usually not able to exchange information because they lack a shared data storage and common information exchange methods. A large number of standards and domain-specific building blocks are available and are heavily used in today's products. However, the use of these solutions based on ready-to-use modules is not without problems. The integration and cooperation of different kinds of modules can be daunting because of growing complexity and dependency. In this scenario it might be interesting to have an infrastructure that makes the coexistence of multi-vendor devices easy, while enabling low-cost development and smooth access to services. This sort of technology glue should reduce both software and hardware integration costs by removing the trouble of interoperability. The result should also lead to faster and simplified design, development, and deployment of cross-domain applications. This thesis is mainly focused on SW architectures supporting context-aware service providers, especially on the following subjects:
- user-preference service adaptation
- context management
- content management
- information interoperability
- multi-vendor device interoperability
- communication and connectivity interoperability
Experimental activities were carried out in several domains including Cultural Heritage and indoor and personal smart spaces, all of which are considered significant test-beds in Context Aware Computing.
The work evolved within European and national projects: on the European side, I carried out my research activity within EPOCH, the FP6 Network of Excellence on "Processing Open Cultural Heritage", and within SOFIA, a project of the ARTEMIS JU on embedded systems. I worked in cooperation with several international establishments, including the University of Kent, VTT (the Technical Research Centre of Finland), and Eurotech. On the national side, I contributed to a one-to-one research contract between ARCES and Telecom Italia. The first part of the thesis focuses on the problem statement and related work, and addresses interoperability issues and related architecture components. The second part focuses on specific architectures and frameworks:
- MobiComp: a context management framework that I used in cultural heritage applications
- CAB: a context, preference, and profile based application broker which I designed within the EPOCH Network of Excellence
- M3: a "Semantic Web based" information sharing infrastructure for smart spaces designed by Nokia within the European project SOFIA
- NoTA: a service and transport independent connectivity framework
- OSGi: the well-known Java based service support framework
The final section is dedicated to the middleware, the tools, and the SW agents developed during my Doctorate to support context-aware services in smart environments.
Abstract:
This research focuses on the definition of the complex relationship between theory and project which, in the architectural work of Oswald Mathias Ungers, is based on several essays and publications that, though never collected in an organic text, make up an articulated corpus, so that it is possible to consider it the foundation of a theory. More specifically, this thesis deals with the role of metaphor in Ungers' theory and its subsequent practical application in his projects. The path leading from theoretical analysis to architectural project is, in Ungers' view, a slow and mediated one, where theory is an instrument without which it would not be possible to create the project's foundations. The metaphor is a figure of speech taken from disciplines such as philosophy, aesthetics, and linguistics. Using a metaphor implies a transfer of meaning, as it is essentially based on the replacement of a real object with a figurative one. The research is articulated in three parts, each corresponding to a text by Ungers that is considered crucial to understanding the development of his architectural thinking. Each text marks one of three decades of Ungers' work: the sixties, seventies, and eighties. The first part of the research deals with the topic of Großform expressed by Ungers in his 1966 publication Grossformen im Wohnungsbau, where he defines four criteria based on which a work of architecture identifies with a Großform. One of the hypotheses underlying this study is that there is a relationship between the notion of Großform and the figure of metaphor. The second part of the thesis analyzes the time between the end of the sixties and the seventies, i.e. the period during which Ungers lived in the USA and taught at Cornell University in Ithaca.
The analysis focuses on the text Entwerfen und Denken in Vorstellungen, Metaphern und Analogien, written by Ungers in 1976 for the exhibition MAN transFORMS organized at the Cooper-Hewitt Museum in New York. This text, through which Ungers creates a sort of vocabulary to explain the notions of metaphor, analogy, sign, symbol, and allegory, can be defined as the manifesto of his architectural theory, the latter being strictly intertwined with the metaphor as a design instrument, which is best expressed when he introduces the 11 theses, with R. Koolhaas, P. Riemann, H. Kollhoff, and A. Ovaska, in Die Stadt in der Stadt. Berlin das grüne Stadtarchipel of 1977. The third part analyzes the indissoluble tie between the use of metaphor and the choice of the topic on which the project is based and, starting from Ungers' 1982 publication Architecture as Theme, explains the relationship between idea/theme and image/metaphor. Playing with shapes requires metaphoric thinking, i.e. taking references from the world of shapes, and not just from architecture, to create new ideas. The metaphor as a tool to interpret reality becomes for Ungers an inquiry method that precedes a project and makes it possible to define the theme on which the project will be based. In Ungers' case, the architecture of ideas matches the idea of architecture; for Ungers the notions of idea and theme, image and metaphor cannot be separated from each other, and the text on the thematization of architecture is not a report on his projects, but represents the need to put them in order and highlight the theme on which they are based.
Abstract:
Neurorehabilitation is a process through which individuals affected by neurological diseases aim at achieving a complete recovery or at realizing their optimal physical, mental, and social potential. Essential elements of effective rehabilitation are: a clinical assessment by a multidisciplinary team, a targeted rehabilitation program, and the evaluation of outcomes through scientifically and clinically appropriate measures. The main objective of this thesis was to develop quantitative methods and tools for the treatment and motor assessment of neurological patients. Conventional rehabilitation treatments require neurological patients to perform repetitive exercises, reducing their motivation. Virtual reality and feedback can engage them in the treatment, allowing repeatability and standardization of protocols. A tool based on augmented feedback for trunk control was developed and evaluated. Moreover, virtual reality allows the treatment to be individualized according to the patient's needs. A virtual application for gait rehabilitation was developed and tested during training with multiple sclerosis patients, assessing its feasibility and acceptance and demonstrating the effectiveness of the treatment. Quantitative assessment of patients' motor abilities is performed using motion capture systems. Since their use in clinical practice is limited, a methodology based on inertial sensors for assessing arm swing in parkinsonian subjects was proposed. These sensors are small, accurate, and flexible, but they accumulate errors during long measurements.
This problem was addressed, and the results suggest that, if the sensor is placed on the foot and the accelerations are integrated starting from the mid-stance phase, the error and its consequences for the determination of spatial parameters remain limited. Finally, a validation of the Kinect for gait tracking in a virtual environment was presented. Preliminary results allow the field of use of the sensor in rehabilitation to be defined.
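The foot-sensor result can be illustrated with a toy double integration that starts at mid stance, when the foot is momentarily at rest, so the initial velocity is known to be zero and drift stays bounded over a single stride. The signal and sampling rate below are synthetic, not data from the thesis:

```python
# Double-integrate a (synthetic) forward acceleration signal starting at
# mid stance, where the zero-velocity assumption holds.

def stride_displacement(acc, dt):
    """Integrate acceleration (m/s^2) sampled every dt seconds twice,
    assuming zero velocity at the first sample (mid stance)."""
    v = 0.0          # zero-velocity assumption at mid stance
    d = 0.0
    for a in acc:
        v += a * dt  # first integration: velocity
        d += v * dt  # second integration: displacement
    return d

# toy swing: accelerate forward for 0.5 s, then decelerate back to rest
acc = [2.0] * 50 + [-2.0] * 50   # m/s^2
dt = 0.01                        # 100 Hz sampling
print(round(stride_displacement(acc, dt), 2))   # → 0.5
```

Without the mid-stance reset, any bias in the accelerometer would be integrated twice and grow quadratically with measurement time.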
Abstract:
Web-based ticketing
Abstract:
A range of societal issues has been caused by fossil fuel consumption in the transportation sector in the United States (U.S.), including health-related air pollution, climate change, dependence on imported oil, and other oil-related national security concerns. Biofuel production from various lignocellulosic biomass types, such as wood, forest residues, and agricultural residues, has the potential to replace a substantial portion of total fossil fuel consumption. This research focuses on locating biofuel facilities and designing the biofuel supply chain to minimize the overall cost. For this purpose, an integrated methodology was proposed that combines GIS technology with simulation and optimization modeling methods. As a precursor to simulation and optimization modeling, the GIS-based methodology was used to preselect potential biofuel facility locations for biofuel production from forest biomass. Candidate locations were selected based on a set of evaluation criteria, including: county boundaries, a railroad transportation network, a state/federal road transportation network, water body (rivers, lakes, etc.) dispersion, city and village dispersion, a population census, biomass production, and no co-location with co-fired power plants. The resulting candidate sites then served as inputs for the simulation and optimization modeling. The simulation and optimization models were built around key supply activities, including biomass harvesting/forwarding, transportation, and storage. The built onsite storage served the spring breakup period, when road restrictions were in place and truck transportation on certain roads was limited.
Both models were evaluated using multiple performance indicators, including cost (consisting of the delivered feedstock cost and inventory holding cost), energy consumption, and GHG emissions. The impacts of energy consumption and GHG emissions were expressed in monetary terms to keep them consistent with cost. Compared with the optimization model, the simulation model provides a more dynamic look at a 20-year operation by considering the impacts associated with building inventory at the biorefinery to address the limited availability of biomass feedstock during the spring breakup period. The number of trucks required per day was estimated, and the year-round inventory level was tracked. Through the exchange of information across different procedures (harvesting, transportation, and biomass feedstock processing), a smooth flow of biomass from harvesting areas to a biofuel facility was implemented. The optimization model was developed to address issues related to locating multiple biofuel facilities simultaneously. The size of a potential biofuel facility is set with an upper bound of 50 MGY and a lower bound of 30 MGY. The optimization model is a static, Mathematical Programming Language (MPL)-based application which allows for sensitivity analysis by changing inputs to evaluate different scenarios. It was found that annual biofuel demand and biomass availability impact the optimal biofuel facility locations and sizes.
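As a toy stand-in for the siting decision, the brute-force search below picks the subset of candidate facilities that meets demand at the lowest fixed-plus-transport cost. All data are invented, and the thesis used an MPL-based optimization model rather than enumeration, which only works for a handful of candidates:

```python
# Illustrative brute-force facility siting: enumerate subsets of
# candidate sites, greedily ship all biomass to the cheapest open sites,
# and keep the cheapest subset that meets demand. Data are hypothetical.
from itertools import combinations

# candidate sites: (annual fixed cost, capacity in MGY)
sites = {"A": (40, 50), "B": (35, 30), "C": (45, 50)}
# transport cost per MGY from each harvest area to each site
transport = {("north", "A"): 2, ("north", "B"): 5, ("north", "C"): 4,
             ("south", "A"): 6, ("south", "B"): 3, ("south", "C"): 2}
supply = {"north": 40, "south": 40}   # MGY equivalents of biomass
demand = 60                           # total annual biofuel demand (MGY)

def cheapest_assignment(open_sites):
    """Greedily ship each area's biomass to its cheapest open site with
    remaining capacity; returns None if demand cannot be met.
    (Simplification: all available biomass is shipped.)"""
    cap = {s: sites[s][1] for s in open_sites}
    cost, shipped = 0, 0
    for area, qty in supply.items():
        for s in sorted(open_sites, key=lambda s: transport[(area, s)]):
            take = min(qty, cap[s])
            cost += take * transport[(area, s)]
            cap[s] -= take
            qty -= take
            shipped += take
    return cost if shipped >= demand else None

best = None
for r in range(1, len(sites) + 1):
    for combo in combinations(sites, r):
        t = cheapest_assignment(combo)
        if t is None:
            continue
        total = sum(sites[s][0] for s in combo) + t
        if best is None or total < best[0]:
            best = (total, combo)

print(best)   # → (245, ('A', 'C'))
```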
Abstract:
The fuzzy online reputation analysis framework, or “foRa” (plural of forum, the Latin word for marketplace) framework, is a method for searching the Social Web to find meaningful information about reputation. Based on an automatic, fuzzy-built ontology, this framework queries the social marketplaces of the Web for reputation, combines the retrieved results, and generates navigable Topic Maps. Using these interactive maps, communications operatives can zero in on precisely what they are looking for and discover unforeseen relationships between topics and tags. Thus, using this framework, it is possible to scan the Social Web for a name, product, brand, or combination thereof and determine query-related topic classes with related terms and thus identify hidden sources. This chapter also briefly describes the youReputation prototype (www.youreputation.org), a free web-based application for reputation analysis. In the course of this, a small example will explain the benefits of the prototype.
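The fuzzy matching step behind such query expansion can be hinted at in a few lines. Here Python's difflib similarity ratio stands in for the framework's fuzzy-built ontology, and the tag list and the 0.75 cut-off are invented:

```python
# Sketch of fuzzy query expansion: match a reputation query against tags
# gathered from social marketplaces and collect the related terms.
# difflib is a stand-in for the framework's actual fuzzy ontology.
from difflib import SequenceMatcher

tags = ["acme-corp", "acmecorp", "acme corp reviews", "acme scandal",
        "ac-me support", "unrelated-brand"]

def similar(a, b, cutoff=0.75):
    return SequenceMatcher(None, a, b).ratio() >= cutoff

def topic_class(query, tags):
    """Return tags fuzzily related to the query (one hop)."""
    return [t for t in tags if similar(query, t) or query in t]

print(topic_class("acme corp", tags))
```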
Abstract:
BACKGROUND The majority of radiological reports lack a standard structure. Even within a specialized area of radiology, each report has its individual structure with regard to detail and order, often containing too much non-relevant information that the referring physician is not interested in. Structured reporting might be advantageous for gathering relevant clinical key parameters efficiently or for supporting long-term therapy monitoring. OBJECTIVE Despite new technologies in medical information systems, medical reporting is still not dynamic. To improve the quality of communication in radiology reports, a new structured reporting system was developed for abdominal aortic aneurysms (AAA), intended to enhance professional communication by providing the pertinent clinical information in a predefined standard. METHODS An analysis of the current state was performed within the departments of radiology and vascular surgery by developing a Technology Acceptance Model. The SWOT (strengths, weaknesses, opportunities, and threats) analysis focused on optimization of the radiology reporting of patients with AAA. Clinical parameters were defined by interviewing experienced clinicians in radiology and vascular surgery. For evaluation, a focus group (4 radiologists) reviewed the reports of 16 patients. The usability and reliability of the method were validated in a real-world test environment in the field of radiology. RESULTS A Web-based application for radiological structured reporting (SR) was successfully standardized for AAA. Its organization comprises three main categories: characteristics of the pathology and adjacent anatomy, measurements, and additional findings. Different graphical widgets (eg, drop-down menus) in each category facilitate predefined data entry. Measurement parameters shown in a diagram can be defined for clinical monitoring and adduced for quick adjudications.
Figures for optional use to guide and standardize the reporting are embedded. Analysis of variance shows that the average time required to obtain a radiological report is lower with SR than with free-text reporting (P=.0001). Questionnaire responses confirm a high acceptance rate among users. CONCLUSIONS The new SR system may support efficient radiological reporting for initial diagnosis and follow-up of AAA. Perceived advantages of our SR platform are its ease of use, which may lead to more accurate decision support. The new system is open to communicate not only with clinical partners but also with Radiology Information Systems and Hospital Information Systems.
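A hypothetical mini-model of such a structured AAA report, following the three categories named above, could look like the sketch below. The field names and permitted values are invented stand-ins for the platform's actual drop-down entries:

```python
# Hypothetical data model for a structured AAA report: constrained
# fields mirror drop-down entries, and measurements support follow-up.
from dataclasses import dataclass, field

ANEURYSM_SHAPES = {"fusiform", "saccular"}   # stand-in drop-down values

@dataclass
class AAAReport:
    shape: str                               # pathology characteristics
    max_diameter_mm: float                   # measurement for monitoring
    additional_findings: list = field(default_factory=list)

    def __post_init__(self):
        # reject free-text values outside the predefined vocabulary
        if self.shape not in ANEURYSM_SHAPES:
            raise ValueError(f"unknown shape: {self.shape}")

    def growth_since(self, previous):
        """Diameter change versus an earlier report, for follow-up."""
        return self.max_diameter_mm - previous.max_diameter_mm

baseline = AAAReport("fusiform", 42.0)
followup = AAAReport("fusiform", 45.5, ["mural thrombus"])
print(followup.growth_since(baseline))   # → 3.5
```

Constraining each entry to a fixed vocabulary is what makes such reports machine-comparable across follow-ups, in contrast to free-text reporting.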
Abstract:
Introduction Since the quality of patient portrayal by standardized patients (SPs) during an Objective Structured Clinical Exam (OSCE) has a major impact on the reliability and validity of the exam, quality control should be initiated. Literature about quality control of SPs' performance focuses on feedback [1, 2] or completion of checklists [3, 4]. Since we did not find a published instrument meeting our needs for the assessment of patient portrayal, we developed such an instrument after being inspired by others [5] and used it in our high-stakes exam. Methods SP trainers from all five Swiss medical faculties collected and prioritized quality criteria for patient portrayal. Items were revised with the partners twice, based on experiences during OSCEs. The final instrument contains 14 criteria for acting (i.e. adequate verbal and non-verbal expression) and standardization (i.e. verbatim delivery of the first sentence). All partners used the instrument during a high-stakes OSCE. Both SPs and trainers were introduced to the instrument. The tool was used in training (more than 100 observations) and during the exam (more than 250 observations). FAIR_OSCE The list of items to assess the quality of the simulation by SPs was primarily developed and used to provide formative feedback to the SPs in order to help them improve their performance. It was therefore named "Feedback structure for the Assessment of Interactive Role play in Objective Structured Clinical Exams" (FAIR_OSCE). It was also used to assess the quality of patient portrayal during the exam. The results were calculated for each of the five faculties individually. Formative evaluation was given to the five faculties with individual feedback, without revealing results of other faculties beyond the overall results. Results High quality of patient portrayal during the exam was documented. More than 90% of SP performances were rated as completely correct or sufficient.
An increase in the quality of performance between training and exam was noted. For example, the rate of completely correct reactions in medical tests increased from 88% to 95%. These 95% completely correct reactions, together with 4% sufficient reactions, add up to 99% of the reactions meeting the requirements of the exam. SP educators using the instrument reported an improvement in SPs' performance induced by the use of the instrument. Disadvantages mentioned were the high concentration needed to explicitly observe all criteria and the cumbersome handling of the paper-based forms. Conclusion We were able to document a very high quality of SP performance in our exam. The data also indicate that our training is effective. We believe that the high concentration needed to use the instrument is well invested, considering the observed improvement in performance. The development of an iPad-based application for the form is planned to address the cumbersome handling of the paper forms.
Abstract:
Introduction Since the quality of patient portrayal by standardized patients (SPs) during an Objective Structured Clinical Exam (OSCE) has a major impact on the reliability and validity of the exam, quality control should be initiated. Literature about quality control of SPs' performance focuses on feedback [1, 2] or completion of checklists [3, 4]. Since we did not find a published instrument meeting our needs for the assessment of patient portrayal, we developed such an instrument after being inspired by others [5] and used it in our high-stakes exam. Project description SP trainers from five medical faculties collected and prioritized quality criteria for patient portrayal. Items were revised twice, based on experiences during OSCEs. The final instrument contains 14 criteria for acting (i.e. adequate verbal and non-verbal expression) and standardization (i.e. verbatim delivery of the first sentence). All partners used the instrument during a high-stakes OSCE. SPs and trainers were introduced to the instrument. The tool was used in training (more than 100 observations) and during the exam (more than 250 observations). Outcome High quality of SPs' patient portrayal during the exam was documented. More than 90% of SP performances were rated as completely correct or sufficient. An increase in the quality of performance between training and exam was noted. For example, the rate of completely correct reactions in medical tests increased from 88% to 95%. Together with 4% sufficient performances, these 95% add up to 99% of the reactions in medical tests meeting the standards of the exam. SP educators using the instrument reported an improvement in SPs' performance induced by the use of the instrument. Disadvantages mentioned were the high concentration needed to observe all criteria and the cumbersome handling of the paper-based forms. Discussion We were able to document a very high quality of SP performance in our exam.
The data also indicate that our training is effective. We believe that the high concentration needed to use the instrument is well invested, considering the observed enhancement of performance. The development of an iPad-based application for the form is planned to address the cumbersome handling of the paper forms.
Abstract:
Voting power is commonly measured using a probability. But what kind of probability is this? Is it a degree of belief or an objective chance or some other sort of probability? The aim of this paper is to answer this question. The answer depends on the use to which a measure of voting power is put. Some objectivist interpretations of probabilities are appropriate when we employ such a measure for descriptive purposes. By contrast, when voting power is used to normatively assess voting rules, the probabilities are best understood as classical probabilities, which count possibilities. This is so because, from a normative stance, voting power is most plausibly taken to concern rights and thus possibilities. The classical interpretation also underwrites the use of the Bernoulli model upon which the Penrose/Banzhaf measure is based.
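For concreteness, the Penrose/Banzhaf measure counts, over all equally likely coalitions of the other voters (the classical, possibility-counting interpretation the paper endorses for normative use), how often a given voter is a swing. A minimal implementation with a made-up weighted voting example:

```python
# Banzhaf voting power: for each voter, count the coalitions of the
# other voters that the voter's weight turns from losing into winning,
# and divide by the number of such coalitions (2^(n-1) possibilities).
from itertools import combinations

def banzhaf(weights, quota):
    """Probability each voter is decisive, counting every coalition of
    the others as one equally likely possibility."""
    n = len(weights)
    power = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        swings = 0
        for r in range(n):
            for coal in combinations(others, r):
                s = sum(weights[j] for j in coal)
                if s < quota <= s + weights[i]:   # i turns loss into win
                    swings += 1
        power.append(swings / 2 ** (n - 1))       # classical probability
    return power

# three voters with weights 3, 2, 1 and majority quota 4
print(banzhaf([3, 2, 1], 4))   # → [0.75, 0.25, 0.25]
```

Note how the two smaller voters come out equally powerful despite unequal weights, which is exactly the kind of fact the measure is meant to expose.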
Abstract:
We determined changes in equatorial Pacific phosphorus (µmol P/g) and barite (BaSO4; wt%) concentrations at high resolution (2 cm) across the Paleocene/Eocene (P/E) boundary in sediments from Ocean Drilling Program (ODP) Leg 199 Site 1221 (153.40 to 154.80 meters below seafloor [mbsf]). Oxide-associated, authigenic, and organic P sequentially extracted from bulk sediment were used to distinguish reactive P from detrital P. We separated barite from bulk sediment and compared its morphology with that of modern unaltered biogenic barite to check for diagenesis. On a CaCO3-free basis, reactive P concentrations are relatively constant and high (323 µmol P/g or ~1 wt%). Barite concentrations range from 0.05 to 5.6 wt%, calculated on a CaCO3-free basis, and show significant variability over this time interval. Shipboard measurements of P and Ba in bulk sediments are systematically lower (by ~25%) than shore-based concentrations and likely indicate problems with shipboard standard calibrations. The presence of Mn oxides and the size, crystal morphology, and sulfur isotopes of barite imply deposition in sulfate-rich pore fluids. Relatively constant reactive P, organic C, and biogenic silica concentrations calculated on a CaCO3-free basis indicate generally little variation in organic C, reactive P, and biogenic opal burial across the P/E boundary, whereas variable barite concentrations indicate significant changes in export productivity. Low barite Ba/reactive P ratios before and immediately after the Benthic Extinction Event (BEE) may indicate efficient nutrient burial and, if nutrient burial and organic C burial are linked, high relative organic C burial that could temporarily draw down CO2 at this site. This interpretation requires postdepositional oxidation of organic C, because organic C to reactive P ratios are low throughout the section.
After the BEE, higher barite Ba/reactive P ratios combined with higher barite Ba concentrations may imply that higher export productivity was coupled with unchanged reactive P burial, indicating efficient nutrient and possibly also organic C recycling in the water column. If nutrient recycling is decoupled from organic C, the high export production could be indicative of CO2 drawdown. However, the observation that organic C burial is not high where barite burial is high may imply either that C sequestration was restricted to the deep ocean and thus occurred only on timescales of deep-ocean mixing, or that postdepositional oxidation (burn-down) of organic matter affected the sediments. The decoupling of barite and opal may result from low opal preservation or from production that is not diatom based.