Abstract:
In the EU, as matters stand today, we can speak only of a "transfer union" and not of a single market. Euro-denominated money flows also transmit competitiveness in a distorted way: the work and performance embodied in goods and assets, and above all in financial instruments, are priced badly. In such a framework, what we call the free-rider problem emerges especially easily: one can consume without delivering actual or measurable performance, or without paying, and can obtain free resources too cheaply. The eurozone is also imperfect in many of its transmission mechanisms. Among the many member states caught in the sovereign-debt squeeze there are small, medium-sized and large ones alike. This fact, together with the general growth and labour-market problems, clearly signals "system-level disturbances", which in this paper we call a performance transmission problem, and which we therefore help to interpret with an unusual but all the more telling analogy to an electric power transmission system. We show why a well-run large corporation is a better financial planner than a state of the same size. _____ Why are ill-defined transfer mechanisms diverting valuable assets and resources to the wrong destinations within the EU? Why do we witness ongoing pressure in the EU banking sector and in government finances? We offer an unusual answer to these questions: we apply an analogy from physics (an electricity generation and distribution network) to show, respectively, the transmission inefficiency and the waste of the EU distribution mechanisms. We demonstrate that there are inherent flaws in both the measurement and the distribution of assets and resources among the key EU markets: the goods, money and factor markets. In addition, we find that when the international equalizer mechanism is at work (cohesion funds are allocated), many of these equity functions are at risk with respect to their reliable measurement. The metered load factors are especially at risk, as are the loss/waste factors. The map of desired outcomes does not match the real outcomes, since EU transfers are in general put to work with low efficiency.
Abstract:
Arctic soils store close to 14% of the global soil carbon, most of it below ground in permafrost. With climate warming, the decomposition of this soil carbon could represent a significant positive feedback to global greenhouse warming. Recent evidence has shown that the temperature of the Arctic is already increasing, a change associated mostly with anthropogenic activities. Warmer soils will contribute to permafrost degradation and accelerate organic matter decay, and thus increase the flux of carbon dioxide and methane into the atmosphere. Temperature and water availability are also important drivers of ecosystem performance, but their effects can be complex and in opposition. Temperature and moisture changes can affect ecosystem respiration (ER) and gross primary productivity (GPP) independently; an increase in net ecosystem exchange can result from either a decrease in ER or an increase in GPP. Therefore, understanding the effects of changes in ecosystem water and temperature on the carbon flux components becomes key to predicting the responses of the Arctic to climate change. The overall goal of this work was to determine the response of arctic systems to simulated climate change scenarios with simultaneous changes in temperature and moisture. A temperature and hydrological manipulation in a naturally drained lakebed was used to assess the short-term effects of changes in water and temperature on the carbon cycle. Also, as part of the International Tundra Experiment (ITEX) network, I determined the long-term effect of warming on the carbon cycle in a natural hydrological gradient established in the mid-1990s. I found that the carbon balance is highly sensitive to short-term changes in water table and warming. Over longer time periods, however, hydrological and temperature changes altered soil biophysical properties, nutrient cycles, and other ecosystem structural and functional components, down-regulating GPP and ER, especially in wet areas.
Abstract:
This paper synthesizes research conducted during the first 5–6 years of the Florida Coastal Everglades Long-Term Ecological Research Program (FCE LTER). My objectives are to review our research to date, and to present a new central theme and conceptual approach for future research. Our research has focused on understanding how dissolved organic matter (DOM) from upstream oligotrophic marshes interacted with a marine source of the limiting nutrient, phosphorus (P), to control productivity in the oligohaline estuarine ecotone. We have been working along freshwater to marine transects in two drainage basins located in Everglades National Park (ENP). The Shark River Slough transect (SRS) has a direct connection to the Gulf of Mexico, providing this estuarine ecotone with a source of marine P. The oligohaline ecotone along our southern Everglades transect (TS/Ph), however, is separated from this marine P source by the Florida Bay estuary. We originally hypothesized an ecosystem productivity peak in the SRS ecotone, driven by the interaction of marine P and Everglades DOM, but no such productivity peak in the TS/Ph ecotone because of this lack of marine P. Our research to date has tended to show the opposite pattern, however, with many ecosystem components showing enhanced productivity in the TS/Ph ecotone, but not in the SRS ecotone. Water column P concentrations followed a similar pattern, with unexpectedly high P in the TS/Ph ecotone during the dry season. Our organic geochemical research has shown that Everglades DOM is more refractory than originally hypothesized. We have also begun to understand the importance of detrital organic matter production and transport to ecotone dynamics and as the base of aquatic food webs. Our future research will build on this substantial body of knowledge about these oligotrophic estuaries. We will direct our efforts more strongly toward biophysical dynamics in the oligohaline ecotone regions. Specifically, we will be focusing on inputs to these regions from four primary water sources: freshwater Everglades runoff, net precipitation, marine inputs, and groundwater. We hypothesize that dry-season groundwater inputs of P will be particularly important to TS/Ph ecotone dynamics because of longer water residence times in this area. Our organic geochemical, biogeochemical, and ecosystem energetics work will focus more strongly on the importance of detrital organics and will take advantage of a key Everglades Restoration project, scheduled for 2008 or 2009, that will increase freshwater inputs to our SRS transect only. Finally, we will also begin to investigate the human dimensions of restoration, and of a growing population in south Florida that will become increasingly dependent on the Everglades for critical ecosystem services (including fresh water) even as its growth presents challenges to Everglades sustainability.
Abstract:
In this article, I offer an institutional history of the ecosystem concept, tracing shifts in its meaning and application as it has become the key organizing principle for the Everglades restoration program in Florida. Two institutional forms are analyzed here: (1) quasigovernmental organizations, a term I use to describe interagency science collaboratives and community stakeholder organizations, and (2) government bureaucracies, which are the administrative agencies tasked with Everglades restoration planning and implementation. In analyzing these knowledge trajectories, I both document the complex networks of relations that facilitate the ecosystem’s emergence as an object of knowledge and examine the bureaucratic claims to authority that circumscribe the ecosystem’s transformation into policy.
Abstract:
In his dialogue, Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors, William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, initially states: "The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools." At the time of this writing, Associate Professor O'Brien will have you know that, contrary to what some people might think, the computer revolution is not over; it is just beginning, just an embryo. Computer technology will only continue to develop and expand, says O'Brien, citing the literature. "A complacent few of us who feel 'we have survived the computer revolution' will miss opportunities as a new wave of technology moves through the hospitality industry," says O'Brien. "Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation," is his informed opinion. Property managers who embrace rather than eschew innovation, in this case computer technology, will benefit greatly from this new science in hospitality management, O'Brien says. "The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology," he advises. On the vendor side of the equation, O'Brien observes, "Computer-wise hospitality managers want systems which are easier and more profitable to operate. Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…" he says. O'Brien warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has in effect morphed into a software revolution, O'Brien notes, "…recognizing that a computer is nothing but a box in which programs run." Yes, some of the empirical data in this article is dated by now, but the core philosophy of advancing technology, and of properties continually tapping current knowledge, remains sound.
Abstract:
This dissertation studies the manipulation of particles using acoustic stimulation for applications in microfluidics and the templating of devices. The term particle is used here to denote any solid, liquid or gaseous material that has properties distinct from the fluid in which it is suspended. Manipulation means taking control of the particles' movement and positioning them in specified locations. Using devices microfabricated out of silicon, the behavior of particles under acoustic stimulation was studied with the main purpose of aligning the particles at either low-pressure zones, known as nodes, or high-pressure zones, known as antinodes. By aligning particles at the nodes in a flow system, the particles can be focused at the center or walls of a microchannel in order to ultimately separate them. These separations are of high scientific importance, especially in the biomedical domain, since acoustophoresis provides a unique approach to separating based on density and compressibility, unparalleled by other techniques. The control and alignment of particles in various geometries and configurations was successfully achieved by controlling the acoustic waves. Apart from their use in flow systems, a stationary suspended-particle device was developed to provide controllable light transmittance based on acoustic stimuli. Using a glass compartment and a carbon-particle suspension in an organic solvent, the device responded to acoustic stimulation by aligning the particles. The alignment of light-absorbing carbon particles afforded an increase in visible light transmittance as high as 84.5%, controlled by adjusting the frequency and amplitude of the acoustic wave. The device also demonstrated alignment memory, rendering it energy-efficient. A similar device for particles suspended in a monomer enabled the development of electrically conductive films based on networks of conductive particles. Elastomers doped with conductive metal particles were rendered surface-conductive at particle loadings as low as 1% by weight using acoustic focusing. The resulting films were flexible, had transparencies exceeding 80% in the visible spectrum (400-800 nm), and had electrical bulk conductivities exceeding 50 S/cm.
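To make the node/antinode alignment concrete: in a standing acoustic wave, pressure nodes sit half a wavelength apart, and the sign of the acoustic contrast factor (combining the density and compressibility mismatch between particle and fluid) determines whether a particle collects at nodes or antinodes. The sketch below encodes these standard textbook relations (Gor'kov form); the material values are illustrative, not taken from the dissertation.

```python
def contrast_factor(rho_p, kappa_p, rho_f, kappa_f):
    """Acoustic contrast factor Phi (Gor'kov form).

    Positive Phi -> particle migrates to pressure nodes;
    negative Phi -> pressure antinodes.
    rho: density [kg/m^3]; kappa: compressibility [1/Pa].
    """
    f1 = 1.0 - kappa_p / kappa_f                         # monopole term
    f2 = 2.0 * (rho_p - rho_f) / (2.0 * rho_p + rho_f)   # dipole term
    return f1 / 3.0 + f2 / 2.0

def node_spacing(frequency_hz, sound_speed=1480.0):
    """Pressure nodes of a standing wave sit half a wavelength apart."""
    return sound_speed / (2.0 * frequency_hz)

# Illustrative values: a polystyrene-like bead in water at 2 MHz.
phi = contrast_factor(rho_p=1050.0, kappa_p=2.5e-10,
                      rho_f=998.0, kappa_f=4.5e-10)
print(f"contrast factor = {phi:+.3f} (positive -> focuses at nodes)")
print(f"node spacing at 2 MHz = {node_spacing(2e6) * 1e6:.0f} um")
```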
Abstract:
Ocean acidification, the drop in seawater pH associated with the ongoing enrichment of marine waters with carbon dioxide from fossil fuel burning, may seriously impair marine calcifying organisms. Our present understanding of the sensitivity of marine life to ocean acidification is based primarily on short-term experiments, in which organisms are exposed to increased concentrations of CO2. However, phytoplankton species with short generation times, in particular, may be able to respond to environmental alterations through adaptive evolution. Here, we examine the ability of the world's single most important calcifying organism, the coccolithophore Emiliania huxleyi, to evolve in response to ocean acidification in two 500-generation selection experiments. Specifically, we exposed E. huxleyi populations founded by single or multiple clones to increased concentrations of CO2. Around 500 asexual generations later we assessed their fitness. Compared with populations kept at ambient CO2 partial pressure, those selected at increased partial pressure exhibited higher growth rates, in both the single- and multiclone experiment, when tested under ocean acidification conditions. Calcification was partly restored: rates were lower under increased CO2 conditions in all cultures, but were up to 50% higher in adapted compared with non-adapted cultures. We suggest that contemporary evolution could help to maintain the functionality of microbial processes at the base of marine food webs in the face of global change.
Abstract:
Combinatorial designs are used to construct key predistribution schemes for wireless sensor networks, helping to establish secure channels. Private-key cryptography lets a pair of sensor nodes determine a common key. Wireless sensor networks using key predistribution schemes have many useful applications in military and civil operations. When designs are implemented efficiently on sensor networks, the result is blocks with unique keys. One such implementation is a transversal design, which follows the principle of simple key establishment. The analysis of designs and the modeling of key schemes are the subjects of this project.
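As one concrete, hedged illustration of how a transversal design yields simple key establishment (a Lee-Stinson-style TD(k, p) construction; the parameters and helper names are illustrative, not the project's): each node corresponds to a line y = ax + b over Z_p, stores k key identifiers, and any two nodes share at most one key.

```python
from itertools import product

def key_block(a, b, k, p):
    """Key identifiers held by node (a, b) in a TD(k, p)-style scheme.

    Node (a, b) stores the k keys {(x, (a*x + b) % p) : 0 <= x < k}.
    Distinct nodes correspond to distinct lines over Z_p, so two
    nodes share at most one key identifier.
    """
    return {(x, (a * x + b) % p) for x in range(k)}

def shared_keys(n1, n2, k, p):
    """Key identifiers two nodes can use for a direct secure link."""
    return key_block(*n1, k, p) & key_block(*n2, k, p)

p, k = 7, 4                       # p prime, k <= p; gives p*p = 49 nodes
nodes = list(product(range(p), range(p)))
print(len(nodes), "nodes,", k, "keys per node")
print("shared:", shared_keys((1, 2), (3, 5), k, p))   # at most one key
```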
Abstract:
The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues/organs. Precise delineation of treatment and avoidance volumes is key to precision radiation therapy. In recent years, considerable clinical and research effort has been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft-tissue contrast and functional imaging possibilities. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI's clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI imaging implementation and the need for novel DCE-MRI data analysis methods that yield richer functional heterogeneity information.
This study aims at improving current DCE-MRI techniques and developing new DCE-MRI analysis methods for radiotherapy assessment, and is naturally divided into two parts. The first part focuses on DCE-MRI temporal resolution, one of the key DCE-MRI technical factors, and proposes several improvements to it; the second part explores the potential value of image heterogeneity analysis and multiple-PK-model combination for therapeutic response assessment, and develops several novel DCE-MRI data analysis methods.
I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel MR image iterative reconstruction algorithm was studied for DCE-MRI reconstruction. This algorithm builds on the recently developed compressed sensing (CS) theory: by utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed in an iterative fashion under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective study of brain radiosurgery patient DCE-MRI scans under IRB approval, the clinically obtained image data were selected as reference data, and simulated accelerated k-space acquisitions were generated by undersampling the reference images' full k-space with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution was adopted to sample each slice of the full k-space; 2) a Cartesian random sampling grid series with spatiotemporal constraints from adjacent frames was adopted to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated from the undersampled data and from the fully sampled data, respectively. Multiple quantitative measurements and statistical studies were performed to evaluate the accuracy of the PK maps generated from the undersampled data against the PK maps generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. These results suggest that DCE-MRI acceleration using the investigated image reconstruction method is feasible and promising.
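For intuition about the second sampling strategy, here is a minimal sketch of a Cartesian random undersampling mask series at an acceleration factor of four, with a fully sampled low-frequency core and a simple complementarity constraint between adjacent temporal frames; the proportions and the constraint rule are illustrative assumptions, not the dissertation's exact grids.

```python
import numpy as np

def cartesian_masks(n_pe, n_frames, accel=4, center_frac=0.08, seed=0):
    """Random phase-encode masks for a dynamic Cartesian acquisition.

    Each frame keeps n_pe/accel phase-encode lines: a fully sampled
    low-frequency core plus random high-frequency lines chosen, where
    possible, to differ from the previous frame (a simple spatiotemporal
    complementarity constraint).
    """
    rng = np.random.default_rng(seed)
    n_keep = n_pe // accel
    half = int(center_frac * n_pe) // 2
    core = np.arange(n_pe // 2 - half, n_pe // 2 + half)
    masks = np.zeros((n_frames, n_pe), dtype=bool)
    prev = np.array([], dtype=int)
    for t in range(n_frames):
        candidates = np.setdiff1d(np.arange(n_pe), core)
        fresh = np.setdiff1d(candidates, prev)  # avoid last frame's lines
        pool = fresh if len(fresh) >= n_keep - len(core) else candidates
        extra = rng.choice(pool, size=n_keep - len(core), replace=False)
        masks[t, core] = True
        masks[t, extra] = True
        prev = extra
    return masks

m = cartesian_masks(n_pe=256, n_frames=10)
print(m.sum(axis=1))   # 64 lines per frame -> acceleration factor 4
```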
Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for PK parameters with better accuracy and efficiency. This method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is usually presented as an integral expression. The method also includes an advanced Kolmogorov-Zurbenko (KZ) filter to remove potential noise effects in the data, and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and data noise levels. Results showed that at both high temporal resolutions (<1 s) and a clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current methods at clinically relevant noise levels; at high temporal resolutions, its calculation efficiency was superior to current methods by a factor on the order of 10^2. In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that this new method enables accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
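For orientation, the derivative form of the Tofts model, dCt/dt = Ktrans*Cp(t) - kep*Ct(t), can be integrated once and fitted as an ordinary linear least-squares problem. The sketch below implements that standard linearization only; it omits the KZ filtering step described above, and all arrays and values are synthetic.

```python
import numpy as np

def _cumtrapz(y, t):
    """Cumulative trapezoidal integral of y(t), starting at 0."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))))

def fit_tofts_linear(t, cp, ct):
    """Linear least-squares fit of the standard Tofts model.

    Integrating dCt/dt = Ktrans*Cp(t) - kep*Ct(t) once gives
        Ct(t) = Ktrans * int_0^t Cp - kep * int_0^t Ct,
    which is linear in (Ktrans, kep).
    """
    A = np.column_stack([_cumtrapz(cp, t), -_cumtrapz(ct, t)])
    ktrans, kep = np.linalg.lstsq(A, ct, rcond=None)[0]
    return ktrans, kep

# Synthetic check: simulate the model forward, then recover the parameters.
t = np.linspace(0.0, 300.0, 601)                 # 0.5 s temporal resolution
cp = 5.0 * (t / 60.0) * np.exp(-t / 60.0)        # toy arterial input function
ktrans_true, kep_true = 0.25 / 60, 0.60 / 60     # in 1/s
ct = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(len(t) - 1):                      # simple Euler integration
    ct[i + 1] = ct[i] + dt * (ktrans_true * cp[i] - kep_true * ct[i])
print(fit_tofts_linear(t, cp, ct))               # approx (0.00417, 0.0100)
```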
II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part aims at methodology development along two approaches. The first is to develop model-free analysis methods for DCE-MRI functional heterogeneity evaluation. This approach is inspired by the rationale that radiotherapy-induced functional change could be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment/control groups received multiple fraction treatments with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate constant map demonstrated significant differences between the treatment and control groups; when Rényi dimensions were adopted for treatment/control group classification, the achieved accuracy was higher than that obtained using conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed to address the lack of temporal information and the poor calculation efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM performed better overall than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second developed method is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity during contrast agent uptake on DCE images. In the small animal experiment mentioned above, selected parameters from the dynamic FSD analysis showed significant differences between the treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. When dynamic FSD parameters were used, treatment/control classification after the first treatment fraction was better than with conventional PK statistics. These results suggest that this novel method is promising for capturing early therapeutic response.
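As a pointer to the kind of computation involved in the fractal analyses above, here is a minimal box-counting dimension estimate for a binary map; box counting is the q = 0 member of the Rényi dimension family. This is a generic sketch, not the dissertation's Rényi or FSD pipeline.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a 2D binary map.

    Counts occupied s x s boxes for each box size s, then fits the
    slope of log N(s) against log(1/s).
    """
    counts = []
    for s in sizes:
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
    return slope

# Sanity check on a filled square (dimension should be close to 2).
img = np.zeros((128, 128), dtype=bool)
img[32:96, 32:96] = True
print(box_counting_dimension(img))   # ~2.0
```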
The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. Currently, the classic Tofts model or its alternative versions are widely adopted for DCE-MRI analysis as the gold-standard approach for therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. In spite of its richer biological assumptions, its application in therapeutic response assessment has been limited. It is intriguing to combine information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using regional mean PK parameter comparisons. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within the biological subvolume, the PK rate constant derived from the SS model proved superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, novel biomarkers were designed to integrate the PK rate constants from the two models. When evaluated in the biological subvolume, this biomarker reflected significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, as well as its combination with the Tofts model, for therapeutic response assessment.
In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit future routine clinical application of DCE-MRI in radiotherapy assessment.
Abstract:
Quién Es, Quién Somos? Spic'ing into Existence claims a four-fold close-reading: first, an analysis of texts, from theoretical meditations to (prison) memoir and film. Second, a half dozen central figures appear, largely Latinx and black American. They cut across a score of registers, socio-economics, ideological reservations, but all are, as Carl Carlton sang, poetry in motion. Writers, poets, theologians, pathologists, artists, comedians, actors, students whose vocation is invocation, the inner surge of their calling. Third, the manuscript draws from a series of historical moments—from the radical liberation of the late 60s to contemporary student activism. Finally, this body of work is movement, in all its social, gestural, and kinesthetic viscera. From this last heading, we peel away layers of what I call the ethnopoet, the fascia undoing that reveals its bio-political anatomy, dressing its bare life with kinship speech. First, the social revolutions of the Civil Rights, Black Power, abolitionism, the Black Panthers and Young Lords, boycotts and jarring artistic performances. These events are superficial not in a vain sense, but as key epicenters of underground murmurings, the workings of a cunning assailant. She robs not lavish estates, but another day to breathe. Gesturally, as perhaps the interlocutor, lies this author, interspersing his own diatribes to conjure her presence. The final branch is admittedly the most intangible. Kinesthetically, we map the nimbleness, the footwork ligera of what I call the ethnopoet. The ethnopoet is no mere aggregate of ethnicity and poetry, but, like a chemical reaction, the descriptor for its behavior under certain pressures, temperatures, and elements. Elusive, resisting confinement and therefore definition, the ethnopoet is a shapeshifting figure of how racialized bodies [people of color] respond to hegemonic powers. She is, at bottom, however, a native translator, the plural-lensed subject whose loyalty is only to the imagination of a different world, one whose survival is not contingent upon her exploitation. The native translator's constant re-calibrations of oppressive power apparatuses seem taxing at best and near-impossible at worst. To effectively navigate through these polarized loci, she must identify ideologies that in turn seek "affective liberatory stances" in relation to the dominant social order (43). In a kind of performative contradiction, she must marshal the knowledge necessary to "break with ideology" while speaking within it. Chicana Studies scholar Chela Sandoval describes this dual movement as "meta-ideologizing": the appropriation of hegemonic ideological forms in order to transform them (82). Nuestros padres se subieron encima de La Bestia, y por eso somos pasajeros en ese tren. Y ya, dentro de su panza, tenemos que ser vigilantes cuando plantamos las bombas. In Methodology of the Oppressed, Sandoval schematizes this oppositional consciousness around five principal categories: "equal rights," "revolutionary," "supremacist," "separatist," and "differential." Taken by themselves, the first four modes appear mutually exclusive, incapable of occupying the same plane, until a fifth pillar emerges. Cinematographic in nature, differential consciousness, as Sandoval defines it, is "a kinetic motion that maneuvers, poetically transfigures, and orchestrates while demanding alienation, perversion, and reformation in both spectators and practitioners" (44).
For Sandoval, then, differential consciousness is a methodology that privileges an incredible sense of mobility, one reaching artistic sensibilities. Our fourth and final analytic, movement, serves as an apt example of this dual meaning. Lexically speaking, 'movement' may be regarded as a political mobilization of aggrieved populations (through sustained efforts), or as the process of moving objects (people or otherwise) from one location to another. Praxis-wise, it is both action and ideal, content and form. Thus, an ethnic poetics must be regarded less as a series of stanzas, shortened lyric, or even an arrangement of language, than as a lens through which peripheralized peoples kaleidoscope ideological positions in an "original, eccentric, and queer sight" (43). Taking note of the advantages of postponing identifications, the thesis stands its ground on the term ethnopoet. Its abstraction is not dewy-eyed philosophy, but an anticipation of poetic justice, of what's to come from callused hands. This thesis is divided into 7.5 chapters. The first maps out the ethnopoet's cartographies of struggle. By revisiting that alleged Tío Tomás, Richard Rodriguez, we unearth the tensions that, negatively, deny citizenship to one silo but, on the flip side, engender manifold ways of seeing, hearing, and moving. The second, through George Jackson's prison memoirs, pans out from this ethnography of power, groping for an apparatus that feigns an impervious prestige: 'the aesthetic regime of coercion.' In a half-way cut, the thesis sidesteps to spic into existence, formally announcing, through Aimé Césaire, myself, and Pedro Pietri, the poeticization of trauma. Such uplift denies New Age transcendence of self; it is, rather, a rehearsal of our entrapment in these mortal envelopes. Thirdly, conscious of the bleeding ethnic body, we cut open the incipient corpse to observe her pathologist. Her native autopsies offer the ethnic body's posthumous recognition, the ethnopoet's ability to speak for and through the dead. Chapter five examines prolific black artists—Beyoncé and Kendrick Lamar—who elide the circumvention of their consumption by invoking radical black hi/her-stories, ones fragmenting the black body. Sixth, the paper compares the Black Power salute of the 1968 Mexico City Olympics to Duke's Mi Gente boycott of its Latino Student Recruitment Weekend. Both wielded "silent gestures" that shrewdly interfered with the white noise of numbed negligence. Finally, 'taking the mask off' her functionalities, the CODA expounds on the ethnopoet's interiority, particularly after the rapid re-calibration of her politics. Through a rerun of El Chavo del Ocho, one of Mexican television's most cherished shows, we tune into the heart-breaking indigence of barrio residents, only to marvel at the power of humor to, as Friday's John Witherspoon put it, "fight another day." This thesis is the tip of my tongue. Y por una vez, déjala que cante.
Abstract:
Cold-water corals are amongst the most three-dimensionally complex deep-sea habitats known and are associated with high local biodiversity. Despite their importance as ecosystem engineers, little is known about how these organisms will respond to projected ocean acidification. Since preindustrial times, average ocean pH has already decreased from 8.2 to ~8.1, and predicted CO2 emissions will decrease it by up to another 0.3 pH units by the end of the century. This decrease in pH may have a wide range of impacts upon marine life, and in particular upon calcifiers such as cold-water corals. Lophelia pertusa is the most widespread cold-water coral (CWC) species, frequently found in the North Atlantic. The data here comprise a short-term (21-day) data set on the metabolism and net calcification rates of freshly collected L. pertusa from the Mingulay Reef Complex, Scotland. These data will help define the impact of ocean acidification upon the growth, physiology and structural integrity of this key reef-framework-forming species.
Abstract:
It has been proposed that increasing levels of pCO2 in the surface ocean will lead to more partitioning of the organic carbon fixed by marine primary production into the dissolved rather than the particulate fraction. This process may result in enhanced accumulation of dissolved organic carbon (DOC) in the surface ocean and/or concurrent accumulation of transparent exopolymer particles (TEPs), with important implications for the functioning of the marine carbon cycle. We investigated this in shipboard bioassay experiments that considered the effect of four different pCO2 scenarios (ambient, 550, 750 and 1000 µatm) on unamended natural phytoplankton communities from a range of locations in the northwest European shelf seas. The environmental settings, in terms of nutrient availability, phytoplankton community structure and growth conditions, varied considerably between locations. We did not observe any strong or consistent effect of pCO2 on DOC production. There was a significant but highly variable effect of pCO2 on the production of TEPs. In three of the five experiments, variation of TEP production between pCO2 treatments was caused by the effect of pCO2 on phytoplankton growth rather than a direct effect on TEP production. In one of the five experiments, there was evidence of enhanced TEP production at high pCO2 (twice as much production over the 96 h incubation period in the 750 µatm treatment compared with the ambient treatment) independent of indirect effects, as hypothesised by previous studies. Our results suggest that the environmental setting of experiments (community structure, nutrient availability and occurrence of phytoplankton growth) is a key factor determining the TEP response to pCO2 perturbations.
Abstract:
The research addresses the impact of long-term reward patterns on the contents of personal work goals among young Finnish managers (N = 747). Reward patterns were formed on the basis of perceived and objective career rewards (i.e., career stability and promotions) across four measurements (2006–2012). Goals were measured in 2012 and classified into categories of competence, progression, well-being, job change, job security, organization, and financial goals. Factor mixture analysis identified a three-class solution as the best model of reward patterns: High rewards (77%), Increasing rewards (17%), and Reducing rewards (7%). Participants with Reducing rewards reported more progression, well-being, job change, and financial goals than participants with High rewards, as well as fewer competence and organizational goals than participants with Increasing rewards. Workplace resources can play a key role in facilitating goals directed at building competence and organizational performance.
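For readers unfamiliar with the modeling step, the flavor of selecting a latent-class solution can be sketched with an ordinary Gaussian mixture over the four repeated reward measurements, choosing the number of classes by BIC. This is a simplified stand-in for factor mixture analysis, run on simulated data; it is not the study's model or data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Simulated reward trajectories over 4 waves: high / increasing / reducing.
high = rng.normal([4.0, 4.0, 4.0, 4.0], 0.5, size=(770, 4))
incr = rng.normal([2.0, 2.7, 3.4, 4.0], 0.5, size=(170, 4))
decr = rng.normal([4.0, 3.3, 2.6, 2.0], 0.5, size=(70, 4))
X = np.vstack([high, incr, decr])

# Pick the number of latent classes by BIC, as mixture modeling commonly does.
bics = {k: GaussianMixture(k, n_init=5, random_state=0).fit(X).bic(X)
        for k in range(1, 6)}
best = min(bics, key=bics.get)
print(bics, "-> chosen k =", best)   # a 3-class solution should win here
```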
Abstract:
Triggered by recent flood catastrophes and increasing concerns about climate change, scientists and policy makers increasingly call for long-term water policies that enable a transformation towards flood resilience. A key question is how to make these long-term policies adaptive so that they can deal with uncertainties and changing circumstances. The paper proposes three conditions for making long-term water policies adaptive, which are then used to evaluate a new Dutch water policy approach called 'Adaptive Delta Management'. Analysing this national policy approach and its translation to the Rotterdam region reveals that Dutch policymakers are torn between adaptability and the urge to control. Reflecting on this dilemma, the paper suggests a stronger focus on monitoring and learning to strengthen the adaptability of long-term water policies. Moreover, increasing the adaptive capacity of society also requires stronger engagement with local stakeholders, including citizens and businesses.
Abstract:
Phytoplankton are crucial to marine ecosystem functioning and are important indicators of environmental change. Phytoplankton data are also essential for informing management and policy, particularly in supporting the new generation of marine legislative drivers, which take a holistic ecosystem approach to management. The Marine Strategy Framework Directive (MSFD) seeks to achieve Good Environmental Status (GES) of European seas through the implementation of such a management approach. This is a regional-scale directive which recognises the importance of plankton communities in marine ecosystems; plankton data at the appropriate spatial, temporal and taxonomic scales are therefore required for implementation. The Continuous Plankton Recorder (CPR) survey is a multidecadal, North Atlantic basin-scale programme which routinely records approximately 300 phytoplankton taxa. Because of these attributes, the survey plays a key role in the implementation of the MSFD and the assessment of GES in the Northeast Atlantic region. This paper addresses the role of the CPR's phytoplankton time series in delivering GES through the development and informing of MSFD indicators, the setting of targets against a background of climate change, and the provision of supporting information used to interpret change in non-plankton indicators. We also discuss CPR data in the context of other phytoplankton data types that may contribute to GES, and explore future possibilities for new and innovative applications of CPR phytoplankton datasets in delivering GES. Efforts must be made to preserve long-term time series, such as the CPR, which supply vital ecological information used to inform evidence-based environmental policy.