51 results for Technical Specification
Abstract:
A simple procedure was developed for packing PicoFrit HPLC columns with chromatographic stationary phase using a reservoir fabricated from standard laboratory HPLC fittings. Packed columns were mounted onto a stainless steel ultra-low-volume precolumn filter assembly containing a 0.5 µm pore-size steel frit. This format provided a conduit for the application of the nanospray voltage and protected the column from obstruction by sample material. The system was characterised and its operational performance assessed by analysis of a range of peptide standards (n = 9).
Abstract:
This Atlas presents statistical analyses of the simulations submitted to the Aqua-Planet Experiment (APE) data archive. The simulations are from global Atmospheric General Circulation Models (AGCMs) applied to a water-covered earth. The AGCMs include ones actively used or being developed for numerical weather prediction or climate research. Some are mature application models and others are more novel and thus less well tested in Earth-like applications. The experiment applies AGCMs with their complete parameterization packages to an idealization of the planet Earth with a greatly simplified lower boundary that consists of an ocean only. It has no land and its associated orography, and no sea ice. The ocean is represented by Sea Surface Temperatures (SSTs), which are specified everywhere with simple, idealized distributions. Thus, in the hierarchy of tests available for AGCMs, APE falls between tests with simplified forcings, such as those proposed by Held and Suarez (1994) and Boer and Denis (1997), and the Earth-like simulations of the Atmospheric Model Intercomparison Project (AMIP; Gates et al., 1999). Blackburn and Hoskins (2013) summarize the APE and its aims. They discuss where the APE fits within a modeling hierarchy which has evolved to evaluate complete models and which provides a link between realistic simulation and conceptual models of atmospheric phenomena. The APE bridges a gap in the existing hierarchy. The goals of APE are to provide a benchmark of current model behaviors and to stimulate research to understand the causes of inter-model differences. APE is sponsored by the World Meteorological Organization (WMO) joint Commission on Atmospheric Science (CAS), World Climate Research Program (WCRP) Working Group on Numerical Experimentation (WGNE). Chapter 2 of this Atlas provides an overview of the specification of the eight APE experiments and of the data collected. Chapter 3 lists the participating models and includes brief descriptions of each.
Chapters 4 through 7 present a wide variety of statistics from the 14 participating models for the eight different experiments. Additional intercomparison figures created by Dr. Yukiko Yamada in the AGU group are available at http://www.gfd-dennou.org/library/ape/comparison/. This Atlas is intended to present and compare the statistics of the APE simulations but does not contain a discussion of interpretive analyses. Such analyses are left for journal papers such as those included in the Special Issue of the Journal of the Meteorological Society of Japan (2013, Vol. 91A) devoted to the APE. Two papers in that collection provide an overview of the simulations. One (Blackburn et al., 2013) concentrates on the CONTROL simulation and the other (Williamson et al., 2013) on the response to changes in the meridional SST profile. Additional papers provide more detailed analysis of the basic simulations, while others describe various sensitivities and applications. The APE experiment database holds a wealth of data that is now publicly available from the APE web site: http://climate.ncas.ac.uk/ape/. We hope that this Atlas will stimulate future analyses and investigations to understand the large variation seen in the model behaviors.
Abstract:
We describe ncWMS, an implementation of the Open Geospatial Consortium’s Web Map Service (WMS) specification for multidimensional gridded environmental data. ncWMS can read data in a large number of common scientific data formats – notably the NetCDF format with the Climate and Forecast conventions – then efficiently generate map imagery in thousands of different coordinate reference systems. It is designed to require minimal configuration from the system administrator and, when used in conjunction with a suitable client tool, provides end users with an interactive means for visualizing data without the need to download large files or interpret complex metadata. It is also used as a “bridging” tool providing interoperability between the environmental science community and users of geographic information systems. ncWMS implements a number of extensions to the WMS standard in order to fulfil some common scientific requirements, including the ability to generate plots representing timeseries and vertical sections. We discuss these extensions and their impact upon present and future interoperability. We discuss the conceptual mapping between the WMS data model and the data models used by gridded data formats, highlighting areas in which the mapping is incomplete or ambiguous. We discuss the architecture of the system and particular technical innovations of note, including the algorithms used for fast data reading and image generation. ncWMS has been widely adopted within the environmental data community and we discuss some of the ways in which the software is integrated within data infrastructures and portals.
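As a rough illustration of how a client might request imagery from a WMS server such as ncWMS, the sketch below builds a standard WMS 1.3.0 GetMap URL. The endpoint and layer name are hypothetical placeholders; the query parameters themselves are standard GetMap fields, with TIME and ELEVATION being the dimension parameters commonly used for multidimensional gridded data.

```python
from urllib.parse import urlencode

# Hypothetical ncWMS endpoint and layer id (assumptions for illustration).
base = "https://example.org/ncWMS/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "ocean/sea_surface_temperature",  # assumed layer id
    "CRS": "EPSG:4326",
    "BBOX": "-90,-180,90,180",  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2010-01-01T00:00:00Z",  # time dimension of the gridded dataset
    "ELEVATION": "0",                # vertical level
}
url = base + "?" + urlencode(params)
print(url)
```

A client tool would fetch this URL and receive a rendered PNG map of the requested layer, time and level.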
Abstract:
The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications to make the Cumulative Distribution Function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems are discussed, and a novel way to solve the problem is outlined by combining extreme value analysis and non-parametric regression methods. The method is illustrated by examples of hydrological stream-flow forecasts.
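The core of the NQT can be sketched in a few lines: each value is replaced by the standard-normal quantile of its empirical non-exceedance probability. This is a minimal illustration (using Weibull plotting positions and ignoring ties), not the paper's proposed small-sample extension.

```python
from statistics import NormalDist

def nqt(sample):
    """Normal Quantile Transform: map each value to the standard-normal
    quantile of its empirical non-exceedance probability r/(n+1)."""
    n = len(sample)
    # 1-based ranks via the sorting order of the sample
    order = sorted(range(n), key=lambda i: sample[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    nd = NormalDist()
    return [nd.inv_cdf(r / (n + 1)) for r in ranks]

q = [1.2, 5.0, 0.7, 3.3, 2.1, 8.9, 0.4]  # skewed "discharge" sample
z = nqt(q)  # approximately standard-normal scores; the median maps to 0
```

The transformed scores preserve the ranking of the original data, which is why the inverse transform (mapping Gaussian forecasts back to discharge space) breaks down beyond the range of the observed sample, the small-sample problem the paper addresses.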
Abstract:
Fluorescence is a troublesome side effect in laboratory Raman studies on sulfuric acid solutions and aerosol particles. We performed experiments showing that organic matter induces fluorescence in H2SO4/H2O solutions. The intensity of the fluorescence signal appears to be almost independent of the concentration of the organic substances, but depends strongly on the sulfuric acid concentration. The ubiquity of organic substances in the atmosphere, their relatively high abundance, and the insensitivity of the fluorescence with respect to their concentrations will render most acidic natural aerosols subject to absorption and fluorescence, possibly influencing climate forcing. We show that, while fluorescence may in the future become a valuable tool of aerosol diagnostics, the concurrent absorption is too small to significantly affect the atmosphere's radiative balance.
Abstract:
Many in vitro systems used to examine multipotential neural progenitor cells (NPCs) rely on mitogens including fibroblast growth factor 2 (FGF2) for their continued expansion. However, FGF2 has also been shown to alter the expression of transcription factors (TFs) that determine cell fate. Here, we report that NPCs from the embryonic telencephalon grown without FGF2 retain many of their in vivo characteristics, making them a good model for investigating molecular mechanisms involved in cell fate specification and differentiation. However, exposure of cortical NPCs to FGF2 results in a profound change in the types of neurons generated, switching them from a glutamatergic to a GABAergic phenotype. This change closely correlates with the dramatic upregulation of TFs more characteristic of ventral telencephalic NPCs. In addition, exposure of cortical NPCs to FGF2 maintains their neurogenic potential in vitro, and NPCs spontaneously undergo differentiation following FGF2 withdrawal. These results highlight the importance of TFs in determining the types of neurons generated by NPCs in vitro. In addition, they show that FGF2, as well as acting as a mitogen, changes the developmental capabilities of NPCs. These findings have implications for the cell fate specification of in vitro-expanded NPCs and their ability to generate specific cell types for therapeutic applications. Disclosure of potential conflicts of interest is found at the end of this article.
Abstract:
The objective of this paper is to apply the mis-specification (M-S) encompassing perspective to the problem of choosing between linear and log-linear unit-root models. A simple M-S encompassing test, based on an auxiliary regression stemming from the conditional second moment, is proposed and its empirical size and power are investigated using Monte Carlo simulations. It is shown that by focusing on the conditional process the sampling distributions of the relevant statistics are well behaved under both the null and alternative hypotheses. The proposed M-S encompassing test is illustrated using US total disposable income quarterly data.
Abstract:
This chapter explores the distinctive qualities of the Matt Smith era of Doctor Who, focusing on how dramatic emphases are connected with emphases on visual style, and how this depends on the programme's production methods and technologies. Doctor Who was first made in the 1960s era of live, studio-based, multi-camera television with monochrome pictures. However, as technical innovations such as colour filming, stereo sound, CGI, post-production effects technology and now High Definition (HD) cameras have been routinely introduced into the programme, they have given Doctor Who's creators new ways of making visually distinctive narratives. Indeed, it has been argued that since the 1980s television drama has become increasingly like cinema in its production methods and aesthetic aims. Viewers' ability to view the programme on high-specification TV sets, and to record and repeat episodes using digital media, also encourages attention to visual style in television as much as in cinema. The chapter evaluates how these new circumstances affect what Doctor Who has become and engages with arguments that visual style has been allowed to override characterisation and story in the current Doctor Who. The chapter refers to specific episodes, and frames the analysis with reference to earlier years in Doctor Who's long history. For example, visual spectacle using green-screen and CGI can function as a set-piece (at the opening or ending of an episode) but can also work 'invisibly' to render a setting realistically. Shooting on location using HD cameras provides a rich and detailed image texture, but also highlights mistakes and especially problems of lighting. The reduction of Doctor Who's budget has led Steven Moffat's episodes to rely less on visual extravagance, connecting back both to Russell T. Davies's concern to show off the BBC's investment in the series and to British traditions of gritty and intimate social drama.
Pressures to capitalise on Doctor Who as a branded product are the final aspect of the chapter's analysis, where the role of Moffat as 'showrunner' links him to an American (not British) style of television production in which the preservation of format and brand values gives him unusual power over the look of the series.
Abstract:
Tremendous progress in plant proteomics driven by mass spectrometry (MS) techniques has been made since 2000, when few proteomics reports were published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that in 2050 the world's population will reach 9–12 billion people, demanding a 34–70% increase over today's food production (FAO, 2009). Providing food for such a demand in a sustainable and environmentally committed manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, a decrease in postharvest losses, less waste generation and food with a longer shelf life. There is also a need to look for alternatives to animal-based protein sources (i.e., plant-based ones) to be able to fulfill the increase in protein demand by 2050. Thus, plant biology has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform utilized in plant biology research for the past 10 years with the potential to expedite the process of understanding plant biology for human benefit. The increasing application of proteomics technologies in food security, analysis, and safety is emphasized in this review. However, we are aware that no single approach or technology is capable of addressing the global food issues. Proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.
Abstract:
A number of recent papers have employed the BDS test as a general test for mis-specification for linear and nonlinear models. We show that for a particular class of conditionally heteroscedastic models, the BDS test is unable to detect a common mis-specification. Our results also demonstrate that specific rather than portmanteau diagnostics are required to detect neglected asymmetry in volatility. However for both classes of tests reasonable power is only obtained using very large sample sizes.
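The building block of the BDS test is the correlation integral: the fraction of pairs of m-histories of a series that lie within a distance eps of each other. Under the iid null, C_m(eps) is close to C_1(eps)^m, and the BDS statistic measures the (normalized) departure from that relation. The sketch below computes only the correlation integral, not the full statistic with its asymptotic variance, and the sample data are illustrative.

```python
import random

def correlation_integral(x, m, eps):
    """Fraction of pairs of m-histories of x within eps in the sup-norm."""
    hist = [x[i:i + m] for i in range(len(x) - m + 1)]
    n = len(hist)
    close = 0
    for i in range(n):
        for j in range(i + 1, n):
            if max(abs(a - b) for a, b in zip(hist[i], hist[j])) < eps:
                close += 1
    return 2 * close / (n * (n - 1))

# For an iid sample, C_2(eps) should be close to C_1(eps)**2
random.seed(1)
x = [random.random() for _ in range(200)]
c1 = correlation_integral(x, 1, 0.2)
c2 = correlation_integral(x, 2, 0.2)
```

A large standardized gap between c2 and c1**2 is what the BDS test interprets as evidence of neglected dependence; the paper's point is that for certain conditionally heteroscedastic models this gap stays small even when the model is mis-specified.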
Abstract:
This note describes a simple procedure for removing unphysical temporal discontinuities in ERA-Interim upper stratospheric global mean temperatures in March 1985 and August 1998 that have arisen due to changes in satellite radiance data used in the assimilation. The derived temperature adjustments (offsets) are suitable for use in stratosphere-resolving chemistry-climate models that are nudged (relaxed) to ERA-Interim winds and temperatures. Simulations using a nudged version of the Canadian Middle Atmosphere Model (CMAM) show that the inclusion of the temperature adjustments produces temperature time series that are devoid of the large jumps in 1985 and 1998. Due to its strong temperature dependence, the simulated upper stratospheric ozone is also shown to vary smoothly in time, unlike in a nudged simulation without the adjustments where abrupt changes in ozone occur at the times of the temperature jumps. While the adjustments to the ERA-Interim temperatures remove significant artefacts in the nudged CMAM simulation, spurious transient effects that arise due to water vapour and persist for about 5 yr after the 1979 switch to ERA-Interim data are identified, underlining the need for caution when analysing trends in runs nudged to reanalyses.
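The basic idea of an offset adjustment at a known breakpoint can be sketched as follows: estimate the jump as the difference of short-window means on either side of the discontinuity, then subtract it from the later segment. The window length and the synthetic series are illustrative assumptions, not the procedure actually used to derive the ERA-Interim adjustments.

```python
def adjust_discontinuity(series, break_idx, window=12):
    """Remove a step change at break_idx by subtracting the difference of
    mean values over `window` points on either side of the breakpoint."""
    before = series[max(0, break_idx - window):break_idx]
    after = series[break_idx:break_idx + window]
    offset = sum(after) / len(after) - sum(before) / len(before)
    return series[:break_idx] + [t - offset for t in series[break_idx:]]

# Synthetic global-mean temperature series with an artificial +2 K jump
temps = [250.0] * 24 + [252.0] * 24
fixed = adjust_discontinuity(temps, break_idx=24)
```

In a nudged model, applying such offsets to the driving temperatures is what keeps temperature-sensitive fields like upper-stratospheric ozone free of spurious jumps at the breakpoint dates.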
Abstract:
Purpose – The purpose of this paper is to demonstrate analytically how entrepreneurial action as learning relating to diversifying into technical clothing – i.e. a high-value manufacturing sector – can take place. This is particularly relevant to recent discussion and debate in academic and policy-making circles concerning the survival of the clothing manufacture industry in developed industrialised countries.
Design/methodology/approach – Using situated learning theory (SLT) as the major analytical lens, this case study examines an episode of entrepreneurial action relating to diversification into a high-value manufacturing sector. It is considered on instrumentality grounds, revealing wider tendencies in the management of knowledge and capabilities requisite for effective entrepreneurial action of this kind.
Findings – Boundary events, brokers, boundary objects, membership structures and inclusive participation that addresses power asymmetries are found to be crucial organisational design elements, enabling the development of inter- and intra-communal capacities. These together constitute a dynamic learning capability, which underpins entrepreneurial action, such as diversification into high-value manufacturing sectors.
Originality/value – Through a refinement of SLT in the context of entrepreneurial action, the paper contributes to the advancement of a substantive theory of managing technological knowledge and capabilities for effective diversification into high-value manufacturing sectors.