971 results for Linked Data


Relevance: 30.00%

Abstract:

Lasers play an important role in medical, sensing and data-storage devices. This thesis focuses on the design, technology development, fabrication and characterization of hybrid ultraviolet Vertical-Cavity Surface-Emitting Lasers (UV VCSELs) with an organic laser-active material and inorganic distributed Bragg reflectors (DBRs). Multilayer structures with different layer thicknesses, refractive indices and absorption coefficients of the inorganic materials were studied using theoretical model calculations. During the simulations, structure parameters such as materials and thicknesses were varied, and the procedure was repeated several times during design optimization, incorporating feedback from technology development and characterization. Two types of VCSEL devices were investigated. The first is an index-coupled structure consisting of bottom and top dielectric DBR mirrors with the cavity between them; the cavity contains the active region and defines the spectral gain profile. In this configuration the maximum electric field is concentrated in the cavity and can destroy the chemical structure of the active material. The second type is a so-called complex-coupled VCSEL, in which the active material is placed not only in the cavity but also in parts of the DBR structure. The simulations show that such a distribution of the active material reduces the pumping power required to reach the lasing threshold. High efficiency is achieved by substituting the active material for the high-refractive-index dielectric in the DBR periods closest to the cavity. The inorganic materials for the DBR mirrors were deposited by Plasma-Enhanced Chemical Vapor Deposition (PECVD) and Dual Ion Beam Sputtering (DIBS) machines, and the technological processes were extensively optimized. All processes were carried out in Class 1 and Class 10,000 clean rooms. The optical properties and thicknesses of the layers were measured in situ by spectroscopic ellipsometry and spectroscopic reflectometry. The surface roughness was analyzed by atomic force microscopy (AFM), and images of the devices were taken with a scanning electron microscope (SEM). The silicon dioxide (SiO2) and silicon nitride (Si3N4) layers deposited by PECVD show structural defects and higher absorption in the ultraviolet range than layers produced by ion beam deposition (IBD); this lowers the reflectivity of the DBR mirrors and degrades the optical performance of the VCSEL devices. However, PECVD currently has the advantage over IBD that the stress in the layers can be tuned and compensated. An Ionsys 1000 sputtering machine produced by Roth & Rau is used for the deposition of silicon dioxide (SiO2), silicon nitride (Si3N4), aluminum oxide (Al2O3) and zirconium dioxide (ZrO2). The chamber is equipped with a main (sputter) ion source and an assist ion source. The dielectric materials were optimized by introducing additional oxygen and nitrogen into the chamber. DBR mirrors with different material combinations were deposited, and the measured optical properties of the fabricated multilayer structures show excellent agreement with the theoretical model calculations. The layers deposited by sputtering show high compressive stress. As the active region, a novel organic material with spiro-linked molecules is used.
Two such materials were evaporated using a dye evaporation machine in the clean room of the department Makromolekulare Chemie und Molekulare Materialien (mmCmm). The Spiro-Octopus-1 organic material has its emission maximum at λ = 395 nm, and Spiro-Pphenal at λ = 418 nm. Both have a high refractive index and can be combined with low-refractive-index materials such as silicon dioxide (SiO2). The sputtered materials show excellent optical quality, and the multilayer structures show high reflectivity. The bottom DBR mirrors for all VCSEL devices were deposited by the DIBS machine, whereas the top DBR mirrors were deposited either by PECVD or by a combination of PECVD and DIBS. The fabricated VCSEL structures were optically pumped by a nitrogen laser at λ = 337 nm and the emission was measured with a spectrometer. Emission from the VCSEL structures was observed at wavelengths of 392 nm and 420 nm.
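
The thesis's simulation code is not part of the abstract; the following is a minimal transfer-matrix (characteristic-matrix) sketch of the kind of DBR reflectivity calculation it describes. The refractive indices, design wavelength, substrate and layer counts are illustrative assumptions, not values from the thesis.

    import numpy as np

    def dbr_reflectivity(wavelength, pairs, n_hi=2.05, n_lo=1.46,
                         design_wl=395e-9, n_in=1.0, n_sub=1.46):
        """Normal-incidence reflectivity of a quarter-wave DBR stack,
        computed with the standard characteristic-matrix method."""
        d_hi = design_wl / (4 * n_hi)   # quarter-wave thickness, high index
        d_lo = design_wl / (4 * n_lo)   # quarter-wave thickness, low index
        M = np.eye(2, dtype=complex)
        for _ in range(pairs):          # one period = high + low index layer
            for n, d in ((n_hi, d_hi), (n_lo, d_lo)):
                delta = 2 * np.pi * n * d / wavelength
                M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                                  [1j * n * np.sin(delta), np.cos(delta)]])
        B, C = M @ np.array([1.0, n_sub])   # stack terminated by the substrate
        r = (n_in * B - C) / (n_in * B + C)
        return abs(r) ** 2

    # Peak reflectivity at the design wavelength grows with the period count:
    for pairs in (5, 10, 15):
        print(pairs, round(dbr_reflectivity(395e-9, pairs), 4))

Adding periods drives the peak reflectivity towards unity, which is the gain the design optimization must balance against absorption in the layers.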

Relevance: 30.00%

Abstract:

Linked Open Data: a platform for modern science, engineering, education and business. In a more recent talk, given during the Natural History Museum of London's Informatics Horizons event, Sir Nigel Shadbolt speaks about "The Value of Openness - The Open Data Institute and Publicly Funded Open Data".

Relevance: 30.00%

Abstract:

The present study compared the structures of social representations obtained through two data collection procedures: internet forms published on the web, and conventional paper-and-pencil questionnaires. Overall, 893 individuals participated in the research, 58% of whom were female. A total of 217 questionnaires about the social representation of football (soccer) and 218 about the representation of aging were answered by Brazilian university students in classrooms. Electronic versions of the same instrument were diffused through an internet forum linked to the same university, yielding 238 answers for the football questionnaire and 230 for the aging one. The instrument asked participants to indicate five words or expressions related to one of the social objects. Sample characterization and structural analyses were carried out separately for the two data collection procedures. The data indicated that internet-based research allows for greater sample diversity, but it is essential to adopt measures that select only the desired participants. The results also pointed to the need to take the nature of the social object into account in internet research on representations, so as to avoid self-selection effects that can bias results, as seems to have happened with the football social object.
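
The abstract does not specify the structural analyses; in social representations research these are commonly prototypical (four-quadrant) analyses of the evoked words. The sketch below illustrates that technique under simple median-split cut-offs; the example words and thresholds are purely illustrative assumptions.

    from collections import defaultdict
    from statistics import mean, median

    def prototypical_analysis(responses):
        """Four-quadrant (Verges-style) analysis of free-evocation data:
        words that are both frequent and evoked early form the candidate
        central core of the representation."""
        ranks = defaultdict(list)
        for words in responses:              # up to 5 words per participant
            for rank, word in enumerate(words, start=1):
                ranks[word].append(rank)
        freq = {w: len(r) for w, r in ranks.items()}
        avg_rank = {w: mean(r) for w, r in ranks.items()}
        f_cut = median(freq.values())        # simple median splits; studies
        r_cut = median(avg_rank.values())    # often use chosen cut-offs
        quadrants = defaultdict(list)
        for w in ranks:
            zone = ("central core" if freq[w] >= f_cut and avg_rank[w] <= r_cut
                    else "first periphery" if freq[w] >= f_cut
                    else "contrast zone" if avg_rank[w] <= r_cut
                    else "second periphery")
            quadrants[zone].append(w)
        return quadrants

    print(prototypical_analysis([["goal", "ball", "fans"],
                                 ["ball", "team", "goal"],
                                 ["fans", "goal", "money"]]))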

Relevance: 30.00%

Abstract:

As an immunogen of the coronavirus, the nucleoprotein (N) is a potential antigen for the serological monitoring of infectious bronchitis virus (IBV). In this report, recombinant N protein from the Beaudette strain of IBV was produced and purified from Escherichia coli as well as Sf9 (insect) cells, and used for the coating of enzyme-linked immunosorbent assay (ELISA) plates. The N protein produced in Sf9 cells was phosphorylated, whereas N protein from E. coli was not. Our data indicated that N protein purified from E. coli was more sensitive to anti-IBV serum than the protein from Sf9 cells. The recombinant N protein did not react with antisera to other avian pathogens, implying that it was specific in the recognition of IBV antibodies. In addition, data from the detection of field samples and IBV strains indicated that using the recombinant protein as coating antigen achieves performance equivalent to an ELISA kit based on infected-material extracts as a source of antigen(s). ELISAs based on recombinant proteins are safe (no live virus), clean (only virus antigens are present), specific (single proteins can be used) and rapid (able to respond to new viral strains and to strains that cannot easily be cultured).

Relevance: 30.00%

Abstract:

A wireless sensor network (WSN) is a group of sensors linked by a wireless medium to perform distributed sensing tasks. WSNs have attracted wide interest from academia and industry alike due to their diversity of applications in buildings, including home automation, smart environments, and emergency services. The primary goal of a WSN is to collect data sensed by its sensors. These data are characteristically noisy and exhibit temporal and spatial correlation, so extracting useful information from them, as this paper demonstrates, requires a range of analysis techniques. Data mining is a process in which a wide spectrum of data analysis methods is used; it is applied in this paper to analyse data collected from WSNs monitoring an indoor environment in a building. A case study demonstrates how data mining can be used to optimise the use of office space in a building.
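
The abstract does not say which data mining methods were applied; as one plausible illustration, the sketch below clusters daily occupancy profiles derived from motion sensors to flag under-used rooms. The synthetic data, cluster count and interpretation are assumptions for the example only.

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical input: one row per room per day, 24 hourly occupancy
    # counts derived from (noisy) motion-sensor readings.
    rng = np.random.default_rng(0)
    busy = rng.poisson(lam=[0]*8 + [5]*9 + [1]*7, size=(40, 24))   # daytime-heavy
    quiet = rng.poisson(lam=[0]*8 + [1]*9 + [0]*7, size=(40, 24))  # rarely used
    profiles = np.vstack([busy, quiet]).astype(float)

    # Cluster the daily profiles; cluster centres summarise typical usage
    # patterns, and sparsely used clusters flag space that could be reallocated.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
    for k, centre in enumerate(km.cluster_centers_):
        print(f"cluster {k}: mean daily occupancy {centre.sum():.1f} sensor events")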

Relevance: 30.00%

Abstract:

Variational data assimilation systems for numerical weather prediction rely on a transformation of model variables to a set of control variables that are assumed to be uncorrelated. Most implementations of this transformation are based on the assumption that the balanced part of the flow can be represented by the vorticity. However, this assumption is likely to break down in dynamical regimes characterized by low Burger number. It has recently been proposed that a variable transformation based on potential vorticity should lead to control variables that are uncorrelated over a wider range of regimes. In this paper we test the assumption that a transform based on vorticity and one based on potential vorticity produce an uncorrelated set of control variables. Using a shallow-water model we calculate the correlations between the transformed variables in the two methods. We show that the control variables resulting from a vorticity-based transformation may retain large correlations in some dynamical regimes, whereas a potential-vorticity-based transformation successfully produces a set of uncorrelated control variables. Calculations of spatial correlations show that the benefit of the potential vorticity transformation is linked to its ability to capture more accurately the balanced component of the flow.
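
For reference, the potential vorticity underlying the proposed transform is the standard shallow-water quantity; the definitions below are general background consistent with the shallow-water setting, not equations quoted from the paper.

    % Shallow-water potential vorticity: \zeta = relative vorticity,
    % f = Coriolis parameter, h = fluid depth, H = resting depth.
    q = \frac{\zeta + f}{h},
    \qquad
    q' \approx \frac{\zeta'}{H} - \frac{f\,h'}{H^{2}}
    \quad \text{(linearised about a state of rest)}

In a PV-based transform, the linearised increment q' plays the role of the balanced control variable, in place of the vorticity increment ζ' used in the conventional transform.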

Relevance: 30.00%

Abstract:

Widespread reports of low pollination rates suggest a recent anthropogenic decline in pollination that could threaten natural and agricultural ecosystems. Nevertheless, unequivocal evidence for a decline in pollination over time has remained elusive because it has not been possible to determine historical pollination rates. Here we demonstrate a widely applicable method for reconstructing historical pollination rates, allowing comparison with contemporary rates from the same sites. We focused on the relationship between the oil-collecting bee Rediviva peringueyi (Melittidae) and the guild of oil-secreting orchid species (Coryciinae) that depends on it for pollination. The guild is distributed across the highly transformed and fragmented lowlands of the Cape Region of South Africa. We show that rehydrated herbarium specimens of Pterygodium catholicum, the most abundant member of the guild, contain a record of past pollinator activity in the form of pollinarium removal rates. Analysis of a pollination time series showed a recent decline in pollination on Signal Hill, a small urban conservation area. The same herbaria contain historical species occurrence data, which we analyzed to reveal a contemporaneous shift in orchid guild composition in urban areas due to the local extirpation of the non-clonal species, consistent with their greater dependence on seeds and pollination for population persistence.

Relevance: 30.00%

Abstract:

The rapid expansion of the TMT sector in the late 1990s, and the more recent growing regulatory and corporate focus on business continuity and security, have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets, and limited trading and their heterogeneity also cause higher levels of appraisal uncertainty. In practice, applying conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or to estimate analytically. This paper proposes an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulty of estimating exit values, an alternative is to analyse the expected cash flows of the data centre over the life cycle of the building, with corporate bond yields used as a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets with identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
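
The paper's actual inputs are not given in the abstract; the sketch below only illustrates the matched-cash-flow idea behind 'the law of one price', with entirely illustrative rents, terms and bond yields.

    # Law-of-one-price sketch: value a data-centre lease income stream by
    # discounting each year's cash flow at the yield of a bond whose cash
    # flows it is assumed to mimic. All numbers are illustrative.

    lease_cash_flows = [1.2e6] * 10          # fixed rent, 10 remaining years
    fixed_bond_yield = 0.055                 # proxy rate for fixed lease income

    def present_value(cash_flows, rate):
        """Discount a list of annual cash flows (years 1, 2, ...) at a flat rate."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

    pv_lease = present_value(lease_cash_flows, fixed_bond_yield)
    print(f"PV of fixed lease income: {pv_lease:,.0f}")

    # An index-linked rent review could instead be matched to index-linked
    # bond yields plus an assumed growth path:
    indexed_flows = [1.2e6 * 1.03 ** (t - 1) for t in range(1, 11)]
    print(f"PV with 3% indexation at a 2.5% real proxy yield: "
          f"{present_value(indexed_flows, 0.025):,.0f}")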

Relevance: 30.00%

Abstract:

This paper analyses the appraisal of a specialized form of real estate - data centres - which have a unique blend of locational, physical and technological characteristics differentiating them from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals raise levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow, an approach to appraisal is proposed that uses pricing signals from traded cash flows similar to those generated by data centres. Based upon ‘the law of one price’, it is assumed that two assets expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flows of the asset should be analysed over the life cycle of the building, with corporate bond yields used as a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds.

Relevance: 30.00%

Abstract:

For thousands of years, humans have inhabited locations that are highly vulnerable to the impacts of climate change, earthquakes, and floods. In order to investigate the extent to which Holocene environmental changes may have impacted cultural evolution, we present new geologic, geomorphic, and chronologic data from the Qazvin Plain in northwest Iran that provide a backdrop of natural environmental changes for the simultaneous cultural dynamics observed on the Central Iranian Plateau. Well-resolved archaeological data from the neighbouring settlements of Zagheh (7170–6300 yr BP), Ghabristan (6215–4950 yr BP) and Sagzabad (4050–2350 yr BP) indicate that Holocene occupation of the Hajiarab alluvial fan was interrupted by a 900-year settlement hiatus. Multiproxy climate data from nearby lakes in northwest Iran suggest a transition from arid early-Holocene conditions to more humid middle-Holocene conditions from c. 7550 to 6750 yr BP, coinciding with the settlement of Zagheh, and a peak in aridity at c. 4550 yr BP during the settlement hiatus. Palaeoseismic investigations indicate that large active fault systems in close proximity to the tell sites incurred a series of large (Mw ~7.1) earthquakes with return periods of ~500–1000 years during human occupation of the tells. Mapping and optically stimulated luminescence (OSL) chronology of the alluvial sequences reveal changes in depositional style from coarse-grained unconfined sheet-flow deposits to proximal channel flow and distally prograding alluvial deposits sometime after c. 8830 yr BP, possibly reflecting an increase in moisture following the early-Holocene arid phase. The coincidence of major climate changes, earthquake activity, and varying sedimentation styles with changing patterns of human occupation on the Hajiarab fan indicates links between environmental and anthropogenic systems. However, temporal coincidence does not in itself establish a causal dependency.

Relevance: 30.00%

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent both with measured carbon fluxes and states and with a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover or to the temperature sensitivity of heterotrophic respiration, were best constrained and characterised; poorly estimated parameters were those related to the allocation to, and turnover of, the fine root and wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from the synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data, and the estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available, and confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots), would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
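
The REFLEX algorithms themselves are not reproduced in the abstract; the sketch below shows the random-walk Metropolis idea on a toy one-parameter flux model fitted to synthetic observations. The model form, noise level, flat prior and step size are all illustrative assumptions.

    import numpy as np

    # Toy model: daily NEE as a single-parameter linear function of temperature.
    rng = np.random.default_rng(1)
    temp = rng.uniform(5, 25, size=200)
    true_beta = 0.08
    obs = -true_beta * temp + rng.normal(0, 0.3, size=200)   # synthetic NEE + noise

    def log_post(beta, sigma=0.3):
        """Gaussian likelihood with a flat prior on beta > 0."""
        if beta <= 0:
            return -np.inf
        resid = obs - (-beta * temp)
        return -0.5 * np.sum((resid / sigma) ** 2)

    beta, lp = 0.5, log_post(0.5)
    chain = []
    for _ in range(5000):                       # random-walk Metropolis
        prop = beta + rng.normal(0, 0.02)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        chain.append(beta)

    burned = np.array(chain[1000:])             # discard burn-in
    print(f"posterior mean {burned.mean():.3f}, 90% CI "
          f"({np.quantile(burned, 0.05):.3f}, {np.quantile(burned, 0.95):.3f})")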

Relevance: 30.00%

Abstract:

Data assimilation (DA) systems are evolving to meet the demands of convection-permitting models in the field of weather forecasting. On 19 April 2013 a special interest group meeting of the Royal Meteorological Society brought together UK researchers looking at different aspects of the data assimilation problem at high resolution, from theory to applications, together with researchers creating our future high-resolution observational networks. The meeting was chaired by Dr Sarah Dance of the University of Reading and Dr Cristina Charlton-Perez from the MetOffice@Reading. The purpose of the meeting was to help define the current state of high-resolution data assimilation in the UK. The workshop assembled three main types of scientists: observational network specialists, operational numerical weather prediction researchers, and those developing the fundamental mathematical theory behind data assimilation and the underlying models. These three working areas are intrinsically linked; therefore, a holistic view must be taken when discussing the potential to make advances in high-resolution data assimilation.

Relevance: 30.00%

Abstract:

Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as the woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving the distribution of individuals in the field, but it may not adequately predict the spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons' excellent memory, their ability to fly long distances, and their distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons under six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without a flocking mechanism. We used pattern-oriented modelling to determine which of the foraging strategies best reproduces observed data patterns. The data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distribution of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce all three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but could not guarantee population persistence. We conclude that, with the memory-based foraging strategy including a flocking mechanism, our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for pesticide risk assessment, for example to predict exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
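
The abstract does not give the model's decision rules; the toy sketch below shows one way a memory-based field choice with a flocking bias could be encoded. The field names, weights and decay factor are hypothetical, not taken from the IBM.

    import random

    # Toy sketch of memory-based foraging with flocking: a bird prefers
    # fields it remembers as rewarding, but is also drawn to fields where
    # a flock is already feeding.

    def choose_field(memory, occupancy, join_weight=2.0):
        """memory: field -> remembered intake; occupancy: field -> birds present."""
        scores = {f: memory.get(f, 0.1) + join_weight * (occupancy.get(f, 0) > 0)
                  for f in set(memory) | set(occupancy)}
        fields, weights = zip(*scores.items())
        return random.choices(fields, weights=weights)[0]

    def update_memory(memory, field, intake, forget=0.9):
        """Exponentially decay old memories, then record today's intake."""
        for f in memory:
            memory[f] *= forget
        memory[field] = memory.get(field, 0.0) + intake

    memory = {"clover": 1.5, "stubble": 0.4}
    occupancy = {"stubble": 12}          # a flock already feeding on stubble
    field = choose_field(memory, occupancy)
    update_memory(memory, field, intake=1.0)
    print(field, memory)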

Relevance: 30.00%

Abstract:

A framework for understanding the complexity of cancer development was established by Hanahan and Weinberg in their definition of the hallmarks of cancer. In this review, we consider the evidence that parabens can enable development in human breast epithelial cells of four of the six basic hallmarks, one of the two emerging hallmarks and one of the two enabling characteristics. Hallmark 1: parabens have been measured as present in 99% of human breast tissue samples, possess oestrogenic activity and can stimulate sustained proliferation of human breast cancer cells at concentrations measurable in the breast. Hallmark 2: parabens can inhibit the suppression of breast cancer cell growth by hydroxytamoxifen, and through binding to the oestrogen-related receptor gamma (ERRγ) may prevent its deactivation by growth inhibitors. Hallmark 3: in the 10 nM to 1 μM range, parabens give a dose-dependent evasion of apoptosis in high-risk donor breast epithelial cells. Hallmark 4: long-term exposure (>20 weeks) to parabens leads to increased migratory and invasive activity in human breast cancer cells, properties linked to the metastatic process. Emerging hallmark: methylparaben has been shown in human breast epithelial cells to increase mTOR, a key regulator of energy metabolism. Enabling characteristic: parabens can cause DNA damage at high concentrations in the short term, but more work is needed to investigate long-term low-dose exposure to mixtures. The ability of parabens to enable multiple cancer hallmarks in human breast epithelial cells provides grounds for a regulatory review of the implications of the presence of parabens in human breast tissue.

Relevance: 30.00%

Abstract:

Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems with covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality and complex structure. By exploiting the special structure of tensor covariates, tensor regression models offer a promising way to reduce a model's dimensionality to a manageable level, leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on a tensor decomposition that allows simultaneous projections of an input tensor onto more than one direction along each mode. In practice, however, multi-dimensional data are often collected under the same or very similar conditions, so the regression tasks share some common latent components while also having their own independent parameters. It is therefore beneficial to analyse the regression parameters of all the tasks in a linked way. In this paper, we propose a tensor regression model based on the Tucker decomposition, which simultaneously identifies both the common components of the parameters across all the regression tasks and the independent factors contributing to each particular task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, giving a lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
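
The paper's estimator is not spelled out in the abstract; the sketch below illustrates the linked idea in its simplest matrix-covariate form: Tucker-style factors U and V shared across tasks, with a small task-specific core per regression, fitted by alternating least squares. The sparsity regulariser is omitted for brevity, and all data, ranks and sizes are synthetic assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    p, q, r, n, tasks = 8, 6, 2, 200, 3
    U, V = rng.normal(size=(p, r)), rng.normal(size=(q, r))   # shared factors
    G = [rng.normal(size=(r, r)) for _ in range(tasks)]       # task-specific cores
    X = rng.normal(size=(tasks, n, p, q))
    y = np.stack([np.einsum('ipq,pq->i', X[t], U @ G[t] @ V.T) for t in range(tasks)])
    y += 0.01 * rng.normal(size=y.shape)

    # Alternating least squares: solve each block (cores, then U, then V)
    # exactly while holding the others fixed.
    Uh, Vh = rng.normal(size=(p, r)), rng.normal(size=(q, r))
    Gh = [np.zeros((r, r)) for _ in range(tasks)]
    for _ in range(30):
        for t in range(tasks):                   # independent part: task cores
            Z = np.einsum('pr,ipq,qs->irs', Uh, X[t], Vh).reshape(n, r * r)
            Gh[t] = np.linalg.lstsq(Z, y[t], rcond=None)[0].reshape(r, r)
        A = np.concatenate([np.einsum('ipq,qr->ipr', X[t], Vh @ Gh[t].T).reshape(n, p * r)
                            for t in range(tasks)])   # shared row factor
        Uh = np.linalg.lstsq(A, y.ravel(), rcond=None)[0].reshape(p, r)
        B = np.concatenate([np.einsum('iqp,pr->iqr', X[t].transpose(0, 2, 1),
                                      Uh @ Gh[t]).reshape(n, q * r)
                            for t in range(tasks)])   # shared column factor
        Vh = np.linalg.lstsq(B, y.ravel(), rcond=None)[0].reshape(q, r)

    pred = np.stack([np.einsum('ipq,pq->i', X[t], Uh @ Gh[t] @ Vh.T) for t in range(tasks)])
    print(f"relative fit error: {np.linalg.norm(pred - y) / np.linalg.norm(y):.4f}")

Because U and V are estimated from all tasks jointly while each core is solved per task, the parameter count drops from tasks*p*q to p*r + q*r + tasks*r*r, which is the memory saving the linked analysis is aiming at.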