950 results for High electric fields


Relevance: 30.00%

Abstract:

High-aspect-ratio polymeric micro-patterns are ubiquitous in fields ranging from sensors and actuators to optics, fluidics, and medical devices. Second-generation PDMS molds are replicated against first-generation silicon molds created by deep reactive ion etching. To ensure successful demolding, the silicon molds are coated with a thin layer of C₄F₈ plasma polymer to reduce the adhesion force. Peel force and demolding status are used to determine whether delamination is successful. A response surface method is employed to provide insight into how changes in coil power, passivation time and gas flow conditions affect plasma polymerization of C₄F₈.
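The response-surface step described above amounts to fitting a second-order polynomial in the three process factors (coil power, passivation time, gas flow) to a measured response such as peel force. A minimal sketch with ordinary least squares; all design points and peel-force values below are illustrative placeholders, not data from the study:

```python
import numpy as np

# Hypothetical design points: (coil power [W], passivation time [s], C4F8 flow [sccm]).
# Values are illustrative only, not measurements from the study.
X = np.array([
    [600,  5.0, 100], [600, 10.0, 150], [800,  5.0, 150], [800, 10.0, 100],
    [600,  5.0, 150], [600, 10.0, 100], [800,  5.0, 100], [800, 10.0, 150],
    [700,  7.5, 125], [700,  7.5, 125], [700,  7.5, 125], [700,  7.5, 125],
], dtype=float)
y = np.array([1.20, 0.90, 0.80, 0.70, 1.05, 0.85, 0.95, 0.65,
              0.88, 0.90, 0.87, 0.89])   # peel force [N], illustrative

def quadratic_design(X):
    """Second-order response-surface model matrix: intercept, linear terms,
    plus all squared and pairwise-interaction terms of the factors."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i, k)]
    return np.column_stack(cols)

A = quadratic_design(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # fitted surface coefficients
predicted = A @ beta                            # peel force predicted by the surface
```

The fitted coefficients can then be examined for curvature and interaction effects, which is how a response-surface analysis points to favorable operating regions.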

Relevance: 30.00%

Abstract:

This paper focuses on the problem of locating single-phase faults in mixed distribution electric systems, with overhead lines and underground cables, using voltage and current measurements at the sending end and a sequence model of the network. Since calculating the series impedance of underground cables is not as simple as for overhead lines, the paper proposes a methodology to estimate the zero-sequence impedance of underground cables from previous single-phase faults that occurred in the system, in which an electric arc occurred at the fault location. For this reason, the signal is first pretreated to eliminate its voltage peaks, so that the analysis can work with a signal as close to a sine wave as possible.
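Because the method relies on the sequence model of the network, the first processing step is resolving the measured sending-end phasors into symmetrical components with the Fortescue transform; a zero-sequence impedance estimate can then be formed from the zero-sequence voltage and current. A minimal sketch, with made-up per-unit phasors standing in for real fault records (the simple ratio Z0 = V0/I0 is an illustration, not the paper's full estimation procedure):

```python
import numpy as np

# Fortescue transformation: phase quantities -> (zero, positive, negative) sequence.
a = np.exp(2j * np.pi / 3)
A_inv = (1 / 3) * np.array([[1, 1,    1],
                            [1, a,    a**2],
                            [1, a**2, a]])

def sequence_components(phasors_abc):
    """Return (zero, positive, negative) sequence phasors for one a-b-c measurement."""
    return A_inv @ np.asarray(phasors_abc, dtype=complex)

# Illustrative sending-end phasors during a single-phase-to-ground fault (per unit)
V_abc = np.array([0.3 + 0.1j, -0.5 - 0.9j, -0.4 + 0.95j])
I_abc = np.array([2.0 - 1.5j, 0.1 + 0.05j, -0.1 + 0.02j])

V0, V1, V2 = sequence_components(V_abc)
I0, I1, I2 = sequence_components(I_abc)

# Crude zero-sequence impedance seen from the sending end during the fault
Z0_est = V0 / I0
```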

Relevance: 30.00%

Abstract:

We describe a remote sensing method for measuring the internal interface height field in a rotating, two-layer annulus laboratory experiment. The method is non-invasive, avoiding the possibility of an interaction between the flow and the measurement device. The height fields retrieved are accurate and highly resolved in both space and time. The technique is based on a flow visualization method developed by previous workers, and relies upon the optical rotation properties of the working liquids. The previous methods returned only qualitative interface maps, however. In the present study, a technique is developed for deriving quantitative maps by calibrating height against the colour fields registered by a camera which views the flow from above. We use a layer-wise torque balance analysis to determine the equilibrium interface height field analytically, in order to derive the calibration curves. With the current system, viewing an annulus of outer radius 125 mm and depth 250 mm from a distance of 2 m, the inferred height fields have horizontal, vertical and temporal resolutions of up to 0.2 mm, 1 mm and 0.04 s, respectively.
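The calibration step, in which the colour registered by the overhead camera is mapped back to interface height via the analytically determined equilibrium field, can be sketched as a simple interpolated lookup. The calibration pairs below are hypothetical placeholders, assuming hue increases monotonically with height:

```python
import numpy as np

# Hypothetical calibration curve: interface heights [mm] from the torque-balance
# analysis, paired with the hue the overhead camera registered at each height.
calib_height = np.array([10.0, 50.0, 100.0, 150.0, 200.0, 240.0])   # mm
calib_hue    = np.array([0.05, 0.18, 0.33,  0.47,  0.62,  0.74])    # normalized hue

def hue_to_height(hue_field):
    """Invert the calibration: interpolate observed hue back to interface height.
    Assumes a monotonic hue-height relation, as in the calibration pairs above."""
    return np.interp(hue_field, calib_hue, calib_height)

# A 2x2 patch of observed hues -> retrieved height field [mm]
heights = hue_to_height(np.array([[0.18, 0.40], [0.62, 0.05]]))
```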

Relevance: 30.00%

Abstract:

This study investigates variability in the intensity of the wintertime Siberian high (SH) by defining a robust SH index (SHI) and correlating it with selected meteorological fields and teleconnection indices. A dramatic trend of −2.5 hPa decade⁻¹ has been found in the SHI between 1978 and 2001, with unprecedentedly (since 1871) low values of the SHI. The weakening of the SH has been confirmed by analyzing different historical gridded analyses and individual station observations of sea level pressure (SLP), and by excluding possible effects from the conversion of surface pressure to SLP. SHI correlation maps with various meteorological fields show that the impacts of the SH on circulation and temperature patterns extend far outside its source area, reaching from the Arctic to the tropical Pacific. Advection of warm air from eastern Europe has been identified as the main mechanism causing milder-than-normal conditions over the Kara and Laptev Seas in association with a strong SH. Despite the strong impact of SH variability on climatic variability across the Northern Hemisphere, correlations between the SHI and the main teleconnection indices of the Northern Hemisphere are weak. Regression analysis has shown that teleconnection indices are not able to reproduce the interannual variability and trends in the SHI. The inclusion of regional surface temperature in the regression model provides closer agreement between the original and reconstructed SHI.
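The index-and-trend calculation described above (an area-mean SLP over the Siberian-high region, then a linear trend expressed in hPa per decade) can be sketched as follows. The SLP series is synthetic, constructed only to mimic the reported decline; the cos(latitude) weighting is the standard choice for regular latitude-longitude grids:

```python
import numpy as np

def area_weighted_mean(field, lats_deg):
    """cos(latitude)-weighted spatial mean of a (lat, lon) field on a regular grid."""
    w = np.cos(np.deg2rad(lats_deg))[:, None]
    return float((field * w).sum() / (w.sum() * field.shape[1]))

# Synthetic winter-mean SHI series [hPa], built with a -0.25 hPa/yr decline plus noise
years = np.arange(1978, 2002)
rng = np.random.default_rng(0)
shi = 1030.0 - 0.25 * (years - 1978) + rng.normal(0.0, 1.0, years.size)

slope_per_year = np.polyfit(years, shi, 1)[0]   # least-squares linear trend
trend_per_decade = 10 * slope_per_year          # compare with the reported -2.5 hPa/decade
```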

Relevance: 30.00%

Abstract:

The European Centre for Medium-Range Weather Forecasts (ECMWF) 40-year Reanalysis (ERA-40) ozone and water vapor fields for the 1990s have been compared with independent satellite data from the Halogen Occultation Experiment (HALOE) and Microwave Limb Sounder (MLS) instruments on board the Upper Atmosphere Research Satellite (UARS). In addition, ERA-40 has been compared with aircraft data from the Measurements of Ozone and Water Vapour by Airbus In-Service Aircraft (MOZAIC) program. Overall, relative to the independent observations, the upper stratosphere in ERA-40 has about 5–10% more ozone and 15–20% less water vapor. This dry bias in the reanalysis appears to be global and extends into the middle stratosphere down to 40 hPa. Most of the discrepancies and seasonal variations between ERA-40 and the independent observations occur in the upper troposphere over the tropics and the lower stratosphere over the high latitudes. ERA-40 reproduces an Antarctic ozone hole that is weaker, and of smaller vertical extent, than in the independent observations; values in the ozone maximum in the tropical stratosphere are lower in the reanalysis. ERA-40 mixing ratios of water vapor are considerably larger than those from MOZAIC, typically by 20% in the tropical upper troposphere, and the difference may exceed 60% in the lower stratosphere over high latitudes. The results imply that the Brewer-Dobson circulation in the ECMWF reanalysis system is too fast, as is also evidenced by deficiencies in the way ERA-40 reproduces the water vapor "tape recorder" signal in the tropical stratosphere. Finally, the paper examines the biases, and their temporal variation during the 1990s, in how ERA-40 compares to the independent observations. We also discuss how the evaluation results depend on the instrument used, as well as on the version of the data.

Relevance: 30.00%

Abstract:

A set of filters based on the sequence of semiconductor edges is described which offers continuity of short-wave infrared blocking. The rejection throughout the stop region is greater than 10³ for each filter, and the transmission is better than 70% through one octave with a square cutoff. The cutoff points are located at intervals of about two-thirds of an octave. Filters at 2.6 µm, 5.5 µm, and 12 µm which use a low-passing multilayer in combination with a semiconductor absorption edge are described in detail. The design of multilayers for optimum performance is discussed by analogy with the synthesis of electric circuit filters.

Relevance: 30.00%

Abstract:

Structured data represented in the form of graphs arises in several fields of science, and the growing amount of available data makes distributed graph mining techniques particularly relevant. In this paper, we present a distributed approach to the frequent subgraph mining problem to discover interesting patterns in molecular compounds. The problem is characterized by a highly irregular search tree, whereby no reliable workload prediction is available. We describe the three main aspects of the proposed distributed algorithm, namely a dynamic partitioning of the search space, a distribution process based on a peer-to-peer communication framework, and a novel receiver-initiated load-balancing algorithm. The effectiveness of the distributed method has been evaluated on the well-known National Cancer Institute's HIV-screening dataset, where the approach attains close-to-linear speedup in a network of workstations.
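The receiver-initiated idea, in which an idle worker asks a peer for part of its unexplored search tree rather than waiting to be assigned work, can be illustrated with a toy single-process simulation. The tasks here are opaque integers standing in for subgraph-search subtrees, and the random-victim policy is an assumption for the sketch; the actual algorithm runs over a peer-to-peer network:

```python
import random
from collections import deque

def simulate(num_workers=4, tasks=100, seed=7):
    """Toy receiver-initiated load balancing: idle workers 'steal' the oldest
    pending subtree from a randomly chosen peer that has work to spare."""
    rng = random.Random(seed)
    queues = [deque() for _ in range(num_workers)]
    queues[0].extend(range(tasks))        # worker 0 starts with the whole search tree
    done = [0] * num_workers
    while any(queues):
        for w in range(num_workers):
            if queues[w]:
                queues[w].pop()           # expand one subtree of our own
                done[w] += 1
            else:                         # idle: the receiver initiates a request
                victim = rng.randrange(num_workers)
                if victim != w and len(queues[victim]) > 1:
                    queues[w].append(queues[victim].popleft())
    return done

done = simulate()
```

Stealing the oldest (shallowest) subtree is the usual heuristic, since shallow subtrees of an irregular search space tend to contain the most remaining work.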

Relevance: 30.00%

Abstract:

A powerful way to test the realism of ocean general circulation models is to systematically compare observations of passive tracer concentration with model predictions. The general circulation models used in this way cannot resolve a full range of vigorous mesoscale activity (on length scales between 10 and 100 km). In the real ocean, however, this activity causes important variability in tracer fields. Thus, in order to rationally compare tracer observations with model predictions these unresolved fluctuations (the model variability error) must be estimated. We have analyzed this variability using an eddy‐resolving reduced‐gravity model in a simple midlatitude double‐gyre configuration. We find that the wave number spectrum of tracer variance is only weakly sensitive to the distribution of (large scale, slowly varying) tracer sources and sinks. This suggests that a universal passive tracer spectrum may exist in the ocean. We estimate the spectral shape using high‐resolution measurements of potential temperature on an isopycnal in the upper northeast Atlantic Ocean, finding a slope near k^-1.7 between 10 and 500 km. The typical magnitude of the variance is estimated by comparing tracer simulations using different resolutions. For CFC‐ and tritium‐type transient tracers the peak magnitude of the model variability saturation error may reach 0.20 for scales shorter than 100 km. This is of the same order as the time mean saturation itself and well over an order of magnitude greater than the instrumental uncertainty.
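The spectral-slope estimate can be illustrated in one dimension: synthesize a tracer section with a prescribed k^-1.7 variance spectrum, then recover the slope from a log-log fit over the 10–500 km band. The 1 km grid spacing and the synthetic section are assumptions of the sketch, not details from the study:

```python
import numpy as np

def spectral_slope(signal, dx, kmin, kmax):
    """Fit the log-log slope of the power spectrum between wavenumbers kmin and kmax."""
    n = signal.size
    k = np.fft.rfftfreq(n, d=dx)
    power = np.abs(np.fft.rfft(signal))**2
    sel = (k >= kmin) & (k <= kmax)
    slope, _ = np.polyfit(np.log(k[sel]), np.log(power[sel]), 1)
    return slope

# Synthesize a 1-D "tracer section" with a k^-1.7 variance spectrum: deterministic
# amplitudes (power ~ k^-1.7 means amplitude ~ k^-0.85) with random phases.
rng = np.random.default_rng(1)
n, dx = 4096, 1.0                      # 4096 points, 1 km spacing (assumed)
k = np.fft.rfftfreq(n, d=dx)
amp = np.zeros_like(k)
amp[1:] = k[1:]**(-1.7 / 2)
phases = np.exp(2j * np.pi * rng.random(k.size))
tracer = np.fft.irfft(amp * phases, n=n)

est = spectral_slope(tracer, dx, kmin=1/500, kmax=1/10)   # 10-500 km band
```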

Relevance: 30.00%

Abstract:

1. Recent changes in European agricultural policy have led to measures to reverse the loss of species-rich grasslands through the creation of new areas on ex-arable land. Ex-arable soils are often characterized by high inorganic nitrogen (N) levels, which lead to the rapid establishment of annual and fast-growing perennial species during the initial phase of habitat creation. The addition of carbon (C) to the soil has been suggested as a countermeasure to reduce plant-available N and alter competitive interactions among plant species.

2. To test the effect of C addition on habitat creation on ex-arable land, an experiment was set up on two recently abandoned fields in Switzerland and on two 6-year-old restoration sites in the UK. Carbon was added as a mixture of either sugar and sawdust or wood chips and sawdust over a period of 2 years. The effects of C addition on soil parameters and vegetation composition were assessed during the period of C additions and for 1 year thereafter.

3. Soil nitrate concentrations were reduced at all sites within weeks of the first C addition, and remained low until cessation of the C additions. The overall effect of C addition on vegetation was a reduction in above-ground biomass and cover. At the Swiss sites, the addition of sugar and sawdust led to a relative increase in legume and forb cover and to a decrease in grass cover. The soil N availability, composition of soil micro-organisms and vegetation characteristics continued to be affected after cessation of C additions.

4. Synthesis and applications. The results suggest that C addition in grassland restoration is a useful management method for reducing N availability on ex-arable land. Carbon addition alters the vegetation composition by creating gaps in the vegetation that facilitate the establishment of late-seral plant species, and it is most effective when started immediately after the abandonment of arable fields and applied over several years.

Relevance: 30.00%

Abstract:

Weeds are major constraints on crop production, yet as part of the primary producers within farming systems, they may be important components of the agroecosystem. Using published literature, the role of weeds in arable systems for other above-ground trophic levels is examined. In the UK, there is evidence that weed flora have changed over the past century, with some species declining in abundance, whereas others have increased. There is also some evidence for a decline in the size of arable weed seedbanks. Some of these changes reflect improved agricultural efficiency, changes to more winter-sown crops in arable rotations and the use of more broad-spectrum herbicide combinations. Interrogation of a database of records of phytophagous insects associated with plant species in the UK reveals that many arable weed species support a high diversity of insect species. Reductions in abundances of host plants may affect associated insects and other taxa. A number of insect groups and farmland birds have shown marked population declines over the past 30 years. Correlational studies indicate that many of these declines are associated with changes in agricultural practices. Certainly, reductions in food availability in winter and for nestling birds in spring are implicated in the declines of several bird species, notably the grey partridge, Perdix perdix. Thus weeds have a role within agroecosystems in supporting biodiversity more generally. An understanding of weed competitiveness and the importance of weeds for insects and birds may allow the identification of the most important weed species. This may form the first step in balancing the needs for weed control with the requirements for biodiversity and more sustainable production methods.

Relevance: 30.00%

Abstract:

It has become evident that the mystery of life will not be deciphered just by decoding its blueprint, the genetic code. In the life and biomedical sciences, research efforts are now shifting from pure gene analysis to the analysis of all biomolecules involved in the machinery of life. One area of these postgenomic research fields is proteomics. Although proteomics, which basically encompasses the analysis of proteins, is not a new concept, it is far from being a research field that can rely on routine and large-scale analyses. At the time the term proteomics was coined, a gold-rush mentality was created, promising vast and quick riches (i.e., solutions to the immensely complex questions of life and disease). Predictably, the reality has been quite different. The complexity of proteomes and the wide variations in the abundances and chemical properties of their constituents have rendered the use of systematic analytical approaches only partially successful, and biologically meaningful results have been slow to arrive. However, to learn more about how cells and, hence, life works, it is essential to understand the proteins and their complex interactions in their native environment. This is why proteomics will be an important part of the biomedical sciences for the foreseeable future. Therefore, any advances in providing the tools that make protein analysis a more routine and large-scale business, ideally using automated and rapid analytical procedures, are highly sought after. This review will provide some basics, thoughts and ideas on the exploitation of matrix-assisted laser desorption/ionization in biological mass spectrometry - one of the most commonly used analytical tools in proteomics - for high-throughput analyses.

Relevance: 30.00%

Abstract:

This paper has two aims. First, to present cases in which scientists developed a defensive system for their homeland: Blackett and the air defense of Britain in WWII, Forrester and the SAGE system for North America in the Cold War, and Archimedes’ work defending Syracuse during the Second Punic War. In each case the historical context and the individual’s other achievements are outlined, and a description of the contribution’s relationship to OR/MS is given. The second aim is to consider some of the features the cases share and examine them in terms of contemporary OR/MS methodology. Particular reference is made to a recent analysis of the field’s strengths and weaknesses. This allows both a critical appraisal of the field and a set of potential responses for strengthening it. Although a mixed set of lessons arises, the overall conclusion is that the cases are examples to build on and that OR/MS retains the ability to do high-stakes work.

Relevance: 30.00%

Abstract:

With many operational centers moving toward order 1-km-gridlength models for routine weather forecasting, this paper presents a systematic investigation of the properties of high-resolution versions of the Met Office Unified Model for short-range forecasting of convective rainfall events. The authors describe a suite of configurations of the Met Office Unified Model running with grid lengths of 12, 4, and 1 km and analyze results from these models for a number of convective cases from the summers of 2003, 2004, and 2005. The analysis includes subjective evaluation of the rainfall fields and comparisons of rainfall amounts, initiation, cell statistics, and a scale-selective verification technique. It is shown that the 4- and 1-km-gridlength models often give more realistic-looking precipitation fields because convection is represented explicitly rather than parameterized. However, the 4-km model representation suffers from large convective cells and delayed initiation because the grid length is too long to correctly reproduce the convection explicitly. These problems are not as evident in the 1-km model, although it does produce too many small cells in some situations. Both the 4- and 1-km models suffer from poor representation at the start of the forecast in the period when the high-resolution detail is spinning up from the lower-resolution (12 km) starting data used. A scale-selective precipitation verification technique implies that for later times in the forecasts (after the spinup period) the 1-km model performs better than the 12- and 4-km models for lower rainfall thresholds. For higher thresholds the 4-km model scores almost as well as the 1-km model, and both do better than the 12-km model.

Relevance: 30.00%

Abstract:

The development of NWP models with grid spacing down to 1 km should produce more realistic forecasts of convective storms. However, greater realism does not necessarily mean more accurate precipitation forecasts. The rapid growth of errors on small scales, in conjunction with preexisting errors on larger scales, may limit the usefulness of such models. The purpose of this paper is to examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales, and how the skill varies with spatial scale. A verification method will be described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas. The Met Office Unified Model was run with grid spacings of 12, 4, and 1 km for 10 days in which convection occurred during the summers of 2003 and 2004. All forecasts were run from 12-km initial states for a clean comparison. The results show that the 1-km model was the most skillful over all but the smallest scales (below approximately 10–15 km). A measure of acceptable skill was defined; this was attained by the 1-km model at scales around 40–70 km, some 10–20 km less than that of the 12-km model. The biggest improvement occurred for heavier, more localized rain, despite it being more difficult to predict. The 4-km model did not improve much on the 12-km model because of the difficulties of representing convection at that resolution, which was accentuated by the spinup from 12-km fields.
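The fractional-coverage comparison described here can be sketched as a fractions-based skill score: threshold forecast and radar fields into rain/no-rain, average each over neighbourhoods of increasing size, and score the agreement of the resulting fraction fields (1 is perfect, 0 is no skill). A minimal sketch with edge-truncated windows and synthetic fields; the toy displaced-rain example shows skill recovering as the evaluation scale grows:

```python
import numpy as np

def neighbourhood_fractions(binary, n):
    """Mean of a 0/1 rain field over an n x n window centred on each grid point
    (windows are truncated at the domain edges in this simple version)."""
    r = n // 2
    ny, nx = binary.shape
    out = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            out[i, j] = binary[max(0, i - r):i + r + 1,
                               max(0, j - r):j + r + 1].mean()
    return out

def fss(forecast_rain, observed_rain, n):
    """Fractions-based skill at neighbourhood size n: 1 = perfect, 0 = no skill."""
    f = neighbourhood_fractions(forecast_rain, n)
    o = neighbourhood_fractions(observed_rain, n)
    mse = np.mean((f - o) ** 2)
    ref = np.mean(f ** 2) + np.mean(o ** 2)
    return 1.0 - mse / ref if ref > 0 else np.nan

# Toy case: forecast rain displaced two grid points relative to the "radar"
obs = np.zeros((20, 20)); obs[8:12, 8:12] = 1
fcst = np.zeros((20, 20)); fcst[8:12, 10:14] = 1
skill_small = fss(fcst, obs, n=1)   # point-by-point: penalized by displacement
skill_large = fss(fcst, obs, n=9)   # larger scale: the displacement is forgiven
```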