140 results for High-Order Accurate Scheme


Relevance:

30.00%

Publisher:

Abstract:

Data are presented for a nighttime ion heating event observed by the EISCAT radar on 16 December 1988. In the experiment, the aspect angle between the radar beam and the geomagnetic field was fixed at 54.7°, which avoids any ambiguity in derived ion temperature caused by anisotropy in the ion velocity distribution function. The data were analyzed with an algorithm which takes account of the non-Maxwellian line-of-sight ion velocity distribution. During the heating event, the derived spectral distortion parameter (D∗) indicated that the distribution function was highly distorted from a Maxwellian form when the ion drift increased to 4 km s−1. The true three-dimensional ion temperature was used in the simplified ion balance equation to compute the ion mass during the heating event. The ion composition was found to change from predominantly O+ to mainly molecular ions. A theoretical analysis of the ion composition, using the MSIS86 model and published values of the chemical rate coefficients, accounts for the order-of-magnitude increase in the molecular-to-atomic ion ratio during the event, but does not successfully explain the very high proportion of molecular ions that was observed.

Relevance:

30.00%

Publisher:

Abstract:

The techno-economic performance of a small wind turbine is very sensitive to the available wind resource. However, due to financial and practical constraints, installers rely on low-resolution wind speed databases to assess a potential site. This study investigates whether the two site assessment tools currently used in the UK, NOABL and the Energy Saving Trust wind speed estimator, are accurate enough to estimate the techno-economic performance of a small wind turbine. Both tools tend to overestimate the wind speed, with mean errors of 23% and 18% for the NOABL and Energy Saving Trust tools respectively. A techno-economic assessment of 33 small wind turbines at each site has shown that these errors can have a significant impact on the estimated load factor of an installation. Consequently, site/turbine combinations which are not economically viable can be predicted to be viable. Furthermore, both models tend to underestimate the wind resource at relatively high wind speed sites, which can lead to missed opportunities as economically viable turbine/site combinations are predicted to be non-viable. These results show that a better understanding of the local wind resource is required to make small wind turbines a viable technology in the UK.
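Because turbine power scales roughly with the cube of wind speed below rated power, even a modest wind-speed overestimate inflates the predicted yield disproportionately. The sketch below illustrates this with a Rayleigh wind-speed distribution and a generic small-turbine power curve; the turbine parameters (5 kW rated, 3 m/s cut-in, 11 m/s rated speed) and the 20% error figure are illustrative assumptions, not the machines or models assessed in the study.

```python
# Sketch: effect of a mean wind-speed overestimate on predicted energy yield,
# assuming a Rayleigh wind-speed distribution and a generic small-turbine
# power curve. Illustrative only -- not the model used in the study.
import math

def rayleigh_pdf(v, v_mean):
    """Rayleigh PDF parameterised by the mean wind speed."""
    sigma = v_mean * math.sqrt(2.0 / math.pi)
    return (v / sigma**2) * math.exp(-v**2 / (2.0 * sigma**2))

def power_kw(v, rated_kw=5.0, cut_in=3.0, rated_v=11.0, cut_out=25.0):
    """Generic power curve: cubic ramp between cut-in and rated speed."""
    if v < cut_in or v > cut_out:
        return 0.0
    if v >= rated_v:
        return rated_kw
    return rated_kw * ((v - cut_in) / (rated_v - cut_in)) ** 3

def load_factor(v_mean, dv=0.05):
    """Mean output as a fraction of rated, integrated over the wind PDF."""
    energy = sum(power_kw(v) * rayleigh_pdf(v, v_mean) * dv
                 for v in (i * dv for i in range(1, int(30 / dv))))
    return energy / 5.0

true_v = 4.5                       # assumed true annual mean wind speed (m/s)
estimated_v = true_v * 1.20        # a 20% database overestimate
lf_true, lf_est = load_factor(true_v), load_factor(estimated_v)
print(f"load factor at {true_v} m/s:      {lf_true:.3f}")
print(f"load factor at {estimated_v:.2f} m/s: {lf_est:.3f}")
print(f"yield overestimated by {100 * (lf_est / lf_true - 1):.0f}%")
```

Note how a 20% wind-speed error produces a much larger relative error in load factor, which is why marginal sites can flip between "viable" and "non-viable".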

Relevance:

30.00%

Publisher:

Abstract:

A new frontier in weather forecasting is emerging, with operational forecast models now being run at convection-permitting resolutions at many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal size of simulated convective storms and of the updrafts within them is much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid length, as does the typical storm lifetime. The 200-m grid-length simulation with the standard mixing length performs best across all diagnostics, although a greater mixing length improves the representation of deep convective storms.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Mealybugs (Hemiptera: Coccoidea: Pseudococcidae) are key vectors of badnaviruses, including Cacao Swollen Shoot Virus (CSSV), the most damaging virus affecting cacao (Theobroma cacao L.). The effectiveness of mealybugs as virus vectors is species-dependent, and it is therefore vital that CSSV resistance breeding programmes in cacao incorporate accurate mealybug identification. In this work the efficacy of a CO1-based DNA barcoding approach to species identification was evaluated by screening a range of mealybugs collected from cacao in seven countries. RESULTS: Morphologically similar adult females were characterised by scanning electron microscopy and then, following DNA extraction, were screened with CO1 barcoding markers. A high degree of CO1 sequence homology was observed for all 11 individual haplotypes, including those accessions from distinct geographical regions. This allowed the design of a High Resolution Melt (HRM) assay capable of rapid identification of the commonly encountered mealybug pests of cacao. CONCLUSIONS: HRM Analysis (HRMA) readily differentiated between mealybug pests of cacao that cannot necessarily be identified by conventional morphological analysis. This new approach therefore has potential to facilitate breeding for resistance to CSSV and other mealybug-transmitted diseases.

Relevance:

30.00%

Publisher:

Abstract:

The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. 
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
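The subsampling idea described above can be sketched as follows: draw repeated weekly (n = 52) and monthly (n = 12) random subsamples from a high-frequency series and compare the spread of the resulting statistics. The synthetic series and all its parameters below are invented for illustration; they are not the data from the four rivers studied.

```python
# Sketch of the subsampling experiment: weekly vs monthly random subsamples
# of a synthetic "hourly" series, comparing the spread of the summary
# statistics. Synthetic data -- not the rivers used in the paper.
import math
import random
import statistics

random.seed(42)

# One year of "hourly" values: seasonal cycle plus noise, loosely
# mimicking river water temperature in degrees C.
hourly = [10 + 8 * math.sin(2 * math.pi * h / 8760) + random.gauss(0, 1.5)
          for h in range(8760)]

def subsample_stat(series, n_samples, stat, trials=2000):
    """Distribution of a statistic under repeated random subsampling."""
    return [stat(random.sample(series, n_samples)) for _ in range(trials)]

def p98(xs):
    """98th percentile (the UK water-temperature standard in the paper)."""
    return sorted(xs)[int(0.98 * len(xs))]

def ci_width(dist):
    """Width of the central 95% interval of a distribution."""
    s = sorted(dist)
    return s[int(0.975 * len(s))] - s[int(0.025 * len(s))]

weekly = subsample_stat(hourly, 52, statistics.mean)
monthly = subsample_stat(hourly, 12, statistics.mean)

print(f"95% CI width of the mean, weekly sampling:  {ci_width(weekly):.2f} C")
print(f"95% CI width of the mean, monthly sampling: {ci_width(monthly):.2f} C")
print(f"98th percentile: mean monthly estimate "
      f"{statistics.mean(subsample_stat(hourly, 12, p98)):.1f} C "
      f"vs full-series {p98(hourly):.1f} C")
```

With only 12 samples, the 98th percentile estimate collapses to the sample maximum, illustrating why high percentiles are poorly assessed from low-frequency data.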

Relevance:

30.00%

Publisher:

Abstract:

Industrial robotic manipulators can be found in most factories today. Their tasks are accomplished by actively moving, placing and assembling parts. This movement is facilitated by actuators that apply a torque in response to a command signal. The presence of friction, and possibly backlash, has instigated the development of sophisticated compensation and control methods to achieve the desired performance, be that accurate motion tracking, fast movement or contact with the environment. This thesis presents a dual drive actuator design that is capable of physically linearising friction and hence eliminating the need for complex compensation algorithms. A number of mathematical models are derived that allow for the simulation of the actuator dynamics. The actuator may be constructed using geared DC motors, in which case the benefit of torque magnification is retained whilst the increased non-linear friction effects are also linearised. An additional benefit of the actuator is the high-quality, low-latency output position signal provided by differencing the two drive positions. Due to this, and the linearised nature of friction, the actuator is well suited to low-velocity, stop-start applications, micro-manipulation and even hard-contact tasks. There are, however, disadvantages to its design. When idle, the device uses power whilst many other, single drive actuators do not. The complexity of the models also means that parameterisation is difficult, and management of start-up conditions still poses a challenge.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we investigate half-duplex two-way dual-hop channel state information (CSI)-assisted amplify-and-forward (AF) relaying in the presence of high-power amplifier (HPA) nonlinearity at the relays. The expression for the end-to-end signal-to-noise ratio (SNR) is derived from the modified system model, taking into account the interference caused by the relaying scheme and the HPA nonlinearity. The performance of the considered relaying network is evaluated in terms of average symbol error probability (SEP) in Nakagami-m fading channels, making use of the moment-generating function (MGF) approach. Numerical results are provided and show the effects of several parameters, such as the quadrature amplitude modulation (QAM) order, the number of relays, the HPA parameters, and the Nakagami parameter, on performance.
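The MGF approach itself can be illustrated on the simplest case: the average bit error probability of BPSK over a single Nakagami-m link with ideal (linear) hardware, P_b = (1/pi) * integral from 0 to pi/2 of M(1/sin^2 theta) d(theta), where M(s) = (1 + s*gamma_bar/m)^(-m) is the MGF of the instantaneous SNR. The paper's end-to-end SNR and HPA-distortion terms are not reproduced here; this sketch shows only the basic machinery.

```python
# Sketch of the MGF approach to average error probability in Nakagami-m
# fading, for BPSK over a single link with ideal (linear) hardware.
# The paper's AF-relaying and HPA-nonlinearity terms sit on top of this.
import math

def mgf_nakagami(s, gamma_bar, m):
    """MGF of the instantaneous SNR for Nakagami-m fading."""
    return (1.0 + s * gamma_bar / m) ** (-m)

def avg_bep_bpsk(gamma_bar, m, n=2000):
    """Average BEP: (1/pi) * integral_0^{pi/2} M(1/sin^2 theta) d(theta)."""
    h = (math.pi / 2) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * h                    # midpoint rule
        total += mgf_nakagami(1.0 / math.sin(theta) ** 2, gamma_bar, m)
    return total * h / math.pi

for snr_db in (0, 10, 20):
    gbar = 10 ** (snr_db / 10)
    print(f"SNR {snr_db:2d} dB: BEP m=1 (Rayleigh) {avg_bep_bpsk(gbar, 1):.2e}, "
          f"m=3 {avg_bep_bpsk(gbar, 3):.2e}")
```

For m = 1 this reproduces the closed-form Rayleigh result 0.5*(1 - sqrt(gamma_bar/(1 + gamma_bar))), a useful sanity check on the numerical integration.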

Relevance:

30.00%

Publisher:

Abstract:

The efficiency of a Wireless Power Transfer (WPT) system is greatly dependent on both the geometry and operating frequency of the transmitting and receiving structures. By using Coupled Mode Theory (CMT), the figure of merit is calculated for resonantly-coupled loop and dipole systems. An in-depth analysis of the figure of merit is performed with respect to the key geometric parameters of the loops and dipoles, along with the resonant frequency, in order to identify the key relationships leading to high-efficiency WPT. For systems consisting of two identical single-turn loops, it is shown that the choice of both the loop radius and resonant frequency are essential in achieving high-efficiency WPT. For the dipole geometries studied, it is shown that the choice of length is largely irrelevant and that, as a result of their capacitive nature, low-MHz frequency dipoles are able to produce significantly higher figures of merit than those of the loops considered. The results of the figure of merit analysis are used to propose and subsequently compare two mid-range loop and dipole WPT systems of equal size and operating frequency, where it is shown that the dipole system is able to achieve higher efficiencies than the loop system over the distance range examined.
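In coupled mode theory the link is commonly characterised by the figure of merit U = kappa / sqrt(Gamma_1 * Gamma_2) (coupling rate over the geometric mean of the intrinsic decay rates), with maximum efficiency at the optimal load eta = U^2 / (1 + sqrt(1 + U^2))^2. The sketch below evaluates this standard CMT result; the geometric calculation of kappa and Gamma for the specific loops and dipoles in the paper is not reproduced here.

```python
# Sketch: coupled-mode-theory efficiency of a two-resonator WPT link as a
# function of the figure of merit U = kappa / sqrt(Gamma_1 * Gamma_2).
# Standard CMT result; the paper's geometric analysis is not reproduced.
import math

def figure_of_merit(kappa, gamma1, gamma2):
    """U = coupling rate over the geometric mean of the loss rates."""
    return kappa / math.sqrt(gamma1 * gamma2)

def optimal_efficiency(U):
    """Maximum power-transfer efficiency at the optimal load."""
    return U**2 / (1.0 + math.sqrt(1.0 + U**2)) ** 2

for U in (0.5, 1.0, 3.0, 10.0):
    print(f"U = {U:5.1f} -> eta_max = {optimal_efficiency(U):.3f}")
```

The steep rise of efficiency with U is why maximising the figure of merit, rather than any single geometric parameter, is the natural design target.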

Relevance:

30.00%

Publisher:

Abstract:

Background: Obese adults are prone to developing metabolic and cardiovascular diseases. Furthermore, overweight expectant mothers give birth to large babies who also have an increased likelihood of developing metabolic and cardiovascular diseases. Fundamental advances in understanding the pathophysiology of obesity are critical to the development of anti-obesity therapies, not only for this generation but also for future ones. Skeletal muscle plays a major role in fat metabolism and much work has focused on promoting this activity in order to control the development of obesity. Research has evaluated myostatin inhibition as a strategy to prevent the development of obesity and concluded in some cases that it offers a protective mechanism against a high-fat diet. Results: We hypothesised that myostatin inhibition should protect not only the mother but also her developing foetus from the detrimental effects of a high-fat diet. Unexpectedly, we found muscle development was attenuated in the foetus of myostatin null mice raised on a high-fat diet. We therefore re-examined the effect of the high-fat diet on adults and found myostatin null mice were more susceptible to diet-induced obesity through a mechanism involving impaired inter-organ fat utilization. Conclusions: Loss of myostatin alters fatty acid uptake and oxidation in skeletal muscle and liver. We show that the abnormally high metabolic activity of fat in myostatin null mice is decreased by a high-fat diet, resulting in excessive adipose deposition and lipotoxicity. Collectively, our genetic loss-of-function studies offer an explanation of the lean phenotype displayed by a host of animals lacking myostatin signalling. Keywords: Muscle, Obesity, High-fat diet, Metabolism, Myostatin

Relevance:

30.00%

Publisher:

Abstract:

The topography of many floodplains in the developed world has now been surveyed with high resolution sensors such as airborne LiDAR (Light Detection and Ranging), giving accurate Digital Elevation Models (DEMs) that facilitate accurate flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high resolution TanDEM-X WorldDEM. In a parallel development, increasing use is now being made of flood extents derived from high resolution Synthetic Aperture Radar (SAR) images for calibrating, validating and assimilating observations into flood inundation models in order to improve these. This paper discusses an additional use of SAR flood extents, namely to improve the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving this DEM for future flood modelling and other studies. The method is based on the fact that for larger rivers the water elevation generally changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, heights of adjacent pixels along a small section of waterline can be regarded as samples with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate estimate. While this will result in a reduction in the height errors along a waterline, the waterline is a linear feature in a two-dimensional space. 
However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must be at least no higher than the corrected heights along the higher waterline, whereas DEM heights not enclosed by the lower waterline must in general be no lower than the corrected heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the reduced errors on the corrected waterline heights. The method was tested on a section of the TanDEM-X Intermediate DEM (IDEM) covering an 11 km reach of the Warwickshire Avon, England. Flood extents from four COSMO-SkyMed images were available at various stages of a flood in November 2012, and a LiDAR DEM was available for validation. In the area covered by the flood extents, the original IDEM heights had a mean difference from the corresponding LiDAR heights of 0.5 m with a standard deviation of 2.0 m, while the corrected heights had a mean difference of 0.3 m with standard deviation 1.2 m. These figures show that significant reductions in IDEM height bias and error can be made using the method, with the corrected error being only 60% of the original. Even if only a single SAR image obtained near the peak of the flood was used, the corrected error was only 66% of the original. The method should also be capable of improving the final TanDEM-X DEM and other DEMs, and may also be of use with data from the SWOT (Surface Water and Ocean Topography) satellite.
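The waterline-averaging step can be sketched numerically: if a section of waterline is a quasi-contour, replacing each pixel height by the mean over a window of n neighbours reduces the random height error by roughly 1/sqrt(n). The heights below are synthetic, with an error standard deviation chosen near the IDEM's ~2 m; they are not the TanDEM-X data.

```python
# Sketch of the waterline-averaging step: treat a section of flood-extent
# boundary as a quasi-contour and replace each pixel height with a moving-
# window mean. Synthetic heights -- illustrative of the method only.
import math
import random
import statistics

random.seed(0)
true_level = 25.0                  # assumed water surface elevation (m)
noise_sd = 2.0                     # DEM height error (m), cf. IDEM ~2 m
waterline = [true_level + random.gauss(0, noise_sd) for _ in range(500)]

def smooth(heights, half_window=10):
    """Replace each height with the mean over its waterline section."""
    out = []
    for i in range(len(heights)):
        lo = max(0, i - half_window)
        hi = min(len(heights), i + half_window + 1)
        out.append(statistics.mean(heights[lo:hi]))
    return out

corrected = smooth(waterline)

def rmse(hs):
    """Root-mean-square error against the known water level."""
    return math.sqrt(statistics.mean((h - true_level) ** 2 for h in hs))

print(f"RMSE before correction: {rmse(waterline):.2f} m")
print(f"RMSE after correction:  {rmse(corrected):.2f} m")
```

With a 21-pixel window the random error drops by roughly a factor of four, consistent with the 1/sqrt(n) argument, though in practice the window length is limited by how far the quasi-contour assumption holds along the reach.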

Relevance:

30.00%

Publisher:

Abstract:

Biaxially oriented films produced from semi-crystalline, semi-aromatic polyesters are utilised extensively as components within various applications, including the specialist packaging, flexible electronic and photovoltaic markets. However, the thermal performance of such polyesters, specifically poly(ethylene terephthalate) (PET) and poly(ethylene-2,6-naphthalate) (PEN), is inadequate for several applications that require greater dimensional stability at higher operating temperatures. The work described in this project is therefore primarily focussed upon the copolymerisation of rigid comonomers with PET and PEN, in order to produce novel polyester-based materials that exhibit superior thermomechanical performance, with retention of crystallinity, to achieve biaxial orientation. Rigid biphenyldiimide comonomers were readily incorporated into PEN and poly(butylene-2,6-naphthalate) (PBN) via a melt-polycondensation route. For each copoly(ester-imide) series, retention of semi-crystalline behaviour is observed throughout entire copolymer composition ratios. This phenomenon may be rationalised by cocrystallisation between isomorphic biphenyldiimide and naphthalenedicarboxylate residues, which enables statistically random copolymers to melt-crystallise despite high proportions of imide sub-units being present. In terms of thermal performance, the glass transition temperature, Tg, linearly increases with imide comonomer content for both series. This facilitated the production of several high performance PEN-based biaxially oriented films, which displayed analogous drawing, barrier and optical properties to PEN. Selected PBN copoly(ester-imide)s also possess the ability to either melt-crystallise, or form a mesophase from the isotropic state depending on the applied cooling rate. 
An equivalent synthetic approach based upon isomorphic comonomer crystallisation was subsequently applied to PET by copolymerisation with rigid diimide and Kevlar®-type amide comonomers, to afford several novel high performance PET-based copoly(ester-imide)s and copoly(ester-amide)s that all exhibited increased Tgs. Retention of crystallinity was achieved in these copolymers by either melt-crystallisation or thermal annealing. The initial production of a semi-crystalline, PET-based biaxially oriented film with a Tg in excess of 100 °C was successful, and this material has obvious scope for further industrial scale-up and process development.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is an empirical-based study of the European Union’s Emissions Trading Scheme (EU ETS) and its implications in terms of corporate environmental and financial performance. The novelty of this study includes the extended scope of the data coverage, as most previous studies have examined only the power sector. The use of verified emissions data of ETS-regulated firms as the environmental compliance measure and as the potential differentiating criteria that concern the valuation of EU ETS-exposed firms in the stock market is also an original aspect of this study. The study begins in Chapter 2 by introducing the background information on the emission trading system (ETS), which focuses on (i) the adoption of ETS as an environmental management instrument and (ii) the adoption of ETS by the European Union as one of its central climate policies. Chapter 3 surveys four databases that provide carbon emissions data in order to determine the most suitable source of the data to be used in the later empirical chapters. The first empirical chapter, which is also Chapter 4 of this thesis, investigates the determinants of the emissions compliance performance of the EU ETS-exposed firms through constructing the best possible performance ratio from verified emissions data and self-configuring models for a panel regression analysis. Chapter 5 examines the impacts on the EU ETS-exposed firms in terms of their equity valuation with customised portfolios and multi-factor market models. The research design takes into account the emissions allowance (EUA) price as an additional factor, as it has the most direct association with the EU ETS to control for the exposure. The final empirical Chapter 6 takes the investigation one step further, by specifically testing the degree of ETS exposure facing different sectors with sector-based portfolios and an extended multi-factor market model. 
The findings from the emissions performance ratio analysis show that the business model of firms significantly influences emissions compliance, as the capital intensity has a positive association with the increasing emissions-to-emissions cap ratio. Furthermore, different sectors show different degrees of sensitivity towards the determining factors. The production factor influences the performance ratio of the Utilities sector, but not the Energy or Materials sectors. The results show that the capital intensity has a more profound influence on the utilities sector than on the materials sector. With regard to the financial performance impact, ETS-exposed firms as aggregate portfolios experienced a substantial underperformance during the 2001–2004 period, but not in the operating period of 2005–2011. The results of the sector-based portfolios show again the differentiating effect of the EU ETS on sectors, as one sector is priced indifferently against its benchmark, three sectors see a constant underperformance, and three sectors have altered outcomes.

Relevance:

30.00%

Publisher:

Abstract:

The performance of three urban land surface models, run in offline mode, with their default external parameters, is evaluated for two distinctly different sites in Helsinki: Torni and Kumpula. The former is a dense city centre site with 22% vegetation, while the latter is a suburban site with over 50% vegetation. At both locations the models are compared against sensible and latent heat fluxes measured using the eddy covariance technique, along with snow depth observations. The cold climate experienced by the city causes strong seasonal variations that include snow cover and stable atmospheric conditions. Most of the time the three models are able to account for the differences between the study areas as well as the seasonal and diurnal variability of the energy balance components. However, the performances are not systematic across the modelled components, season and surface type. The net all-wave radiation is well simulated, with the greatest uncertainties related to snowmelt timing, when the fraction of snow cover has a key role, particularly in determining the surface albedo. For the turbulent fluxes, more variation between the models is seen which can partly be explained by the different methods in their calculation and partly by surface parameter values. For the sensible heat flux, simulation of wintertime values was the main problem, which also leads to issues in predicting near-surface stabilities particularly at the dense city centre site. All models have the most difficulties in simulating latent heat flux. This study particularly emphasizes that improvements are needed in the parameterization of anthropogenic heat flux and thermal parameters in winter, snow cover in spring and evapotranspiration in order to improve the surface energy balance modelling in cold climate cities.

Relevance:

30.00%

Publisher:

Abstract:

A Universal Serial Bus (USB) Mass Storage Device (MSD), often termed a USB flash drive, is ubiquitously used to store important information in unencrypted binary format. This low-cost consumer device is incredibly popular due to its size, large storage capacity and relatively high transfer speed. However, if the device is lost or stolen, an unauthorized person can easily retrieve all the information. It is therefore advantageous in many applications to provide security protection so that only authorized users can access the stored information. To provide security protection for a USB MSD, this paper proposes a session key agreement protocol that runs after secure user authentication. The main aim of this protocol is to negotiate a session key under which all the information retrieved from, stored on and transferred to the USB MSD is encrypted. The proposed protocol is efficient and, unlike several protocols in the literature, resists both forgery and password-guessing attacks. The security of the protocol is analysed formally, proving that the information is stored confidentially and is protected with strong resilience to the relevant security attacks. The computational and communication costs of the proposed scheme are analysed and compared to related work, showing that the scheme offers an improved trade-off between computational cost, communication cost and security.
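Since the abstract does not reproduce the message flow, the sketch below shows only the general shape of such a scheme: a password-derived verifier plus two fresh nonces feeding a keyed hash, so that each session yields a distinct key for encrypting data on the device. All names and the message flow here are hypothetical illustrations, not the paper's protocol.

```python
# Hypothetical sketch of password-based session-key negotiation, using only
# standard primitives (PBKDF2, HMAC-SHA-256, random nonces). This is NOT the
# paper's protocol -- just an illustration of deriving a fresh per-session
# key under which data on the USB MSD could be encrypted.
import hashlib
import hmac
import os

def derive_verifier(password: bytes, salt: bytes) -> bytes:
    """Password-derived secret shared by host and device (PBKDF2)."""
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

def session_key(verifier: bytes, nonce_host: bytes, nonce_dev: bytes) -> bytes:
    """Both sides derive the same fresh key from the verifier and nonces."""
    return hmac.new(verifier, nonce_host + nonce_dev, hashlib.sha256).digest()

# Enrolment: derive and store the verifier, never the raw password.
salt = os.urandom(16)
verifier = derive_verifier(b"correct horse battery staple", salt)

# One session: each side contributes a nonce; both derive the same key.
n_host, n_dev = os.urandom(16), os.urandom(16)
k_host = session_key(verifier, n_host, n_dev)
k_dev = session_key(verifier, n_host, n_dev)
assert k_host == k_dev             # fresh 256-bit key agreed for this session
print("session key:", k_host.hex()[:16], "...")
```

Fresh nonces give each session a distinct key, so replaying old traffic is useless; resistance to offline password guessing is exactly what the paper's formal analysis has to establish and is not demonstrated by this fragment.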

Relevance:

30.00%

Publisher:

Abstract:

Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees from satellite observations to reduce data volume at the request of data users and facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. We also calculate the sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products, using related methods.
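One simple form such a parameterisation can take is the finite-population standard error of the cell mean, s * sqrt((1 - f) / n), where s is the standard deviation of the n clear-sky pixels and f is the clear-sky fraction. The sketch below applies this textbook form to synthetic pixel data; the paper's parameterisation fitted to AATSR data is not reproduced here.

```python
# Illustrative sketch of the sampling-uncertainty idea: estimate the
# uncertainty of a grid-cell mean SST from the clear-sky subsample, using
# the finite-population standard error s * sqrt((1 - f) / n).
# Synthetic pixels -- not the paper's AATSR-fitted parameterisation.
import math
import random
import statistics

random.seed(1)
N = 400                                        # pixels in a grid cell
cell = [288.0 + random.gauss(0, 0.3) for _ in range(N)]   # pixel SSTs in K

def sampling_uncertainty(clear_pixels, f):
    """Standard error of the cell mean given only the clear-sky subsample."""
    n = len(clear_pixels)
    s = statistics.stdev(clear_pixels)
    return s * math.sqrt((1.0 - f) / n)

for f in (0.9, 0.5, 0.1):
    n = int(f * N)
    clear = random.sample(cell, n)
    u = sampling_uncertainty(clear, f)
    print(f"clear fraction {f:.1f}: n = {n:3d}, sampling uncertainty ~ {u:.3f} K")
```

As the clear-sky fraction falls, the uncertainty grows both because fewer pixels are averaged and because a larger share of the cell is unobserved, which is the dependence the paper's parameterisation captures.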