Abstract:
The background error covariance matrix, B, is often used in variational data assimilation for numerical weather prediction as a static and hence poor approximation to the fully dynamic forecast error covariance matrix, Pf. In this paper the concept of an Ensemble Reduced Rank Kalman Filter (EnRRKF) is outlined. In the EnRRKF the forecast error statistics in a subspace defined by an ensemble of states forecast by the dynamic model are found. These statistics are merged in a formal way with the static statistics, which apply in the remainder of the space. The combined statistics may then be used in a variational data assimilation setting. It is hoped that the nonlinear error growth of small-scale weather systems will be accurately captured by the EnRRKF, to produce accurate analyses and ultimately improved forecasts of extreme events.
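The core of the EnRRKF idea described above, blending ensemble-derived statistics inside the ensemble-spanned subspace with the static B elsewhere, can be sketched numerically. This is a minimal illustration of one plausible projection-based blending; the function names and the exact combination rule are assumptions, not the paper's formulation.

```python
import numpy as np

def hybrid_covariance(B, ensemble):
    """Illustrative sketch: use the ensemble sample covariance inside the
    subspace spanned by the ensemble perturbations, and keep the static
    covariance B in the orthogonal complement. This projection-based
    blending is an assumption for illustration, not the EnRRKF itself."""
    X = ensemble - ensemble.mean(axis=1, keepdims=True)  # perturbations, shape (n, m)
    Pe = X @ X.T / (X.shape[1] - 1)                      # ensemble sample covariance
    Q, _ = np.linalg.qr(X)                               # orthonormal basis of the subspace
    P = Q @ Q.T                                          # projector onto ensemble subspace
    I = np.eye(B.shape[0])
    # ensemble statistics in the subspace, static statistics in the remainder
    return P @ Pe @ P + (I - P) @ B @ (I - P)
```

The returned matrix is symmetric positive semi-definite by construction, so it can serve as a forecast error covariance in a variational cost function.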
Abstract:
Biomass allocation to above- and belowground compartments in trees is thought to be affected by growth conditions. To assess the strength of such influences, we sampled six Norway spruce forest stands growing at higher altitudes. Within these stands, we randomly selected a total of 77 Norway spruce trees and measured volume and biomass of stem, above- and belowground stump and all roots over 0.5 cm diameter. A comparison of our observations with models parameterised for lower altitudes shows that models developed for specific conditions may be applicable to other locations. Using our observations, we developed biomass functions (BF) and biomass conversion and expansion factors (BCEF) linking belowground biomass to stem parameters. While both BF and BCEF are accurate in belowground biomass predictions, using BCEF appears more promising as such factors can be readily used with existing forest inventory data to obtain estimates of belowground biomass stock. As an example, we show how BF and BCEF developed for individual trees can be used to estimate belowground biomass at the stand level. In combination with existing aboveground models, our observations can be used to quantify total standing biomass of high altitude Norway spruce stands.
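The stand-level application of a BCEF mentioned above amounts to multiplying inventory stem volume by the conversion factor. A minimal sketch, with a purely illustrative factor value rather than the study's fitted one:

```python
def belowground_biomass_stand(stem_volume_m3_per_ha, bcef_t_per_m3):
    """Stand-level belowground biomass stock (t/ha) from inventory stem
    volume via a biomass conversion and expansion factor (BCEF).
    The factor value used in the example is illustrative only."""
    return stem_volume_m3_per_ha * bcef_t_per_m3

# e.g. a stand carrying 350 m3/ha of stem volume with an assumed
# BCEF of 0.15 t/m3 would hold 52.5 t/ha of belowground biomass
```

This is why BCEF is convenient: the only input is a quantity already recorded in standard forest inventories.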
Abstract:
This study was undertaken to explore gel permeation chromatography (GPC) for estimating molecular weights of proanthocyanidin fractions isolated from sainfoin (Onobrychis viciifolia). The results were compared with data obtained by thiolytic degradation of the same fractions. Polystyrene, polyethylene glycol and polymethyl methacrylate standards were not suitable for estimating the molecular weights of underivatized proanthocyanidins. Therefore, a novel HPLC-GPC method was developed based on two serially connected PolarGel-L columns using DMF that contained 5% water, 1% acetic acid and 0.15 M LiBr at 0.7 ml/min and 50 degrees C. This yielded a single calibration curve for galloyl glucoses (trigalloyl glucose, pentagalloyl glucose), ellagitannins (pedunculagin, vescalagin, punicalagin, oenothein B, gemin A), proanthocyanidins (procyanidin B2, cinnamtannin B1), and several other polyphenols (catechin, epicatechin gallate, epigallocatechin gallate, amentoflavone). These GPC-predicted molecular weights represented a considerable advance over previously reported HPLC-GPC methods for underivatized proanthocyanidins.
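A GPC calibration curve of the kind described above is conventionally a log-linear fit of molecular weight against retention time. The sketch below shows that fit; the data points in the test are placeholders, not the measured values for the polyphenol standards in the study.

```python
import numpy as np

def fit_gpc_calibration(retention_min, molecular_weight):
    """Fit the conventional log-linear GPC calibration,
    log10(MW) = a * t + b, to a set of standards.
    Returns the coefficients (a, b)."""
    return np.polyfit(retention_min, np.log10(molecular_weight), 1)

def predict_mw(coeffs, retention_min):
    """Molecular weight predicted from retention time via the calibration."""
    return 10.0 ** np.polyval(coeffs, retention_min)
```

Once fitted on standards, `predict_mw` converts an observed retention time for an unknown fraction into a molecular weight estimate.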
Abstract:
Active robot force control requires some form of dynamic inner loop control for stability. The author considers the implementation of position-based inner loop control on an industrial robot fitted with encoders only. It is shown that high gain velocity feedback for such a robot, which is effectively stationary when in contact with a stiff environment, involves problems beyond the usual caveats on the effects of unknown environment stiffness. It is shown that it is possible for the controlled joint to become chaotic at very low velocities if encoder edge timing data are used for velocity measurement. The results obtained indicate that there is a lower limit on controlled velocity when encoders are the only means of joint measurement. This lower limit to speed is determined by the desired amount of loop gain, which is itself determined by the severity of the nonlinearities present in the drive system.
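The velocity-measurement problem described above comes from estimating speed as the encoder step divided by the time between edges, with that time itself quantised by the timer clock. A minimal sketch of the mechanism, with illustrative numbers only:

```python
def edge_timing_velocity(step_rad, measured_dt_s):
    """Velocity estimate from the time between successive encoder edges,
    v = delta_theta / delta_t."""
    return step_rad / measured_dt_s

def quantised_dt(true_dt_s, timer_tick_s):
    """The measured inter-edge time is only known to the nearest timer
    tick, so at very low speed (long, coarsely measured intervals) the
    velocity estimate becomes coarse and noisy -- the effect that lets a
    high-gain velocity loop misbehave near standstill."""
    return max(timer_tick_s, round(true_dt_s / timer_tick_s) * timer_tick_s)
```

At high speed the relative quantisation error is small; as the joint slows, each timer tick represents a larger fraction of the inter-edge interval, which is why a lower limit on controlled velocity appears.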
Abstract:
In this article we describe recent progress on the design, analysis and implementation of hybrid numerical-asymptotic boundary integral methods for boundary value problems for the Helmholtz equation that model time harmonic acoustic wave scattering in domains exterior to impenetrable obstacles. These hybrid methods combine conventional piecewise polynomial approximations with high-frequency asymptotics to build basis functions suitable for representing the oscillatory solutions. They have the potential to solve scattering problems accurately in a computation time that is (almost) independent of frequency and this has been realized for many model problems. The design and analysis of this class of methods requires new results on the analysis and numerical analysis of highly oscillatory boundary integral operators and on the high-frequency asymptotics of scattering problems. The implementation requires the development of appropriate quadrature rules for highly oscillatory integrals. This article contains a historical account of the development of this currently very active field, a detailed account of recent progress and, in addition, a number of original research results on the design, analysis and implementation of these methods.
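The hybrid basis functions mentioned above take the generic form of a slowly varying polynomial multiplied by a known oscillatory factor. The sketch below illustrates that structure only; the phase function is a stand-in for the asymptotically derived phase of the scattered field, not any specific method from the article.

```python
import numpy as np

def hna_basis(x, k, poly_coeffs, phase):
    """One hybrid numerical-asymptotic basis element:
    p(x) * exp(i * k * phase(x)),
    a piecewise polynomial p modulated by a prescribed oscillatory phase.
    `phase` is a placeholder for the high-frequency asymptotics."""
    return np.polyval(poly_coeffs, x) * np.exp(1j * k * phase(x))
```

Because the oscillation at wavenumber k is carried by the exponential factor, the polynomial part need not be refined as k grows, which is the source of the (almost) frequency-independent cost.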
Abstract:
The final warming date of the polar vortex is a key component of Southern Hemisphere stratospheric and tropospheric variability in spring and summer. We examine the effect of external forcings on Southern Hemisphere final warming date, and the sensitivity of any projected changes to model representation of the stratosphere. Final warming date is calculated using a temperature-based diagnostic for ensembles of high- and low-top CMIP5 models, under the CMIP5 historical, RCP4.5, and RCP8.5 forcing scenarios. The final warming date in the models is generally too late in comparison with those from reanalyses: around two weeks too late in the low-top ensemble, and around one week too late in the high-top ensemble. Ensemble Empirical Mode Decomposition (EEMD) is used to analyse past and future change in final warming date. Both the low- and high-top ensembles show the characteristic behaviour expected in response to changes in greenhouse gas and stratospheric ozone concentrations. In both ensembles, under both scenarios, an increase in final warming date is seen between 1850 and 2100, with the latest dates occurring in the early twenty-first century, associated with the minimum in stratospheric ozone concentrations in this period. However, this response is more pronounced in the high-top ensemble. The high-top models show a delay in final warming date in RCP8.5 that is not produced by the low-top models, which are shown to be less responsive to greenhouse gas forcing. This suggests that it may be necessary to use stratosphere-resolving models to accurately predict Southern Hemisphere surface climate change.
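A final warming diagnostic of the general kind used above can be sketched as a threshold crossing on a daily stratospheric index. The paper's diagnostic is temperature-based; the wind-based version below (e.g. zonal-mean zonal wind at 60S, 50 hPa) is only an analogous illustration with assumed names and threshold.

```python
def final_warming_date(index_daily, threshold=10.0):
    """Return the first day index on which the diagnostic falls below
    `threshold` and does not recover for the rest of the series
    (i.e. the vortex breakdown is final), or None if it never does.
    Threshold and index choice are illustrative assumptions."""
    for day in range(len(index_daily)):
        if all(v < threshold for v in index_daily[day:]):
            return day
    return None
```

Requiring the index to stay below threshold distinguishes the final warming from mid-season minor warmings after which the vortex re-forms.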
Abstract:
Since the banking crisis of 2008, the global economy has been perceived as riskier than before. Firms that cannot manage risks have withdrawn from countries in which they previously invested. These problems are not new. For centuries firms have invested in risky foreign environments, and many of them have succeeded. This paper reviews the risk management strategies of foreign investors. Using archival evidence and secondary sources, it distinguishes the different types of risks that investors face and the different strategies by which risks can be managed. It investigates which strategies are used to manage which types of risk.
Abstract:
Amid a worldwide increase in tree mortality, mountain pine beetles (Dendroctonus ponderosae Hopkins) have led to the death of billions of trees from Mexico to Alaska since 2000. This is predicted to have important carbon, water and energy balance feedbacks on the Earth system. Counter to current projections, we show that on a decadal scale, tree mortality causes no increase in ecosystem respiration from scales of several square metres up to an 84 km2 valley. Rather, we found comparable declines in both gross primary productivity and respiration suggesting little change in net flux, with a transitory recovery of respiration 6–7 years after mortality associated with increased incorporation of leaf litter C into soil organic matter, followed by further decline in years 8–10. The mechanism of the impact of tree mortality caused by these biotic disturbances is consistent with reduced input rather than increased output of carbon.
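The budget argument above rests on the standard relation between net flux, productivity and respiration. A trivial sketch with illustrative numbers (not measurements from the study): if GPP and respiration decline by comparable amounts, the net flux is nearly unchanged.

```python
def net_ecosystem_exchange(reco, gpp):
    """Net carbon flux (positive = source to the atmosphere):
    NEE = Reco - GPP. Units and values in the example are illustrative."""
    return reco - gpp

# pre-mortality:  NEE = 8 - 10 = -2 (a sink)
# post-mortality, both fluxes down by the same amount: NEE = 6 - 8 = -2
```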
Abstract:
Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather day-night capability, and now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, due to its side-looking nature, substantial areas of urban ground surface may not be visible to the SAR due to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow, and estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions from an electromagnetic scattering model, for both the case of a single image containing flooding, and a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. 
Depending on the particular flooding situation, the method could lead to improved detection of flooding in urban areas.
Abstract:
Objectives To investigate whether sleep disturbances previously found to characterise high risk infants: (a) persist into childhood; (b) are influenced by early maternal settling strategies and (c) predict cognitive and emotional/behavioural functioning. Methods Mothers experiencing high and low levels of psychosocial adversity (risk) were recruited antenatally and longitudinally assessed with their children. Mothers completed measures of settling strategies and infant sleep postnatally and at 12 and 18 months of infant age. At five years, child sleep characteristics were measured via actigraphy and maternal report; IQ and child adjustment were also assessed. Results Sleep disturbances observed in high-risk infants persisted at five years. Maternal involvement in infant settling was greater in high risk mothers, and predicted less optimal sleep at five years. Poorer five year sleep was associated with concurrent child anxiety/depression and aggression, but there was limited evidence for an influence of early sleep problems. Associations between infant/child sleep characteristics and IQ were also limited. Conclusions Early maternal over-involvement in infant settling is associated with less optimal sleep in children, which in turn, is related to child adjustment. The findings highlight the importance of supporting parents in the early development of good settling practices, particularly in high-risk populations.
Abstract:
The global characteristics of tropical cyclones (TCs) simulated by several climate models are analyzed and compared with observations. The global climate models were forced by the same sea surface temperature (SST) fields in two types of experiments, using climatological SST and interannually varying SST. TC tracks and intensities are derived from each model's output fields by the group who ran that model, using their own preferred tracking scheme; the study considers the combination of model and tracking scheme as a single modeling system, and compares the properties derived from the different systems. Overall, the observed geographic distribution of global TC frequency was reasonably well reproduced. As expected, with the exception of one model, intensities of the simulated TC were lower than in observations, to a degree that varies considerably across models.
Abstract:
A new frontier in weather forecasting is emerging as operational forecast models are now being run at convection-permitting resolutions at many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal size of simulated convective storms and the updrafts within them is much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid lengths, as does the typical storm lifetime. The 200-m grid-length simulation with standard mixing length performs best over all diagnostics, although a greater mixing length improves the representation of deep convective storms.
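The order-of-magnitude mass-flux error noted above follows directly from the standard single-updraft relation M = rho * w * A: because M scales with updraft area, an updraft roughly 3x too wide in each horizontal direction carries roughly 10x too much mass flux. A sketch with illustrative values:

```python
def updraft_mass_flux(rho_kg_m3, w_m_s, area_m2):
    """Convective mass flux of a single updraft, M = rho * w * A (kg/s).
    All inputs in the example are illustrative, not DYMECS measurements."""
    return rho_kg_m3 * w_m_s * area_m2
```

For fixed density and vertical velocity, multiplying the area by 9 multiplies the mass flux by 9, which is why an overly broad simulated updraft inflates the flux so strongly.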
Abstract:
Cruciferous-rich diets have been associated with reduction in plasma LDL-cholesterol (LDL-C), which may be due to the action of isothiocyanates derived from glucosinolates that accumulate in these vegetables. This study tests the hypothesis that a diet rich in high glucoraphanin (HG) broccoli will reduce plasma LDL-C. METHODS AND RESULTS: One hundred and thirty volunteers were recruited to two independent double-blind, randomly allocated parallel dietary intervention studies, and were assigned to consume either 400 g standard broccoli or 400 g HG broccoli per week for 12 weeks. Plasma lipids were quantified before and after the intervention. In study 1 (37 volunteers), the HG broccoli diet reduced plasma LDL-C by 7.1% (95% CI: -1.8%, -12.3%, p = 0.011), whereas standard broccoli reduced LDL-C by 1.8% (95% CI: +3.9%, -7.5%, ns). In study 2 (93 volunteers), the HG broccoli diet resulted in a reduction of 5.1% (95% CI: -2.1%, -8.1%, p = 0.001), whereas standard broccoli reduced LDL-C by 2.5% (95% CI: +0.8%, -5.7%, ns). When data from the two studies were combined, the reduction in LDL-C by the HG broccoli was significantly greater than standard broccoli (p = 0.031). CONCLUSION: Evidence from two independent human studies indicates that consumption of high glucoraphanin broccoli significantly reduces plasma LDL-C.