47 results for Lower level relaxation
Abstract:
Recent studies have reported contradictory results on whether the tropical atmospheric circulation (TAC) has intensified or weakened in recent decades. Here we re-investigate recent changes in TAC derived from moisture transports into the tropics, using the high temporal and spatial resolution ERA-Interim reanalysis. We found a significant strengthening of both the lower-level inward transports and the mid-level outward transports over the past two decades. However, the signal in the total budget is weak, because the strengthening of the inflow and outflow neutralize each other, at least to some extent. Since we found atmospheric humidity to be relatively stable, we suggest that the intensification is mainly caused by an intensification of the wind-related circulation strength. The exact quantitative values were found to depend heavily on whether the calculations are based on mean or instantaneous values. We highlight the importance of using instantaneous values for transport calculations, as they represent the coincidence of high wind speeds and high atmospheric humidity.
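The last sentence is the key methodological point, and a toy calculation makes it concrete: the time-mean of a product, mean(q·v), exceeds the product of time-means by the covariance of q and v, which is exactly the eddy contribution that instantaneous sampling retains. A minimal sketch with synthetic, hypothetical numbers (not ERA-Interim data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1460                                       # e.g. one year of 6-hourly samples
v = 5.0 + 2.0 * rng.standard_normal(n)         # meridional wind [m/s]
q = 0.010 + 0.001 * (v - v.mean()) / v.std()   # humidity correlated with wind [kg/kg]
q += 0.001 * rng.standard_normal(n)            # uncorrelated humidity variability

flux_inst = np.mean(q * v)            # transport built from instantaneous products
flux_mean = np.mean(q) * np.mean(v)   # transport built from time-mean fields
print(flux_inst - flux_mean)          # = cov(q, v): the eddy part mean fields miss
```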
Abstract:
In terrestrial television transmission, multiple paths of various lengths can occur between the transmitter and the receiver. Such paths arise from reflections off objects outside the direct transmission path. The multipath signals arriving at the receiver are all detected along with the intended signal, causing time-displaced replicas called 'ghosts' to appear on the television picture. With an increasing number of people living within built-up areas, ghosting is becoming commonplace, and deghosting is therefore becoming increasingly important. This thesis uses a deterministic time-domain approach to deghosting, resulting in a simple solution to the problem of removing ghosts. A new video detector is presented which reduces the synchronous detector's local-oscillator phase error, caused by any practical size of ghost, to a lower level than has previously been achieved. With the new detector, dispersion of the video signal is minimised, and a known closed-form time-domain description of the individual ghost components within the detected video is subsequently obtained. Developed from mathematical descriptions of the detected video, a new deghoster filter structure is presented which is capable of removing both the in-phase (I) and the quadrature (Q) ghost signals induced by VSB operation. The new deghoster filter requires much less hardware than any previous deghoster capable of removing both I and Q ghost components. A new channel identification algorithm, based upon simple correlation techniques, was also developed to find the delay and complex amplitude characteristics of individual ghosts. The result of the channel identification is then passed to the new I and Q deghoster filter for ghost cancellation. Five papers have been published from the research work performed for this thesis.
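The correlation-based channel identification idea can be sketched compactly: cross-correlate the received signal with a known training waveform, then read the ghost's delay from the location of the secondary correlation peak and its complex amplitude from the peak value. A toy illustration with synthetic signals (hypothetical names and values, not the thesis's actual algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
N, delay = 256, 37
train = rng.standard_normal(N)               # known training waveform
amp = 0.3 * np.exp(1j * 0.8)                 # one ghost: complex gain, 37-sample delay

rx = np.zeros(N + 64, dtype=complex)         # received = direct path + ghost
rx[:N] += train
rx[delay:delay + N] += amp * train

# Cross-correlate and normalise so the direct path has unit gain.
corr = np.correlate(rx, train, mode="valid") / np.dot(train, train)
est_delay = int(np.argmax(np.abs(corr[1:]))) + 1   # skip the lag-0 direct path
est_amp = corr[est_delay]
print(est_delay, est_amp)                    # ~ 37 and 0.3*exp(0.8j)
```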
Abstract:
The prediction of Northern Hemisphere (NH) extratropical cyclones by nine different ensemble prediction systems (EPSs), archived as part of The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE), has recently been explored using a cyclone tracking approach. This paper provides a continuation of that work, extending the analysis to the Southern Hemisphere (SH). While the EPSs have larger errors in all cyclone properties in the SH, the relative performance of the different EPSs remains broadly consistent between the two hemispheres. Some interesting differences are also shown. The China Meteorological Administration (CMA) EPS has a significantly lower level of performance in the SH compared to the NH. Previous NH results showed that the Centro de Previsao de Tempo e Estudos Climaticos (CPTEC) EPS underpredicts cyclone intensity; the results of this study show that this bias is significantly larger in the SH. The CPTEC EPS also has very little spread in both hemispheres. As with the NH results, cyclone propagation speed is underpredicted by all the EPSs in the SH. To investigate this further, the bias was also computed for the ECMWF high-resolution deterministic forecast and found to be significantly smaller than that of the lower-resolution ECMWF EPS.
Abstract:
Background: Intrusions are common symptoms of both posttraumatic stress disorder (PTSD) and schizophrenia. Steel et al. (2005) suggest that an information processing style characterised by weak trait contextual integration renders psychotic individuals vulnerable to intrusive experiences. This ‘contextual integration hypothesis’ was tested in individuals reporting anomalous experiences in the absence of a need for care. Methods: Twenty-six low schizotypes and twenty-three individuals reporting anomalous experiences were shown a traumatic film with and without a concurrent visuo-spatial task. Participants rated post-traumatic intrusions for frequency and form, and completed self-report measures of information processing style. It was predicted that, due to their weaker trait contextual integration, the anomalous experiences group would (1) exhibit more intrusions following exposure to the trauma film; (2) display intrusions characterised by more PTSD qualities; and (3) show a greater reduction of intrusions with the concurrent visuo-spatial task. Results: As predicted, the anomalous experiences group reported a lower level of trait contextual integration and more intrusions than the low schizotypes, both immediately after watching the film and during the following seven days. Their post-traumatic intrusive memories were more PTSD-like (more intrusive, vivid and associated with emotion). The visuo-spatial task had no effect on the number of intrusions in either group. Conclusions: These findings provide some support for the proposal that weak trait contextual integration underlies the development of intrusions in both PTSD and psychosis.
Abstract:
In this paper we propose an efficient two-level model identification method for a large class of linear-in-the-parameters models from observational data. A new elastic net orthogonal forward regression (ENOFR) algorithm is employed at the lower level to carry out simultaneous model selection and elastic net parameter estimation. The two regularization parameters in the elastic net are optimized using a particle swarm optimization (PSO) algorithm at the upper level by minimizing the leave-one-out (LOO) mean square error (LOOMSE). Illustrative examples are included to demonstrate the effectiveness of the new approach.
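As a rough sketch of the two-level structure (synthetic data; a generic scikit-learn ElasticNet with brute-force LOO scoring stands in for ENOFR and its analytic LOOMSE, so every name below is illustrative rather than the authors' implementation):

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.standard_normal(50)

def loomse(log_alpha, l1_ratio):
    """Lower level: fit an elastic net and score it by LOO mean square error."""
    model = ElasticNet(alpha=10.0 ** log_alpha, l1_ratio=l1_ratio, max_iter=5000)
    scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    return -scores.mean()

# Upper level: a minimal PSO over (log10 alpha, l1_ratio).
n_particles, n_iters = 8, 15
lo_b, hi_b = np.array([-4.0, 0.05]), np.array([1.0, 0.95])
pos = lo_b + (hi_b - lo_b) * rng.random((n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pcost = np.array([loomse(*p) for p in pos])
gbest = pbest[pcost.argmin()]
for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo_b, hi_b)
    cost = np.array([loomse(*p) for p in pos])
    improved = cost < pcost
    pbest[improved], pcost[improved] = pos[improved], cost[improved]
    gbest = pbest[pcost.argmin()]

print("best (log10 alpha, l1_ratio):", gbest, "LOOMSE:", pcost.min())
```

The refit-per-left-out-point loop here is the expensive part that the paper avoids: computing the LOOMSE analytically is what makes the upper-level PSO search affordable.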
Abstract:
Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle in which the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and on low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to the developing warm and cold fronts. These PV anomalies are advected upwards and polewards by the large-scale warm conveyor belt flow, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. Finally, we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
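For reference, the two frictional contributions can be written out explicitly. A sketch in one standard form of the Ertel PV equation (sign conventions and the exact decomposition used in the paper may differ), with q = ρ⁻¹ ζ_a · ∇θ and F the frictional force per unit mass:

```latex
\left.\frac{Dq}{Dt}\right|_{\mathrm{fric}}
  = \frac{1}{\rho}\,\nabla\theta\cdot(\nabla\times\mathbf{F})
  = \underbrace{\frac{1}{\rho}\,\frac{\partial\theta}{\partial z}\,(\nabla\times\mathbf{F})_{z}}_{\text{Ekman generation}}
  \;+\;
  \underbrace{\frac{1}{\rho}\,\nabla_{h}\theta\cdot(\nabla\times\mathbf{F})_{h}}_{\text{baroclinic generation}}
```

The baroclinic term vanishes unless the frictional stress turns with height in the presence of horizontal temperature gradients, which is why it concentrates near the developing fronts.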
Abstract:
Cyclodextrins are water-soluble cyclic oligosaccharides consisting of six, seven, or eight α-(1,4)-linked glucopyranose subunits. This study reports the use of different cyclodextrins in eye drop formulations to improve the aqueous solubility and corneal permeability of riboflavin. Riboflavin is a poorly soluble drug, with a solubility of up to 0.08 mg/mL in deionized water. It is administered topically to the eye to mediate UV-induced corneal cross-linking in the treatment of keratoconus. Aqueous solutions of β-cyclodextrin (10–30 mg/mL) can enhance the solubility of riboflavin to 0.12–0.19 mg/mL, whereas a higher concentration of α-cyclodextrin (100 mg/mL) achieved a lower level of enhancement, 0.11 mg/mL. The other oligosaccharides were found to be ineffective for this purpose. In vitro diffusion experiments performed with fresh and cryopreserved bovine cornea demonstrated that β-cyclodextrin enhances riboflavin permeability. The mechanism of this enhancement was examined through microscopic histological analysis of the cornea and is discussed in this paper.
Abstract:
A two-stage linear-in-the-parameters model construction algorithm is proposed, aimed at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage, which constructs a sparse linear-in-the-parameters classifier. The prefiltering stage is a two-level process aimed at maximizing the model's generalization capability, in which a new elastic-net model identification algorithm using singular value decomposition is employed at the lower level, and two regularization parameters are then optimized using a particle-swarm-optimization algorithm at the upper level by minimizing the leave-one-out (LOO) misclassification rate. It is shown that the LOO misclassification rate based on the resultant prefiltered signal can be analytically computed without splitting the data set, and the associated computational cost is minimal due to orthogonality. The second stage of sparse classifier construction is based on orthogonal forward regression with the D-optimality algorithm. Extensive simulations illustrate the competitiveness of this approach for the classification of noisy data sets.
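The claim that the LOO criterion can be computed without splitting the data set rests on the standard leave-one-out identity for linear-in-the-parameters models. A sketch (exact for regularised least squares with hat matrix H; offered as the generic form rather than the paper's exact derivation):

```latex
\hat{\mathbf{y}} = \mathbf{H}\mathbf{y},\qquad
\mathbf{H} = \boldsymbol{\Phi}\bigl(\boldsymbol{\Phi}^{\mathsf{T}}\boldsymbol{\Phi} + \boldsymbol{\Lambda}\bigr)^{-1}\boldsymbol{\Phi}^{\mathsf{T}},\qquad
e_i^{(-i)} = \frac{y_i - \hat{y}_i}{1 - h_{ii}}
```

With an orthogonal decomposition of the regression matrix, the leverages h_ii reduce to sums of squares of orthonormal basis terms, which is presumably the orthogonality that keeps the computational cost minimal.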
Abstract:
Considerable progress has taken place in numerical weather prediction over the last decade. It has been possible to extend predictive skill in the extra-tropics of the Northern Hemisphere during winter from less than five days to seven days. Similar improvements, albeit at a lower level, have taken place in the Southern Hemisphere. Another example of improvement in the forecasts is the prediction of intense synoptic phenomena such as cyclogenesis, which on the whole is quite successful with the most advanced operational models (Bengtsson, 1989; Gadd and Kruze, 1988). A careful examination shows that there is no single cause of the improvements in predictive skill; instead they are due to several different factors encompassing the forecasting system as a whole (Bengtsson, 1985). In this paper we focus our attention on the role of data assimilation and the effect it may have on reducing the initial error and hence improving the forecast. The first part of the paper contains a theoretical discussion of error growth in simple data assimilation systems, following Leith (1983). In the second part we apply the results to actual forecast data from ECMWF. The potential for further forecast improvements within the framework of the present observing system in the two hemispheres is also discussed.
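To make the flavour of such a discussion concrete, here is an illustrative scalar analogue (an assumption-laden sketch, not Leith's actual formulation): let the analysis error variance E_a grow exponentially at rate α over the assimilation interval T, and let each analysis optimally blend the forecast with observations of error variance E_o:

```latex
E_f^{(k)} = E_a^{(k)}\,e^{2\alpha T},\qquad
E_a^{(k+1)} = \frac{E_f^{(k)}\,E_o}{E_f^{(k)} + E_o}
```

Iterating this map drives E_a to a fixed point set jointly by the growth rate, the cycle length and the observation quality, illustrating how better data assimilation reduces the initial error and hence extends predictive skill.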
Abstract:
A novel two-stage construction algorithm for linear-in-the-parameters classifiers is proposed, aimed at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage, which constructs a sparse linear-in-the-parameters classifier. For the first-stage learning that generates the prefiltered signal, a two-level algorithm is introduced to maximise the model's generalisation capability, in which an elastic net model identification algorithm using singular value decomposition is employed at the lower level, while the two regularisation parameters are selected at the upper level by maximising the Bayesian evidence using a particle swarm optimisation algorithm. Analysis is provided to demonstrate how “Occam's razor” is embodied in this approach. The second stage of sparse classifier construction is based on orthogonal forward regression with the D-optimality algorithm. Extensive experimental results demonstrate that the proposed approach is effective and yields competitive results for noisy data sets.
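For the Gaussian special case (a ridge-type prior rather than the elastic net, so the correspondence is an assumption; the ℓ1 term needs further approximation), the log evidence being maximised at the upper level has the standard closed form for y = Φw + ε, with M basis functions, N samples, prior precision α and noise precision β:

```latex
\ln p(\mathbf{y}\mid\alpha,\beta)
  = \tfrac{M}{2}\ln\alpha + \tfrac{N}{2}\ln\beta
  - \tfrac{\beta}{2}\lVert\mathbf{y}-\boldsymbol{\Phi}\mathbf{m}\rVert^{2}
  - \tfrac{\alpha}{2}\,\mathbf{m}^{\mathsf{T}}\mathbf{m}
  - \tfrac{1}{2}\ln\lvert\mathbf{A}\rvert
  - \tfrac{N}{2}\ln 2\pi,
\qquad
\mathbf{A} = \alpha\mathbf{I} + \beta\boldsymbol{\Phi}^{\mathsf{T}}\boldsymbol{\Phi},\quad
\mathbf{m} = \beta\,\mathbf{A}^{-1}\boldsymbol{\Phi}^{\mathsf{T}}\mathbf{y}
```

The Occam effect lives in the −½ ln|A| term, which penalises parameter settings that achieve their fit only through finely tuned weights.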
Abstract:
An efficient two-level model identification method aiming at maximising a model's generalisation capability is proposed for a large class of linear-in-the-parameters models from observational data. A new elastic net orthogonal forward regression (ENOFR) algorithm is employed at the lower level to carry out simultaneous model selection and elastic net parameter estimation. The two regularisation parameters in the elastic net are optimised using a particle swarm optimisation (PSO) algorithm at the upper level by minimising the leave-one-out (LOO) mean square error (LOOMSE). There are two original contributions. Firstly, an elastic net cost function is defined and applied based on orthogonal decomposition, which facilitates automatic model structure selection without the need for a predetermined error tolerance to terminate the forward selection process. Secondly, it is shown that the LOOMSE based on the resultant ENOFR models can be analytically computed without actually splitting the data set, and the associated computational cost is small due to the ENOFR procedure. Consequently, a fully automated procedure is achieved without resort to a separate validation data set for iterative model evaluation. Illustrative examples are included to demonstrate the effectiveness of the new approach.
Abstract:
Air frying is being projected as an alternative to deep fat frying for producing snacks such as French fries. In air frying, the raw potato sections are essentially heated in hot air containing fine oil droplets, which dehydrates the potato and aims to impart the characteristics of traditionally produced French fries, but with a substantially lower level of fat absorbed in the product. The aim of this research is to compare: 1) the process dynamics of air frying with conventional deep fat frying under otherwise similar operating conditions, and 2) the products formed by the two processes in terms of color, texture, microstructure, calorimetric properties and sensory characteristics. Air frying produced products with a substantially lower fat content but similar moisture content and color characteristics; however, it required much longer processing times, typically 21 minutes compared with 9 minutes for deep fat frying. The slower evolution of temperature also resulted in lower rates of moisture loss and of the color development reactions. DSC studies revealed that the extent of starch gelatinization was also lower in the air fried product. In addition, the two types of frying resulted in products having significantly different texture and sensory characteristics.
Abstract:
The challenge of moving past the classic Windows, Icons, Menus, Pointer (WIMP) interface, i.e. by turning it ‘3D’, has resulted in much research and development. To evaluate the impact of 3D on the ‘finding a target picture in a folder’ task, we built a 3D WIMP interface that allowed the systematic manipulation of visual depth, visual aids, and the semantic category distribution of targets versus non-targets, along with the detailed measurement of lower-level stimulus features. Across two separate experiments, a large-sample web-based experiment to understand associations and a controlled lab experiment using eye tracking to understand user focus, we investigated how visual depth, use of visual aids, use of semantic categories, and lower-level stimulus features (i.e. contrast, colour and luminance) impact how successfully participants are able to search for, and detect, the target image. Moreover, in the lab-based experiment we captured pupillometry measurements to allow consideration of the influence of increasing cognitive load, whether due to an increasing number of items on the screen or to the inclusion of visual depth. Our findings showed that increasing the visible layers of depth, and the inclusion of converging lines, did not impact target detection times, errors, or failure rates. Low-level features, including colour, luminance, and number of edges, did correlate with differences in target detection times, errors, and failure rates. Our results also revealed that semantic sorting algorithms significantly decreased target detection times. Increased semantic contrast between a target and its neighbours correlated with an increase in detection errors. Finally, the pupillometric data did not provide evidence of any correlation between the number of visible layers of depth and pupil size; however, using structural equation modelling, we demonstrated that cognitive load does influence detection failure rates when there is luminance contrast between the target and its surrounding neighbours. These results suggest that WIMP interaction designers should consider stimulus-driven factors, which were shown to influence the efficiency with which a target icon can be found in a 3D WIMP interface.
Abstract:
Water table response to rainfall was investigated at six sites in the Upper, Middle and Lower Chalk of southern England. Daily time series of rainfall and borehole water level were cross-correlated to investigate seasonal variations in groundwater-level response times, based on periods of 3-month duration. The time lags (in days) yielding significant correlations were compared with the average unsaturated zone thickness during each 3-month period. In general, when the unsaturated zone was more than 18 m thick, the time lag for a significant water-level response increased rapidly once the depth to the water table exceeded a critical value, which varied from site to site. For shallower water tables, a linear relationship between the depth to the water table and the water-level response time was evident. The observed variations in response time can only be partially accounted for using a diffusive model of propagation through the unsaturated matrix, suggesting that some fissure flow was occurring. The majority of rapid responses were observed during the winter/spring recharge period, when the unsaturated zone is thinnest and its moisture content is highest, and were more likely to occur when the rainfall intensity exceeded 5 mm/day. At some sites, a very rapid response within 24 h of rainfall was observed in addition to the longer-term responses, even when the unsaturated zone was up to 64 m thick; this response was generally associated with the autumn period. The results of the cross-correlation analysis provide statistical support for the presence of fissure flow and for the contribution of multiple pathways through the unsaturated zone to groundwater recharge.
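The cross-correlation procedure can be sketched in a few lines: shift the water-level series against rainfall and take the lag of maximum correlation as the response time. A toy illustration with synthetic daily series (hypothetical values, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)
days = 365
rain = np.maximum(rng.standard_normal(days), 0.0)   # synthetic daily rainfall
t = np.arange(60)
kernel = np.exp(-0.5 * ((t - 20.0) / 4.0) ** 2)     # response peaking ~20 days later
level = np.convolve(rain, kernel)[:days]            # synthetic water-level series

def lagged_corr(a, b, max_lag):
    """Correlation of a[t] with b[t + k] for lags k = 0 .. max_lag - 1."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return [np.mean(a[:len(a) - k] * b[k:]) for k in range(max_lag)]

r = lagged_corr(rain, level, 40)
print(int(np.argmax(r)))    # lag of peak correlation, ~20 days here
```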
Abstract:
The paper explores the impact of insect-resistant Bacillus thuringiensis (Bt) cotton on costs and returns over the first two seasons of its commercial release in three sub-regions of Maharashtra State, India. It is the first such research conducted in India based on farmers' own practices rather than trial plots. Data were collected for a total of 7793 cotton plots in 2002 and 1577 plots in 2003. Results suggest that while the cost of cotton seed was much higher for farmers growing Bt cotton relative to those growing non-Bt cotton, the costs of bollworm spray were much lower. While Bt plots had greater costs (seed plus insecticide) than non-Bt plots, the yields and revenue from Bt plots were much higher than those of non-Bt plots (some 39% and 63% higher in 2002 and 2003, respectively). Overall, the gross margins of Bt plots were some 43% (2002) and 73% (2003) higher than those of non-Bt plots, although there was some variation between the three sub-regions of the state. The results suggest that Bt cotton has provided substantial benefits for farmers in India over the 2 years, but there are questions as to whether these benefits are sustainable.