182 results for standard batch algorithms


Relevance:

20.00%

Publisher:

Abstract:

The A-Train constellation of satellites provides a new capability to measure vertical cloud profiles, yielding more detailed information on ice-cloud microphysical properties than has been possible up to now. A variational radar–lidar ice-cloud retrieval algorithm (VarCloud) takes advantage of the complementary nature of the CloudSat radar and the Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar to provide a seamless retrieval of ice water content, effective radius, and extinction coefficient, from the thinnest cirrus (seen only by the lidar) to the thickest ice cloud (penetrated only by the radar). In this paper, several versions of the VarCloud retrieval are compared with the CloudSat standard ice-only retrieval of ice water content, with two empirical formulas that derive ice water content from radar reflectivity and temperature, and with retrievals of vertically integrated properties from the Moderate Resolution Imaging Spectroradiometer (MODIS). The retrieved variables typically agree to within a factor of 2 on average, and most of the differences can be explained by the different microphysical assumptions. For example, the ice water content comparison illustrates the sensitivity of the retrievals to the assumed ice particle shape: if ice particles are modeled as oblate spheroids rather than spheres for radar scattering, the retrieved ice water content is reduced by 50% on average in clouds with a reflectivity factor larger than 0 dBZ. VarCloud retrieves optical depths that are on average a factor of 2 lower than those from MODIS, which can be explained by the different assumptions on particle mass and area; if VarCloud mimics the MODIS assumptions, better agreement is found in effective radius, and optical depth is overestimated. However, for the same retrievals, MODIS predicts the mean vertically integrated ice water content to be around a factor of 3 lower than that from VarCloud, because the MODIS algorithm assumes that its retrieved effective radius (mostly representative of cloud top) is constant throughout the depth of the cloud. These comparisons highlight the need to refine the microphysical assumptions in all retrieval algorithms, and the need for future studies to compare not only mean values but also full probability density functions.
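The Z–T regressions mentioned above take a simple log-linear form, which can be sketched as follows; the function name and the coefficients `a`, `b`, `c` are illustrative placeholders, not the published values of either empirical formula.

```python
def iwc_from_z_t(z_dbz, t_celsius, a=0.06, b=-0.02, c=-1.7):
    """Empirical ice water content (g m^-3) from radar reflectivity
    (dBZ) and temperature (deg C), in the log-linear form
    log10(IWC) = a*Z + b*T + c typical of Z-T regressions.
    Coefficients are illustrative, not the published fits.
    """
    return 10.0 ** (a * z_dbz + b * t_celsius + c)
```

The published formulas differ mainly in their fitted coefficients and cross terms; the shared structure is that ice water content grows with reflectivity at a rate modulated by temperature.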

Abstract:

Classical measures of network connectivity are the number of disjoint paths between a pair of nodes and the size of a minimum cut. For standard graphs, these measures can be computed efficiently using network-flow techniques. However, in the Internet at the level of autonomous systems (ASs), referred to as the AS-level Internet, routing policies impose restrictions on the paths that traffic can take in the network. These restrictions can be captured by the valley-free path model, which assumes a special directed graph model in which edge types represent relationships between ASs. We consider the adaptation of the classical connectivity measures to the valley-free path model, where they are NP-hard to compute. Our first main contribution consists of presenting algorithms for computing disjoint paths and minimum cuts in the valley-free path model. These algorithms are useful for ASs that want to evaluate different options for selecting upstream providers to improve the robustness of their connection to the Internet. Our second main contribution is an experimental evaluation of our algorithms on four types of directed graph models of the AS-level Internet produced by different inference algorithms. Most importantly, the evaluation shows that our algorithms are able to compute optimal solutions to realistically sized instances of the connectivity problems in the valley-free path model in reasonable time. Furthermore, our experimental results provide information about the characteristics of the directed graph models of the AS-level Internet produced by different inference algorithms. It turns out that (i) we can quantify the difference between the undirected AS-level topology and the directed graph models with respect to fundamental connectivity measures, and (ii) the different inference algorithms yield topologies that are similar with respect to connectivity but differ with respect to the types of paths that exist between pairs of ASs.
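For standard graphs, the disjoint-path count the abstract starts from reduces, via Menger's theorem, to a unit-capacity maximum flow. A minimal sketch of that classical computation (the paper's valley-free algorithms are not reproduced here; the function name and edge-list input are illustrative choices):

```python
from collections import defaultdict, deque

def edge_disjoint_paths(edges, s, t):
    """Count edge-disjoint s-t paths in a directed graph as a
    unit-capacity max flow (Edmonds-Karp with BFS augmentation)."""
    cap = defaultdict(int)   # residual capacities
    adj = defaultdict(set)   # residual adjacency (both directions)
    for u, v in edges:
        cap[(u, v)] += 1
        adj[u].add(v)
        adj[v].add(u)
    flow = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow          # no augmenting path: flow = #disjoint paths
        v = t                    # push one unit along the path found
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1
```

By max-flow/min-cut duality the same routine yields the minimum-cut size; in the valley-free model this equivalence breaks down, which is why the paper needs dedicated algorithms.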

Abstract:

A new database of weather and circulation type catalogs is presented, comprising 17 automated classification methods and five subjective classifications. It was compiled within COST Action 733 "Harmonisation and Applications of Weather Type Classifications for European Regions" in order to evaluate different methods for weather and circulation type classification. This paper gives a technical description of the included methods, using a new conceptual categorization that reflects the strategy used to define the types. Methods using predefined types include manual and threshold-based classifications, while methods deriving types from the input data include those based on eigenvector techniques, leader algorithms and optimization algorithms. To allow direct comparisons between the methods, the circulation input data and the methods' configurations were harmonized to produce a subset of standard catalogs for the automated methods. The harmonization covers the data source, the climatic parameters used, the classification period, as well as the spatial domain and the number of types. Frequency-based characteristics of the resulting catalogs are presented, including the variation of class sizes, persistence, seasonal and inter-annual variability, and trends of the annual frequency time series. The methodological concept of the classifications is partly reflected in these properties of the resulting catalogs. It is shown that, compared to automated methods, subjective classifications produce types with higher persistence, inter-annual variation and long-term trends. Among the automated classifications, optimization methods show a tendency toward longer persistence and higher seasonal variation. However, it is also concluded that the distance metric used and the data preprocessing play at least as important a role in the properties of the resulting classification as the algorithm used for type definition and assignment.
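Of the type-defining strategies catalogued above, the leader algorithm is the simplest to state: the first field founds a type, and each later field joins the nearest existing leader within a threshold or founds a new type. A minimal sketch under an assumed Euclidean metric; the names and the threshold are illustrative, and, as the abstract concludes, the choice of distance metric and preprocessing matters at least as much as the algorithm itself:

```python
def euclidean(a, b):
    """Assumed distance metric for this sketch."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def leader_classify(fields, threshold, dist=euclidean):
    """Assign each circulation field (a flat tuple of grid values)
    to a type with the leader strategy: join the nearest leader if it
    is within `threshold`, otherwise start a new type."""
    leaders, labels = [], []
    for f in fields:
        if leaders:
            d, k = min((dist(f, l), k) for k, l in enumerate(leaders))
            if d <= threshold:
                labels.append(k)
                continue
        leaders.append(f)          # f founds a new type
        labels.append(len(leaders) - 1)
    return labels, leaders
```

Note that the result depends on the presentation order of the fields, one reason the harmonized configuration (data source, period, domain, number of types) matters for comparability.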

Abstract:

This paper uses the large-scale Cranet data to explore the extent of non-standard working time (NSWT) across Europe and to highlight the contrasts and similarities between two different varieties of capitalism (coordinated market economies and liberal market economies). We explore variations in the extent of different forms of NSWT (overtime, shift working and weekend working) within these two forms of capitalism, controlling for firm size, sector and the extent of employee voice. Overall, there was no strong link between the variety of capitalism and the use of overtime and weekend working, though shift working showed a clear distinction between the two varieties of capitalism. Usage of NSWT in some service sectors was particularly high under both forms of capitalism, and service-sector activities had a particularly marked influence on the use of overtime in liberal market economies. Surprisingly, strong employee voice was associated with greater use of NSWT.

Abstract:

This study examines the numerical accuracy, computational cost, and memory requirements of self-consistent field theory (SCFT) calculations when the diffusion equations are solved with various pseudo-spectral methods and the mean-field equations are iterated with Anderson mixing. The different methods are tested on the triply periodic gyroid and spherical phases of a diblock-copolymer melt over a range of intermediate segregations. Anderson mixing is found to be somewhat less effective with the pseudo-spectral methods than with the full-spectral method, but it nevertheless functions admirably well provided that a large number of histories is used. Of the different pseudo-spectral algorithms, the 4th-order one of Ranjan, Qin and Morse performs best, although not quite as efficiently as the full-spectral method.
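The pseudo-spectral methods compared here are all built from operator-splitting steps for the modified diffusion equation dq/ds = ∇²q − w q, alternating an exact diffusion step in Fourier space with potential half-steps in real space. A 1-D second-order sketch (the production calculations are triply periodic, and the 4th-order Ranjan–Qin–Morse scheme composes refined versions of a step like this one):

```python
import numpy as np

def diffusion_step(q, w, ds, length):
    """One second-order split-step of dq/ds = d2q/dx2 - w(x) q on a
    periodic 1-D cell: potential half-step, exact diffusion in
    k-space, potential half-step."""
    n = q.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=length / n)  # wavenumbers
    half = np.exp(-0.5 * ds * w)                       # potential half-step
    qk = np.fft.fft(half * q)
    qk *= np.exp(-ds * k ** 2)                         # exact diffusion
    return half * np.fft.ifft(qk).real
```

Repeating this step along the chain contour builds the propagator q(x, s) whose accuracy, cost and memory footprint the study compares across schemes.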

Abstract:

This paper reports the findings of a small-scale research project which investigated the levels of awareness and knowledge of written standard English among 10- and 11-year-old children in two English primary schools. The project involved repeating in 2010 a written questionnaire previously used with children in the same schools in three separate surveys in 1999, 2002 and 2005. Data from the latest survey are compared with those from the previous three. The analysis seeks to identify any changes over time in children's ability to recognise non-standard forms and supply standard English alternatives, as well as in their ability to use technical terms related to language variation. Differences between the performance of boys and girls, and between the two schools, are also analysed. The paper concludes that the socio-economic context of the schools may be a more important factor than gender in the variations over time identified in the data.

Abstract:

Deep brain stimulation (DBS) is a treatment routinely used to alleviate the symptoms of Parkinson's disease (PD). In this type of treatment, electrical pulses are applied through electrodes implanted into the basal ganglia of the patient. As the symptoms are not permanent in most patients, it is desirable to develop an on-demand stimulator that applies pulses only when the onset of symptoms is detected. This study evaluates a feature set created for the detection of tremor, a cardinal symptom of PD. The designed feature set was based on standard signal features and researched properties of the electrical signals recorded from the subthalamic nucleus (STN) within the basal ganglia, together covering temporal, spectral, statistical, autocorrelation and fractal properties. The most discriminative tremor-related features were selected using statistical testing and backward selection algorithms, and were then used for classification on unseen patient signals. The spectral features were among the most efficient at detecting tremor; notably, the spectral bands 3.5-5.5 Hz and 0-1 Hz proved to be highly significant. The classification results for the detection of tremor achieved 94% sensitivity with a specificity of one.
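The band-limited spectral power that proved most discriminative (e.g. 3.5-5.5 Hz) can be computed with a plain periodogram. A sketch assuming a uniformly sampled signal; `band_power` and its normalisation are illustrative choices, not the paper's exact feature definition:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Relative spectral power of `signal` (sampled at `fs` Hz) in the
    band [lo, hi] Hz, from a plain periodogram."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = spec[(freqs >= lo) & (freqs <= hi)].sum()
    return band / spec[1:].sum()   # normalise, ignoring the DC term
```

An on-demand stimulator would evaluate features like this over a sliding window of the STN recording and trigger stimulation when the classifier flags tremor onset.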

Abstract:

The recent increase in short messaging system (SMS) text messaging among school-aged children, often using abbreviated, non-conventional ‘textisms’ (e.g. ‘2nite’), has raised fears of negative consequences of such technology for literacy. The current research used a paradigm developed by Dixon and Kaminska, who showed that exposure to phonetically plausible misspellings (e.g. ‘recieve’) negatively affected subsequent spelling performance, though this was true only for adults, not children. The current research extends this work to directly investigate the effects of exposure to textisms, misspellings and correctly spelled words on adults’ spelling. Spelling of a set of key words was assessed both before and after an exposure phase in which participants read the same key words presented either as textisms (e.g. ‘2nite’), correctly spelled words (e.g. ‘tonight’) or misspelled words (e.g. ‘tonite’). Analysis showed that scores decreased from pre- to post-test following exposure to misspellings, whereas performance improved following exposure to correctly spelled words and, interestingly, to textisms. The data suggest that exposure to textisms, unlike misspellings, had a positive effect on adults’ spelling. These findings are interpreted in light of other recent research suggesting a positive relationship between texting and some literacy measures in school-aged children.

Abstract:

Trends in China's energy future will have considerable consequences for both China and the global environment. Though China's carbon emissions are low on a per capita basis, China is already the world's second-largest producer of carbon, behind only the United States. China's buildings sector currently accounts for 23% of the country's total energy use and is projected to account for one-third by 2010. Energy policy plays an important role in China's sustainable development. The purpose of this study is to provide a broad overview of energy-efficiency issues in the built environment in China. The paper first briefly reviews the key national policies related to the built environment, demonstrating the government's environmental concern. It then introduces recent energy policies in the built environment, reviewing energy efficiency and renewable energy, the key issues of national energy policy, and discusses the implementation of these policies.

Abstract:

Controllers for feedback substitution schemes exhibit a trade-off between noise power gain and normalized response time. Using as an example the design of a controller for a radiometric transduction process subject to arbitrary noise-power-gain and robustness constraints, a Pareto front of optimal controller solutions fulfilling a range of time-domain design objectives can be derived. In this work, we consider designs using a loop-shaping design procedure (LSDP). The approach uses linear matrix inequalities to specify a range of objectives and a multi-objective genetic algorithm (MOGA) to optimize the controller weights. A clonal selection algorithm is used to further direct the GA's search towards the Pareto front. We demonstrate that, with the proposed methodology, it is possible to design higher-order controllers with superior performance in terms of response time, noise power gain and robustness.
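The bookkeeping at the heart of any multi-objective GA is the dominance filter that maintains the Pareto front. A minimal sketch for minimised objectives (e.g. response time and noise power gain); the GA and clonal-selection machinery themselves are not reproduced here:

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`, where each point is
    a tuple of objective values to be minimised. A point is dominated
    if another point is no worse in every objective and strictly
    better in at least one."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

In the MOGA loop this filter is applied to each generation of candidate controller weights, and the surviving front is what the designer inspects to trade response time against noise power gain and robustness.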