894 results for Prediction of scholastic success


Relevance: 100.00%

Abstract:

This article discusses the sources of competitive advantage in the interwar British radio industry. Specifically, it examines why sections of the industry that reaped substantial monopoly rents from the downstream value chain failed to dominate the industry. During the 1920s Marconi (which controlled the fundamental UK patents) had a key cost advantage, as had other members of the ‘Big Six’ electrical engineering firms which formed the BBC and were granted preferential royalties. Meanwhile the valve manufacturers' cartel was also able to extract high rents from set manufacturers. The vertical integration literature suggests that input monopolists have incentives to control downstream production. Yet—in contrast to the gramophone industry, which became concentrated into two huge companies following market saturation in the 1930s—radio retained a much more competitive structure. The Big Six failed to capitalize fully on their initial cost advantages owing to logistical weaknesses in supplying markets subject to rapid technical and design obsolescence. Subsequently, during the 1930s, marketing innovations are shown to have played a key role in allowing several independents to establish successful brands. This gave them sufficient scale to provide strong bargaining positions with input suppliers, negating most of their initial cost disadvantage.

Relevance: 100.00%

Abstract:

Some of the techniques used to model nitrogen (N) and phosphorus (P) discharges from a terrestrial catchment to an estuary are discussed and applied to the River Tamar and Tamar Estuary system in Southwest England, U.K. Data are presented for dissolved inorganic nutrient concentrations in the Tamar Estuary and compared with those from the contrasting, low turbidity and rapidly flushed Tweed Estuary in Northeast England. In the Tamar catchment, simulations showed that effluent nitrate loads for typical freshwater flows contributed less than 1% of the total N load. The effect of effluent inputs on ammonium loads was more significant (∼10%). Cattle, sheep and permanent grassland dominated the N catchment export, with diffuse-source N export greatly dominating that due to point sources. Cattle, sheep, permanent grassland and cereal crops generated the greatest rates of diffuse-source P export. This reflected the higher rates of P fertiliser applications to arable land and the susceptibility of bare, arable land to P export in wetter winter months. N and P export to the Tamar Estuary from human sewage was insignificant. Non-conservative behaviour of phosphate was particularly marked in the Tamar Estuary. Silicate concentrations were slightly less than conservative levels, whereas nitrate was essentially conservative. The coastal sea acted as a sink for these terrestrially derived nutrients. A pronounced sag in dissolved oxygen that was associated with strong nitrite and ammonium peaks occurred in the turbidity maximum region of the Tamar Estuary. Nutrient behaviour within the Tweed was very different. The low turbidity and rapid flushing ensured that nutrients there were essentially conservative, so that flushing of nutrients to the coastal zone from the river occurred with little estuarine modification.
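
The diffuse-source accounting described above is commonly done with export-coefficient models, in which each land-use or livestock category contributes a fixed annual load per hectare or per head. A minimal sketch of that bookkeeping follows; the coefficients, counts and category names are purely illustrative and are not taken from the paper.

```python
# Export-coefficient sketch for diffuse nutrient loads.
# All coefficients and inventory figures are illustrative, not the paper's values.

# kg N per head (livestock) or per hectare (land use) per year -- hypothetical.
N_EXPORT_COEFF = {
    "cattle": 10.0,               # kg N per head per year
    "sheep": 2.0,                 # kg N per head per year
    "permanent_grassland": 15.0,  # kg N per ha per year
    "cereal_crops": 20.0,         # kg N per ha per year
}

# Illustrative catchment inventory: head counts for livestock, hectares for land.
inventory = {
    "cattle": 50_000,
    "sheep": 120_000,
    "permanent_grassland": 40_000,
    "cereal_crops": 8_000,
}

def diffuse_load(coeffs, inventory):
    """Total annual diffuse-source load (kg/yr): sum of coefficient * extent."""
    return sum(coeffs[k] * inventory[k] for k in coeffs)

total_n = diffuse_load(N_EXPORT_COEFF, inventory)
print(f"Diffuse N export: {total_n / 1e6:.2f} kt N per year")
```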

Relevance: 100.00%

Abstract:

Matei et al. (Reports, 6 January 2012, p. 76) claim to show skillful multiyear predictions of the Atlantic Meridional Overturning Circulation (AMOC). However, these claims are not justified, primarily because the predictions of AMOC transport do not outperform simple reference forecasts based on climatological annual cycles. Accordingly, there is no justification for the “confident” prediction of a stable AMOC through 2014.
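
The comment's central test, comparing a prediction against a simple reference forecast, is conventionally expressed as a mean-square skill score: a forecast that cannot beat the climatological annual cycle scores zero or below. A minimal sketch with synthetic data (the series below are illustrative, not the AMOC transports):

```python
import numpy as np

def msss(forecast, reference, observed):
    """Mean-square skill score: 1 - MSE(forecast) / MSE(reference).
    Positive values mean the forecast beats the reference; <= 0 means no skill."""
    mse_f = np.mean((forecast - observed) ** 2)
    mse_r = np.mean((reference - observed) ** 2)
    return 1.0 - mse_f / mse_r

# Synthetic example: observations with an annual cycle plus noise.
rng = np.random.default_rng(0)
t = np.arange(120)  # months
obs = 18 + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)
climatology = 18 + 3 * np.sin(2 * np.pi * t / 12)       # reference forecast
prediction = climatology + rng.normal(0, 1.2, t.size)   # adds noise, no information

print(f"Skill vs climatology: {msss(prediction, climatology, obs):+.2f}")
# A score near or below zero indicates the prediction does not outperform the
# climatological annual cycle -- the criterion applied in the comment.
```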

Relevance: 100.00%

Abstract:

The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a three-dimensional variational scheme in which the visibility observation operator is a highly nonlinear function of humidity, aerosol and temperature. A quality control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6-12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1-5 km) and is still significant at longer forecast times.
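
To illustrate why the observation operator is so nonlinear, the sketch below applies Koschmieder's law (visibility = 3.912 / extinction) to an aerosol extinction that grows sharply as relative humidity approaches saturation. This is only a schematic stand-in for the operational Met Office formulation; the dry extinction efficiency and growth exponent are illustrative values.

```python
import numpy as np

def visibility_operator(aerosol_mass, rel_humidity):
    """Schematic visibility observation operator (not the operational
    formulation): Koschmieder's law with a simple hygroscopic-growth factor."""
    k_dry = 5.0e3  # dry mass extinction efficiency, m^2/kg -- illustrative
    # Hygroscopic growth: extinction rises steeply as RH -> 1 (illustrative exponent).
    growth = (1.0 - np.clip(rel_humidity, 0.0, 0.99)) ** -0.6
    extinction = k_dry * aerosol_mass * growth  # per metre
    return 3.912 / extinction  # Koschmieder's law, 2% contrast threshold

# The strong sensitivity to RH near saturation is why visibility observations
# can produce sizeable humidity increments in the variational assimilation.
for rh in (0.5, 0.9, 0.99):
    vis = visibility_operator(aerosol_mass=2.0e-8, rel_humidity=rh)
    print(f"RH={rh:.2f}: visibility ~ {vis / 1000:.1f} km")
```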

Relevance: 100.00%

Abstract:

The ECMWF operational grid point model (with a resolution of 1.875° of latitude and longitude) and its limited-area version (with a resolution of 0.47° of latitude and longitude), with boundary values from the global model, have been used to study the simulation of typhoon Tip. The fine-mesh model was capable of simulating the main structural features of the typhoon and predicting a fall in central pressure of 60 mb in 3 days. The structure of the forecast typhoon, with a warm core (maximum potential temperature anomaly 17 K), intense swirling winds (maximum 55 m s⁻¹ at 850 mb) and spiralling precipitation patterns, is characteristic of a tropical cyclone. Comparison with the lower resolution forecast shows that horizontal resolution is a determining factor in predicting not only the structure and intensity but even the movement of these vortices. However, an accurate and refined initial analysis is considered to be a prerequisite for a correct forecast of this phenomenon.
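
To see why the two resolutions behave so differently, it helps to convert the grid spacings to kilometres. The quick calculation below uses standard spherical geometry and nothing model-specific; 20°N is an illustrative latitude for a western Pacific typhoon.

```python
import math

EARTH_RADIUS_KM = 6371.0

def grid_spacing_km(resolution_deg, latitude_deg):
    """Approximate north-south and east-west spacing of a lat-lon grid."""
    dlat = math.radians(resolution_deg) * EARTH_RADIUS_KM
    dlon = dlat * math.cos(math.radians(latitude_deg))
    return dlon, dlat

for res in (1.875, 0.47):
    dlon, dlat = grid_spacing_km(res, latitude_deg=20.0)
    print(f"{res}° grid: ~{dlon:.0f} km x {dlat:.0f} km")
# At roughly 200 km spacing the global grid cannot resolve a tropical-cyclone
# inner core a few tens of km across; the ~50 km limited-area grid at least
# begins to capture it, consistent with the resolution sensitivity reported.
```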

Relevance: 100.00%

Abstract:

A series of numerical models has been used to investigate the predictability of atmospheric blocking for an episode selected from FGGE Special Observing Period I. Level II-b FGGE data were used in the experiment. The blocking took place over the North Atlantic region and is a very characteristic example of a winter blocking high. It is found that the very high resolution models developed at ECMWF manage, remarkably, to predict the blocking event in great detail, even beyond one week. Although models with much lower resolution predict the blocking phenomenon as such, the actual evolution differs greatly from that observed, and their practical value is consequently much reduced. Wind observations from the geostationary satellites, as well as an extension of the integration domain to the whole globe, are shown to have a substantial impact on the forecast beyond 5 days. Quasi-geostrophic baroclinic models and, even more so, barotropic models are totally inadequate for predicting blocking except in its initial phase. The prediction experiment illustrates clearly that the efforts which have gone into improving numerical prediction models over the last decades have been worthwhile.

Relevance: 100.00%

Abstract:

One of the fundamental questions in dynamical meteorology, and one of the basic objectives of GARP, is to determine the predictability of the atmosphere. In the early planning and preparation stages of GARP, a number of theoretical and numerical studies were undertaken, indicating that there exists an inherent unpredictability in the atmosphere which, even with the most ideal observing system, would limit useful weather forecasting to 2-3 weeks.
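
The 2-3 week limit follows from simple error-growth arithmetic: if analysis errors grow exponentially with a doubling time of a few days, even a small initial error reaches the size of the natural variability (saturation) within a couple of weeks. The numbers in the sketch below are chosen for exposition, not taken from the GARP studies.

```python
import math

def days_to_saturation(initial_error, saturation_error, doubling_days):
    """Days for an exponentially growing error to reach saturation."""
    return doubling_days * math.log2(saturation_error / initial_error)

# Illustrative numbers: initial analysis error 2% of the saturation level,
# error doubling time ~2.5 days.
print(f"{days_to_saturation(0.02, 1.0, 2.5):.0f} days")  # ~14 days
# Halving the initial error buys only one extra doubling time (~2.5 days),
# which is why even an ideal observing system cannot push useful forecasts
# much beyond 2-3 weeks.
```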

Relevance: 100.00%

Abstract:

The FunFOLD2 server is a new independent server that integrates our novel protein–ligand binding site and quality assessment protocols for the prediction of protein function (FN) from sequence via structure. Our guiding principles were, first, to provide a simple unified resource to make our function prediction software easily accessible to all via a simple web interface and, second, to produce integrated output for predictions that can be easily interpreted. The server provides a clean web interface so that results can be viewed on a single page and interpreted by non-experts at a glance. The output for the prediction is an image of the top predicted tertiary structure annotated to indicate putative ligand-binding site residues. The results page also includes a list of the most likely binding site residues and the types of predicted ligands and their frequencies in similar structures. The protein–ligand interactions can also be interactively visualized in 3D using the Jmol plug-in. The raw machine readable data are provided for developers, which comply with the Critical Assessment of Techniques for Protein Structure Prediction data standards for FN predictions. The FunFOLD2 webserver is freely available to all at the following web site: http://www.reading.ac.uk/bioinf/FunFOLD/FunFOLD_form_2_0.html.
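
For batch submissions one might drive the web form programmatically. The sketch below is purely illustrative: the form field names (`sequence`, `email`) and the POST target are hypothetical placeholders, not documented FunFOLD2 parameters, so the form's actual HTML should be inspected before relying on any of this.

```python
import requests  # third-party HTTP library

# URL of the FunFOLD2 submission form (taken from the abstract above).
FUNFOLD2_FORM = "http://www.reading.ac.uk/bioinf/FunFOLD/FunFOLD_form_2_0.html"

# Hypothetical field names -- inspect the real form for the actual ones.
payload = {
    "sequence": ">query\nMKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
    "email": "user@example.org",
}

resp = requests.post(FUNFOLD2_FORM, data=payload, timeout=60)
resp.raise_for_status()
# The server replies with an HTML page; results are normally viewed in a
# browser, where the annotated structure and the Jmol 3D view are rendered.
print(resp.status_code)
```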

Relevance: 100.00%

Abstract:

We present an efficient graph-based algorithm for quantifying the similarity of household-level energy use profiles, using a notion of similarity that allows for small time-shifts when comparing profiles. Experimental results on a real smart meter data set demonstrate that in cases of practical interest our technique is far faster than the existing method for computing the same similarity measure. Having a fast algorithm for measuring profile similarity improves the efficiency of tasks such as clustering of customers and cross-validation of forecasting methods using historical data. Furthermore, we apply a generalisation of our algorithm to produce substantially better household-level energy use forecasts from historical smart meter data.
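
The paper's graph-based method is not reproduced here; as a point of reference, the sketch below implements a standard dynamic-programming alignment with a banded constraint, which likewise tolerates small time-shifts between profiles. The band width and the toy profiles are illustrative.

```python
import numpy as np

def banded_alignment_distance(a, b, max_shift=2):
    """Dynamic-programming alignment distance between two equal-length profiles,
    allowing matches only within +/- max_shift time steps (a Sakoe-Chiba band).
    A standard reference method, not the paper's graph-based algorithm."""
    n = len(a)
    INF = float("inf")
    d = np.full((n + 1, n + 1), INF)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        lo = max(1, i - max_shift)          # restrict j to the band around i
        hi = min(n, i + max_shift)
        for j in range(lo, hi + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, n]

# Two half-hourly profiles identical up to a one-slot shift score as very similar.
x = np.array([0, 0, 5, 5, 1, 0, 0, 0], dtype=float)
y = np.roll(x, 1)
print(banded_alignment_distance(x, y, max_shift=2))  # 0.0: shift absorbed
```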

Relevance: 100.00%

Abstract:

Historic geomagnetic activity observations have been used to reveal centennial variations in the open solar flux and the near-Earth heliospheric conditions (the interplanetary magnetic field and the solar wind speed). The various methods are in very good agreement for the past 135 years when there were sufficient reliable magnetic observatories in operation to eliminate problems due to site-specific errors and calibration drifts. This review underlines the physical principles that allow these reconstructions to be made, as well as the details of the various algorithms employed and the results obtained. Discussion is included of: the importance of the averaging timescale; the key differences between “range” and “interdiurnal variability” geomagnetic data; the need to distinguish source field sector structure from heliospherically-imposed field structure; the importance of ensuring that regressions used are statistically robust; and uncertainty analysis. The reconstructions are exceedingly useful as they provide calibration between the in-situ spacecraft measurements from the past five decades and the millennial records of heliospheric behaviour deduced from measured abundances of cosmogenic radionuclides found in terrestrial reservoirs. Continuity of open solar flux, using sunspot number to quantify the emergence rate, is the basis of a number of models that have been very successful in reproducing the variation derived from geomagnetic activity. These models allow us to extend the reconstructions back to before the development of the magnetometer and to cover the Maunder minimum. Allied to the radionuclide data, the models are revealing much about how the Sun and heliosphere behaved outside of grand solar maxima and are providing a means of predicting how solar activity is likely to evolve now that the recent grand maximum (that had prevailed throughout the space age) has come to an end.
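
The continuity models referred to in the closing sentences take the general form dF/dt = S(R) - F/tau: open solar flux F is raised by an emergence source S scaled from the sunspot number R and decays on a loss timescale tau. The sketch below integrates that equation under those assumptions; the source scaling, timescale and synthetic sunspot cycle are illustrative, not any published calibration.

```python
import numpy as np

def integrate_open_flux(sunspot_number, source_per_spot=1.0e12,
                        loss_timescale_yr=3.0, dt_yr=1 / 12, f0=0.0):
    """Integrate dF/dt = S(R) - F/tau with a forward-Euler step.
    Scalings are illustrative placeholders, not a published calibration."""
    f = f0
    flux = []
    for r in sunspot_number:
        source = source_per_spot * r                    # emergence rate from sunspots
        f += (source - f / loss_timescale_yr) * dt_yr   # continuity equation step
        flux.append(f)
    return np.array(flux)

# Synthetic 11-year sunspot cycle, monthly steps over 50 years.
t = np.arange(0, 50, 1 / 12)
r = 80 * (1 + np.sin(2 * np.pi * t / 11)) / 2
f = integrate_open_flux(r)
print(f"Open flux varies between {f[120:].min():.2e} and {f[120:].max():.2e} Wb")
# Because tau smooths the sunspot forcing, the modelled open flux varies more
# slowly than the cycle itself -- the property that lets such models be run
# backwards to the Maunder minimum from sunspot records alone.
```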