33 results for Problems of Computer Intellectualization


Relevance:

100.00%

Publisher:

Abstract:

Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts of the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. the primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km and a vertical extension of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5°, 15 vertical levels and coverage of the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat, release of latent heat, etc. in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre’s model.
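For orientation, the barotropic relation referred to above can be written in its standard textbook form (the notation is supplied here for the reader and is not taken from the Centre's own documentation):

\[
\frac{d}{dt}\left(\zeta + f\right) = 0, \qquad \zeta = \nabla^{2}\psi, \qquad \mathbf{v}_{g} = \mathbf{k}\times\nabla\psi,
\]

where \(\psi\) is the mid-tropospheric geostrophic streamfunction (e.g. \(\psi = \Phi/f_{0}\) at 500 mb), \(\zeta\) the relative vorticity and \(f\) the Coriolis parameter; conserving the absolute vorticity \(\zeta + f\) following the flow is what the earliest one-level models integrated.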

Relevance:

100.00%

Publisher:

Abstract:

In The Conduct of Inquiry in International Relations, Patrick Jackson situates methodologies in International Relations in relation to their underlying philosophical assumptions. One of his aims is to map International Relations debates in a way that ‘capture[s] current controversies’ (p. 40). This ambition is overstated: whilst Jackson’s typology is useful as a clarificatory tool, (re)classifying existing scholarship in International Relations is more problematic. One problem with Jackson’s approach is that he tends to run together the philosophical assumptions which decisively differentiate his methodologies (by stipulating a distinctive warrant for knowledge claims) and the explanatory strategies that are employed to generate such knowledge claims, suggesting that the latter are entailed by the former. In fact, the explanatory strategies which Jackson associates with each methodology reflect conventional practice in International Relations just as much as they reflect philosophical assumptions. This makes it more difficult to identify each methodology at work than Jackson implies. I illustrate this point through a critical analysis of Jackson’s controversial reclassification of Waltz as an analyticist, showing that whilst Jackson’s typology helps to expose inconsistencies in Waltz’s approach, it does not fully support the proposed reclassification. The conventional aspect of methodologies in International Relations also raises questions about the limits of Jackson’s ‘engaged pluralism’.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents recent research into the functions and value of sketch outputs during computer-supported collaborative design. Sketches made primarily using whiteboard technology are shown to support subjects engaged in remote collaborative design, particularly when constructed in ‘near-synchronous’ communication. The authors define near-synchronous communication and speculate that it is compatible with the reflective and iterative nature of design activity. There appear to be significant similarities between the making of sketches in near-synchronous remote collaborative design and those made on paper in more traditional face-to-face settings. With the current increase in the use of computer-supported collaborative working (CSCW) in undergraduate and postgraduate design education, it is proposed that sketches and sketching can make important contributions to design learning in this context.

Relevance:

100.00%

Publisher:

Abstract:

We present results on the growth of damage in 29 fatigue tests of human femoral cortical bone from four individuals, aged 53–79. In these tests we examine the interdependency of stress, cycles to failure, rate of creep strain, and rate of modulus loss. The behavior of creep rates has been reported recently for the same donors as an effect of stress and cycles (Cotton, J. R., Zioupos, P., Winwood, K., and Taylor, M., 2003, "Analysis of Creep Strain During Tensile Fatigue of Cortical Bone," J. Biomech. 36, pp. 943–949). In the present paper we first examine how the evolution of damage (drop in modulus per cycle) is associated with the stress level or the "normalized stress" level (stress divided by specimen modulus), and the results show that the rate of modulus loss fits better as a function of normalized stress. However, we find here that even better correlations can be established between the rate of damage and either the cycles to failure or the creep rate than between any of these three measures and normalized stress. The data indicate that damage rates can be excellent predictors of fatigue life and creep strain rates in tensile fatigue of human cortical bone, for use in practical problems and computer simulations.
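As a purely hypothetical illustration of this kind of comparison (the function, variable names and synthetic numbers below are invented for the sketch and are not the paper's data or code), a power-law relation such as N_f ∝ (damage rate)^b can be assessed for a candidate predictor by a straight-line fit in log-log space:

import numpy as np

def loglog_r2(x, y):
    """R^2 of a straight-line fit in log-log space, i.e. y ≈ a * x**b."""
    lx, ly = np.log(x), np.log(y)
    slope, intercept = np.polyfit(lx, ly, 1)
    resid = ly - (intercept + slope * lx)
    return 1.0 - resid.var() / ly.var()

# Synthetic placeholder values standing in for 29 fatigue tests.
rng = np.random.default_rng(1)
damage_rate = 10.0 ** rng.uniform(-5.0, -3.0, 29)            # modulus loss per cycle
cycles_to_failure = 0.05 / damage_rate * rng.lognormal(0.0, 0.3, 29)

print("N_f vs damage rate, log-log R^2:", round(loglog_r2(damage_rate, cycles_to_failure), 3))

The same statistic computed for each candidate predictor (stress, normalized stress, damage rate, creep rate) is one simple way to rank how well each one explains fatigue life.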

Relevance:

100.00%

Publisher:

Abstract:

In this paper we consider bilinear forms of matrix polynomials and show that these polynomials can be used to construct solutions for the problems of solving systems of linear algebraic equations, matrix inversion, and finding extremal eigenvalues. An Almost Optimal Monte Carlo (MAO) algorithm for computing bilinear forms of matrix polynomials is presented. Results for the computational cost of a balanced algorithm for computing the bilinear form of a matrix power are presented, i.e., an algorithm for which the probabilistic and systematic errors are of the same order, and this cost is compared with that of a corresponding deterministic method.
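As a rough sketch of the general idea only (not the paper's MAO algorithm itself, and without its balancing of probabilistic and systematic error), the bilinear form of a matrix power, (v, A^k h) = v^T A^k h, can be estimated by random walks whose transition probabilities are proportional to |a_ij|, the choice usually called "almost optimal" in this literature:

import numpy as np

def mc_bilinear_power(A, v, h, k, n_walks=50_000, seed=None):
    """Monte Carlo estimate of v^T A^k h via weighted random walks."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    # initial index drawn proportionally to |v|; transitions proportionally to |a_ij|
    p0 = np.abs(v) / np.abs(v).sum()
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)
    total = 0.0
    for _ in range(n_walks):
        i = rng.choice(n, p=p0)
        w = np.sign(v[i]) * np.abs(v).sum()          # importance weight for the start
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            w *= A[i, j] / P[i, j]                   # unbiased reweighting of each step
            i = j
        total += w * h[i]
    return total / n_walks                           # error decays like 1/sqrt(n_walks)

# Quick check against the deterministic value on a small random matrix.
rng = np.random.default_rng(0)
A = rng.uniform(-1, 1, (5, 5)) / 5.0
v, h = rng.uniform(-1, 1, 5), rng.uniform(-1, 1, 5)
print(mc_bilinear_power(A, v, h, k=3, seed=1), v @ np.linalg.matrix_power(A, 3) @ h)

Extending the estimator from a single matrix power to a matrix polynomial amounts to summing the weighted contributions collected at each step of the walk rather than only at the final step.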