206 results for Constrained Riemann problem
Abstract:
We consider the Dirichlet boundary value problem for the Helmholtz equation in a non-locally perturbed half-plane, this problem arising in electromagnetic scattering by one-dimensional rough, perfectly conducting surfaces. We propose a new boundary integral equation formulation for this problem, utilizing the Green's function for an impedance half-plane in place of the standard fundamental solution. We show, at least for surfaces not differing too much from the flat boundary, that the integral equation is uniquely solvable in the space of bounded and continuous functions, and hence that, for a variety of incident fields including an incident plane wave, the boundary value problem for the scattered field has a unique solution satisfying the limiting absorption principle. Finally, a result of continuous dependence of the solution on the boundary shape is obtained.
Abstract:
The problem of symmetric stability is examined within the context of the direct Liapunov method. The sufficient conditions for stability derived by Fjørtoft are shown to imply finite-amplitude, normed stability. This finite-amplitude stability theorem is then used to obtain rigorous upper bounds on the saturation amplitude of disturbances to symmetrically unstable flows. By employing a virial functional, the necessary conditions for instability implied by the stability theorem are shown to be in fact sufficient for instability. The results of Ooyama are improved upon insofar as a tight two-sided (upper and lower) estimate is obtained of the growth rate of (modal or nonmodal) symmetric instabilities. The case of moist adiabatic systems is also considered.
Abstract:
The concept of slow vortical dynamics and its role in theoretical understanding is central to geophysical fluid dynamics. It leads, for example, to “potential vorticity thinking” (Hoskins et al. 1985). Mathematically, one imagines an invariant manifold within the phase space of solutions, called the slow manifold (Leith 1980; Lorenz 1980), to which the dynamics are constrained. Whether this slow manifold truly exists has been a major subject of inquiry over the past 20 years. It has become clear that an exact slow manifold is an exceptional case, restricted to steady or perhaps temporally periodic flows (Warn 1997). Thus the concept of a “fuzzy slow manifold” (Warn and Ménard 1986) has been suggested. The idea is that nearly slow dynamics will occur in a stochastic layer about the putative slow manifold. The natural question then is, how thick is this layer? In a recent paper, Ford et al. (2000) argue that Lighthill emission—the spontaneous emission of freely propagating acoustic waves by unsteady vortical flows—is applicable to the problem of balance, with the Mach number Ma replaced by the Froude number F, and that it is a fundamental mechanism for this fuzziness. They consider the rotating shallow-water equations and find emission of inertia–gravity waves at O(F²). This is rather surprising at first sight, because several studies of balanced dynamics with the rotating shallow-water equations have gone beyond second order in F, and found only an exponentially small unbalanced component (Warn and Ménard 1986; Lorenz and Krishnamurthy 1987; Bokhove and Shepherd 1996; Wirosoetisno and Shepherd 2000). We have no technical objection to the analysis of Ford et al. (2000), but wish to point out that it depends crucially on R ≫ 1, where R is the Rossby number. This condition requires the ratio of the characteristic length scale of the flow L to the Rossby deformation radius LR to go to zero in the limit F → 0.
This is the low Froude number scaling of Charney (1963), which, while originally designed for the Tropics, has been argued to be also relevant to mesoscale dynamics (Riley et al. 1981). If L/LR is fixed, however, then F → 0 implies R → 0, which is the standard quasigeostrophic scaling of Charney (1948; see, e.g., Pedlosky 1987). In this limit there is reason to expect the fuzziness of the slow manifold to be “exponentially thin,” and balance to be much more accurate than is consistent with (algebraic) Lighthill emission.
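The scaling argument in this abstract can be made explicit with the standard rotating shallow-water definitions (assumed here; the abstract does not spell them out):

```latex
% Standard rotating shallow-water scales (assumed definitions):
%   U : velocity scale,  L : length scale,  f : Coriolis parameter,
%   c = \sqrt{gH} : gravity-wave speed,  L_R = c/f : deformation radius.
F = \frac{U}{c}, \qquad R = \frac{U}{fL}
\quad\Longrightarrow\quad
\frac{F}{R} = \frac{fL}{c} = \frac{L}{L_R}.
% Hence with L/L_R held fixed, F \to 0 forces R \to 0 (the quasigeostrophic
% limit of Charney 1948), whereas R \gg 1 requires L/L_R = F/R \to 0 as
% F \to 0 (the low Froude number scaling of Charney 1963).
```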
Abstract:
This Forum challenges and problematizes the term incomplete acquisition, which has been widely used to describe the state of competence of heritage speaker (HS) bilinguals for well over a decade (see, e.g., Montrul, 2008). It is suggested and defended that HS competence, while often different from that of monolingual peers, is in fact not incomplete (given any reasonable definition of the word incomplete), but simply distinct for reasons related to the realities of their environment.
Abstract:
Communication signal processing applications often involve complex-valued (CV) functional representations for signals and systems. CV artificial neural networks have been studied theoretically and applied widely in nonlinear signal and data processing [1–11]. Note that most artificial neural networks cannot be automatically extended from the real-valued (RV) domain to the CV domain because the resulting model would in general violate the Cauchy-Riemann conditions, and this means that the training algorithms become unusable. A number of analytic functions were introduced for fully CV multilayer perceptrons (MLP) [4]. A fully CV radial basis function (RBF) network was introduced in [8] for regression and classification applications. Alternatively, the problem can be avoided by using two RV artificial neural networks, one processing the real part and the other processing the imaginary part of the CV signal/system. An even more challenging problem is the inverse of a CV
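The split approach mentioned in the abstract — two real-valued networks, one for the real part and one for the imaginary part of a complex signal — can be sketched as follows. This is a minimal illustrative sketch, not the formulation of any of the cited papers; the network sizes and initialization are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(n_in, n_hidden, n_out, rng):
    """One-hidden-layer real-valued MLP (parameters only)."""
    return {
        "W1": rng.standard_normal((n_in, n_hidden)) * 0.1,
        "b1": np.zeros(n_hidden),
        "W2": rng.standard_normal((n_hidden, n_out)) * 0.1,
        "b2": np.zeros(n_out),
    }

def mlp_forward(p, x):
    """Forward pass with tanh hidden activation."""
    h = np.tanh(x @ p["W1"] + p["b1"])
    return h @ p["W2"] + p["b2"]

def split_cv_forward(p_re, p_im, z):
    """Process a complex-valued signal with two independent real networks:
    both see (Re z, Im z) as real features; one predicts the real part of
    the output, the other the imaginary part."""
    x = np.stack([z.real, z.imag], axis=-1)   # shape (N, 2), purely real
    out_re = mlp_forward(p_re, x)
    out_im = mlp_forward(p_im, x)
    return (out_re + 1j * out_im).ravel()     # recombine into complex output

# Example: a complex tone processed sample-by-sample.
z = np.exp(1j * np.linspace(0, 2 * np.pi, 8))
p_re = make_mlp(2, 16, 1, rng)
p_im = make_mlp(2, 16, 1, rng)
w = split_cv_forward(p_re, p_im, z)
print(w.shape, w.dtype)   # complex output, one sample per input sample
```

Because each sub-network is purely real-valued, standard backpropagation applies to each independently, sidestepping the Cauchy-Riemann issue at the cost of ignoring any coupling a fully complex model would capture.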
Abstract:
A study of the use of hybrid physical appearance both to signal and to explore the disputed paternity of Alexander the Great throughout the legend's vernacular French tradition. The article compares the 'child of Babylon' portent and Alexander's son Alior in the twelfth-century French "Roman d'Alexandre" poem cycle and in a fifteenth-century prose adaptation of it.
Abstract:
Problem-Based Learning, despite recent controversies about its effectiveness, is used extensively as a teaching method throughout higher education. In meteorology, there has been little attempt to incorporate Problem-Based Learning techniques into the curriculum. Motivated by a desire to enhance the reflective engagement of students within a current field course module, this project describes the implementation of two trial Problem-Based Learning activities, which were then tested and improved using several different and complementary means of evaluation. By the end of a 2-year program of design, implementation, testing, reflection, and re-evaluation, two robust, engaging activities have been developed that provide an enhanced and diverse learning environment in the field course. The results suggest that Problem-Based Learning techniques would be a useful addition to the meteorology curriculum; suggestions for courses and activities that may benefit from this approach are included in the conclusions.
Abstract:
In a series of papers, Killworth and Blundell have proposed to study the effects of a background mean flow and topography on Rossby wave propagation by means of a generalized eigenvalue problem formulated in terms of the vertical velocity, obtained from a linearization of the primitive equations of motion. However, it has been known for a number of years that this eigenvalue problem contains an error, which Killworth was prevented from correcting himself by his unfortunate passing and whose correction is therefore taken up in this note. Here, the author shows in the context of quasigeostrophic (QG) theory that the error can ultimately be traced to the fact that the eigenvalue problem for the vertical velocity is fundamentally a nonlinear one (the eigenvalue appears both in the numerator and denominator), unlike that for the pressure. The reason that this nonlinear term is lacking in the Killworth and Blundell theory comes from neglecting the depth dependence of a depth-dependent term. This nonlinear term is shown on idealized examples to alter significantly the Rossby wave dispersion relation in the high-wavenumber regime but is otherwise irrelevant in the long-wave limit, in which case the eigenvalue problems for the vertical velocity and pressure are both linear. In the general dispersive case, however, one should first solve the generalized eigenvalue problem for the pressure vertical structure and, if needed, diagnose the vertical velocity vertical structure from the latter.
Abstract:
The question of what explains variation in expenditures on Active Labour Market Programs (ALMPs) has attracted significant scholarship in recent years. Significant insights have been gained with respect to the role of employers, unions and dual labour markets, openness, and partisanship. However, there remain significant disagreements with respect to key explanatory variables such as the role of unions or the impact of partisanship. Qualitative studies have shown that there are both good conceptual reasons and historical evidence that different ALMPs are driven by different dynamics. There is little reason to believe that vastly different programs such as training and employment subsidies are driven by similar structural, interest group, or indeed partisan dynamics. The question is therefore whether different ALMPs have the same relationship to the key explanatory variables identified in the literature. Using regression analysis, this paper shows that the explanatory variables identified by the literature relate differently to distinct ALMPs. This refinement adds significant analytical value and shows that disagreements are at least partly due to a dependent variable problem of 'over-aggregation'.
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and being replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational risk, and ultimately reputational risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk of its obsolescence, in which the structure of a business and its supporting IT resources can be captured, modelled, and analysed, and the risk to the business of technology obsolescence identified, enabling remedial action using qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in three consulting studies carried out by Capgemini involving four UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and being replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational risk, and ultimately reputational risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk of its obsolescence, in which the structure of a business and its supporting IT resources can be captured, modelled, and analysed, and the risk to the business of technology obsolescence identified, enabling remedial action using qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in two consulting studies carried out by Capgemini involving three UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
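The abstracts above mention a heatmap algorithm for highlighting high-risk obsolescent elements but do not describe it. The following is a purely hypothetical sketch of that kind of scoring, not the paper's actual method: the component fields, the score formula, and the heat thresholds are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    years_past_support: float  # years since vendor support ended (0 = still supported)
    business_criticality: int  # assumed 1 (low) .. 5 (high) scale

def risk_score(c: Component) -> float:
    """Illustrative score: criticality weighted by time out of support (capped)."""
    return c.business_criticality * min(c.years_past_support, 5.0)

def heat_level(score: float) -> str:
    """Bucket a score into a traffic-light heatmap level (thresholds assumed)."""
    if score >= 15:
        return "red"
    if score >= 5:
        return "amber"
    return "green"

# Hypothetical IT estate captured from an enterprise architecture model.
estate = [
    Component("legacy-crm", years_past_support=4, business_criticality=5),
    Component("hr-portal", years_past_support=1, business_criticality=3),
    Component("intranet-wiki", years_past_support=0, business_criticality=2),
]

for c in estate:
    s = risk_score(c)
    print(f"{c.name}: score={s:.1f} level={heat_level(s)}")
```

The point of such a scheme is that once components and their business links are in a model, a simple per-element score can be rolled up and colour-coded to direct scarce remediation budget at the red cells first.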
Abstract:
The response of the Southern Ocean to a repeating seasonal cycle of ozone loss is studied in two coupled climate models and found to comprise both fast and slow processes. The fast response is similar to the inter-annual signature of the Southern Annular Mode (SAM) on Sea Surface Temperature (SST), on to which the ozone-hole forcing projects in the summer. It comprises enhanced northward Ekman drift inducing negative summertime SST anomalies around Antarctica, earlier sea ice freeze-up the following winter, and northward expansion of the sea ice edge year-round. The enhanced northward Ekman drift, however, results in upwelling of warm waters from below the mixed layer in the region of seasonal sea ice. With sustained bursts of westerly winds induced by ozone depletion, this warming from below eventually dominates over the cooling from anomalous Ekman drift. The resulting slow-timescale response (years to decades) leads to warming of SSTs around Antarctica and ultimately a reduction in sea-ice cover year-round. This two-timescale behavior - rapid cooling followed by slow but persistent warming - is found in the two coupled models analysed, one with an idealized geometry, the other a complex global climate model with realistic geometry. Processes that control the timescale of the transition from cooling to warming, and their uncertainties, are described. Finally, we discuss the implications of our results for rationalizing previous studies of the effect of the ozone hole on SST and sea-ice extent.