840 results for Problems and potentials
Abstract:
In this paper we are mainly concerned with the development of efficient computer models capable of accurately predicting the propagation of low-to-middle frequency sound in the sea, in axially symmetric (2D) and in fully 3D environments. The major physical features of the problem, i.e. a variable bottom topography, elastic properties of the subbottom structure, volume attenuation and other range inhomogeneities, are efficiently treated. The computer models presented are based on normal mode solutions of the Helmholtz equation on the one hand, and on various types of numerical schemes for parabolic approximations of the Helmholtz equation on the other. A new coupled mode code is introduced to model sound propagation in range-dependent ocean environments with variable bottom topography, where the effects of an elastic bottom, volume attenuation, and surface and bottom roughness are taken into account. New computer models based on finite difference and finite element techniques for the numerical solution of parabolic approximations are also presented. They include efficient modeling of the bottom influence via impedance boundary conditions and cover wide-angle propagation, elastic bottom effects, variable bottom topography and reverberation effects. All the models are validated on several benchmark problems and against experimental data. The results thus obtained were compared with analogous results from standard codes in the literature.
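As background to the parabolic-approximation models mentioned above, the following is the standard narrow-angle (Tappert-type) derivation in generic notation ($k_0$, $n$, $\psi$ are ours); the codes described in the abstract use wide-angle variants and impedance boundary conditions, so this is an illustrative sketch only:

```latex
% Helmholtz equation for the acoustic pressure p(r,z) in cylindrical
% coordinates with azimuthal symmetry, reference wavenumber k_0 and
% index of refraction n(r,z):
\[
  \frac{1}{r}\frac{\partial}{\partial r}\Bigl(r\,\frac{\partial p}{\partial r}\Bigr)
  + \frac{\partial^2 p}{\partial z^2} + k_0^2\, n^2(r,z)\, p = 0 .
\]
% Factoring out an outgoing cylindrical wave,
%   p(r,z) = \psi(r,z)\, e^{i k_0 r} / \sqrt{r},
% and neglecting \partial^2\psi/\partial r^2 (the paraxial assumption)
% yields the narrow-angle parabolic equation
\[
  2 i k_0 \frac{\partial \psi}{\partial r}
  + \frac{\partial^2 \psi}{\partial z^2}
  + k_0^2\bigl(n^2(r,z) - 1\bigr)\,\psi = 0 ,
\]
% which is first order in range r and can be marched outward by finite
% differences or finite elements -- the discretisations the abstract names.
```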
Abstract:
One of the aims of a broad ethnographic study into how the apportionment of risk influences the pricing levels of contractors was to ascertain the significant risks affecting contractors in Ghana, and their impact on prices. To do this, in the context of contractors, the difference between expected and realized return on a project is the key dependent variable, examined using documentary analyses and semi-structured interviews. Most work in this area has focused on identifying and prioritising risks using relative importance indices generated from the analysis of questionnaire survey responses. However, this approach may be argued to capture perceptions rather than direct measures of project risk. Here, instead, project risk is investigated by examining two measures of the same quantity: one ‘before’ and one ‘after’ construction of a project has taken place. Risk events are identified by ascertaining the independent variables causing deviations between expected and actual rates of return. Risk impact is then measured by ascertaining additions or reductions to expected costs due to the occurrence of risk events. So far, data from eight substantially complete building projects indicate that consultants’ inefficiency, payment delays, subcontractor-related problems and changes in macroeconomic factors are significant risks affecting contractors in Ghana.
Abstract:
We explore the contribution of socio-technical networks approaches to construction management research. These approaches are distinctive for their analysis of actors and objects as mutually constituted within socio-technical networks. They raise questions about the ways in which the content, meaning and use of technology are negotiated in practice, how particular technical configurations are elaborated in response to specific problems and why certain paths or solutions are adopted rather than others. We illustrate this general approach with three case studies: a historical study of the development of reinforced concrete in France, the UK and the US, the recent introduction of 3D-CAD software into four firms and an analysis of the uptake of environmental assessment technologies in the UK since 1990. In each we draw out the ways in which various technologies shaped and were shaped by different socio-technical networks. We conclude with a reflection on the contributions of socio-technical network analysis for more general issues including the study of innovation and analyses of context and power.
Abstract:
The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. 
Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.
Abstract:
A technique is derived for solving a non-linear optimal control problem by iterating on a sequence of simplified problems in linear quadratic form. The technique is designed to achieve the correct solution of the original non-linear optimal control problem in spite of these simplifications. A mixed approach with a discrete performance index and continuous state variable system description is used as the basis of the design, and it is shown how the global problem can be decomposed into local sub-system problems and a co-ordinator within a hierarchical framework. An analysis of the optimality and convergence properties of the algorithm is presented and the effectiveness of the technique is demonstrated using a simulation example with a non-separable performance index.
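A minimal sketch of the kind of iteration described above, in notation of our own (the paper's hierarchical decomposition into sub-system problems and a co-ordinator, and its mixed discrete/continuous formulation, add further structure not shown here):

```latex
% Original problem: minimise a generally non-quadratic performance index
%   J(u) = \sum_i \phi\bigl(x(t_i), u(t_i)\bigr)
% subject to non-linear dynamics \dot{x} = f(x, u).
%
% At iteration k, linearise about the current trajectory (x^k, u^k)
% and solve the linear quadratic subproblem
\[
  \min_{\delta u}\; \sum_i \tfrac12\Bigl(\delta x_i^{\top} Q_i\,\delta x_i
      + \delta u_i^{\top} R_i\,\delta u_i\Bigr)
  \quad\text{s.t.}\quad
  \dot{\delta x} = A(t)\,\delta x + B(t)\,\delta u ,
\]
\[
  A(t) = \frac{\partial f}{\partial x}\Big|_{(x^k,\,u^k)}, \qquad
  B(t) = \frac{\partial f}{\partial u}\Big|_{(x^k,\,u^k)} .
\]
% The LQ subproblem is solved exactly (e.g. via a Riccati recursion), the
% trajectory is updated to (x^{k+1}, u^{k+1}), and correction terms carried
% between iterations ensure that the fixed point of the iteration satisfies
% the optimality conditions of the original non-linear problem despite
% the simplifications.
```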
Abstract:
The Arctic has undergone substantial changes over the last few decades in various cryospheric and derivative systems and processes. Of these, the Arctic sea ice regime has seen some of the most rapid change and is one of the most visible markers of Arctic change outside the scientific community. This has drawn considerable attention not only from the natural sciences, but increasingly, from the political and commercial sectors as they begin to grapple with the problems and opportunities that are being presented. The possible impacts of past and projected changes in Arctic sea ice, especially as it relates to climatic response, are of particular interest and have been the subject of increasing research activity. A review of the current knowledge of the role of sea ice in the climate system is therefore timely. We present a review that examines both the current state of understanding, as regards the impacts of sea-ice loss observed to date, and climate model projections, to highlight hypothesised future changes and impacts on storm tracks and the North Atlantic Oscillation. Within the broad climate-system perspective, the topics of storminess and large-scale variability will be specifically considered. We then consider larger-scale impacts on the climatic system by reviewing studies that have focused on the interaction between sea-ice extent and the North Atlantic Oscillation. Finally, an overview of the representation of these topics in the literature in the context of IPCC climate projections is presented. While most agree on the direction of Arctic sea-ice change, the rates amongst the various projections vary greatly. Similarly, the response of storm tracks and climate variability are uncertain, exacerbated possibly by the influence of other factors. A variety of scientific papers on the relationship between sea-ice changes and atmospheric variability have brought to light important aspects of this complex topic. 
Examples are an overall reduction in the number of Arctic winter storms, a northward shift of mid-latitude winter storms in the Pacific and a delayed negative NAO-like response in autumn/winter to a reduced Arctic sea-ice cover (at least in some months). This review paper discusses this research and the disagreements, bringing about a fresh perspective on this issue.
Abstract:
Background: Biases in the interpretation of ambiguous material are central to cognitive models of anxiety; however, understanding of the association between interpretation and anxiety in childhood is limited. To address this, a prospective investigation of the stability and specificity of anxious cognitions and anxiety and the relationship between these factors was conducted. Method: Sixty-five children (10–11 years) from a community sample completed measures of self-reported anxiety, depression, and conduct problems, and responded to ambiguous stories at three time points over one-year. Results: Individual differences in biases in interpretation of ambiguity (specifically “anticipated distress” and “threat interpretation”) were stable over time. Furthermore, anticipated distress and threat interpretation were specifically associated with anxiety symptoms. Distress anticipation predicted change in anxiety symptoms over time. In contrast, anxiety scores predicted change in threat interpretation over time. Conclusions: The results suggest that different cognitive constructs may show different longitudinal links with anxiety. These preliminary findings extend research and theory on anxious cognitions and their link with anxiety in children, and suggest that these cognitive processes may be valuable targets for assessment and intervention.
Abstract:
This report describes the analysis and development of novel tools for the global optimisation of relevant mission design problems. A taxonomy was created for mission design problems, and an empirical analysis of their optimisational complexity performed - it was demonstrated that the use of global optimisation was necessary on most classes and informed the selection of appropriate global algorithms. The selected algorithms were then applied to the different problem classes: Differential Evolution was found to be the most efficient. Considering the specific problem of multiple gravity assist trajectory design, a search space pruning algorithm was developed that displays both polynomial time and space complexity. Empirically, this was shown to typically achieve search space reductions of greater than six orders of magnitude, thus reducing significantly the complexity of the subsequent optimisation. The algorithm was fully implemented in a software package that allows simple visualisation of high-dimensional search spaces, and effective optimisation over the reduced search bounds.
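The abstract singles out Differential Evolution as the most efficient algorithm. As a hedged illustration of DE itself, here is the classic DE/rand/1/bin scheme on a toy objective; this is our own minimal sketch, not the mission-design software the report describes, and the sphere function merely stands in for a far more expensive trajectory cost:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin minimiser (illustrative sketch only)."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Random initial population within the search bounds.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals, none equal to the target i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])  # mutation
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to bounds
                else:
                    v = pop[i][j]  # inherit from target (crossover)
                trial.append(v)
            ft = f(trial)
            if ft <= fitness[i]:  # greedy selection
                pop[i], fitness[i] = trial, ft
    best = min(range(pop_size), key=fitness.__getitem__)
    return pop[best], fitness[best]

# Toy objective standing in for a trajectory cost.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = differential_evolution(sphere, [(-5.0, 5.0)] * 3)
```

Real mission-design objectives are multimodal and expensive, which is precisely why the report prunes the search space before running the optimiser.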
Abstract:
Ethnopharmacological relevance: Studies on traditional Chinese medicine (TCM), like those of other systems of traditional medicine (TM), are very variable in their quality, content and focus, resulting in issues around their acceptability to the global scientific community. In an attempt to address these issues, a European Union-funded FP7 consortium, composed of both Chinese and European scientists and named “Good practice in traditional Chinese medicine” (GP-TCM), has devised a series of guidelines and technical notes to facilitate good practice in collecting, assessing and publishing TCM literature as well as highlighting the scope of information that should be in future publications on TMs. This paper summarises these guidelines, together with what has been learned through GP-TCM collaborations, focusing on some common problems and proposing solutions. The recommendations also provide a template for the evaluation of other types of traditional medicine such as Ayurveda, Kampo and Unani. Materials and methods: GP-TCM provided a means by which experts in different areas relating to TCM were able to collaborate in forming a literature review good practice panel which operated through e-mail exchanges, teleconferences and focused discussions at annual meetings. The panel involved coordinators and representatives of each GP-TCM work package (WP), with the latter managing the testing and refining of such guidelines within the context of their respective WPs and providing feedback. Results: A Good Practice Handbook for Scientific Publications on TCM was drafted during the three years of the consortium, showing the value of such networks. A “deliverable – central questions – labour division” model had been established to guide the literature evaluation studies of each WP. 
The model investigated various scoring systems and their ability to provide consistent and reliable semi-quantitative assessments of the literature, notably in respect of the botanical ingredients involved and the scientific quality of the work described. This resulted in the compilation of (i) a robust scoring system and (ii) a set of minimum standards for publishing in the herbal medicines field, based on an analysis of the main problems identified in published TCM literature.
Abstract:
The current state of the art and direction of research in computer vision aimed at automating the analysis of CCTV images is presented. This includes low-level identification of objects within the field of view of cameras, following those objects over time and between cameras, and the interpretation of those objects’ appearance and movements with respect to models of behaviour (and hence the inference of intentions). The potential ethical problems (and some potential opportunities) such developments may pose if and when deployed in the real world are presented, and suggestions are made as to the new regulations that will be needed if such systems are not to further enhance the power of the surveillers against the surveilled.
Abstract:
The first ECMWF Seminar in 1975 (ECMWF, 1975) considered the scientific foundation of medium range weather forecasts. It may be of interest as a part of this lecture, to review some of the ideas and opinions expressed during this seminar.
Abstract:
We consider an equilibrium birth and death type process for a particle system in infinite volume; the latter is described by the space of all locally finite point configurations on Rd. These Glauber type dynamics are Markov processes constructed for pre-given reversible measures. Representations for the “carré du champ” and “second carré du champ” of the associated infinitesimal generator L are calculated in infinite volume and for a large class of functions in a generalized sense. The corresponding coercivity identity is derived, and explicit sufficient conditions for the appearance of a spectral gap of L, together with bounds on its size, are given. These techniques are applied to Glauber dynamics associated with Gibbs measures, and conditions are derived that extend all previously known results; in particular, potentials with negative parts can now be treated. The high temperature regime is extended substantially, and potentials with a non-trivial negative part can be included. Furthermore, a special class of potentials is defined for which the size of the spectral gap is at least as large as for the free system and, surprisingly, the spectral gap is independent of the activity. Potentials of this type should not show any phase transition for a given temperature at any activity.
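For reference, the standard definitions behind the terminology, in generic notation of our own (the paper establishes these objects in infinite volume and for a generalized class of functions):

```latex
% Carré du champ and second carré du champ of a generator L:
\[
  \Gamma(F,G) = \tfrac12\bigl(L(FG) - F\,LG - G\,LF\bigr), \qquad
  \Gamma_2(F,G) = \tfrac12\bigl(L\,\Gamma(F,G) - \Gamma(F,LG) - \Gamma(LF,G)\bigr).
\]
% For a reversible measure \mu, a coercivity identity takes the form
\[
  \int (LF)^2 \, d\mu \;=\; \int \Gamma_2(F,F)\, d\mu ,
\]
% so a coercivity estimate
%   \int \Gamma_2(F,F)\, d\mu \;\ge\; c \int \Gamma(F,F)\, d\mu
% yields \int (LF)^2 d\mu \ge c\,\mathcal{E}(F,F) and hence a spectral
% gap of L of size at least c.
```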