919 results for Coded aperture


Relevance:

10.00%

Publisher:

Abstract:

An important part of strategic planning’s purpose should be to attempt to forecast the future, not simply to respond belatedly to events or to accept the future as inevitable. This paper puts forward a conceptual approach for seeking to achieve these aims and uses the Bournemouth and Poole area in Dorset as a vehicle for applying the basic methodology. The area has been chosen because of the significant issues it currently faces in planning terms, and its future development possibilities. In order that alternative future choices for the area – different ‘developmental trajectories’ – can be evaluated, they must be carefully and logically constructed. Four Futures for Bournemouth/Poole have been put forward; they are titled and colour-coded. Future One is Maximising Growth – Golden Prospect, which seeks to achieve the highest level of economic prosperity for the area. Future Two is Incremental Growth – Solid Silver, which attempts to facilitate a steady, continuing, controlled pattern of development for the area. Future Three is Steady State – Cobalt Blue, which suggests that people in the area could be more concerned with preserving their quality of life, in terms of their leisure and recreation, than with increasing wealth. Future Four is Environment First – Jade Green, which makes the area’s environmental protection its top priority, even at the possible expense of economic prosperity. The scenarios proposed here are not sacrosanct. Nor are they confined to the Bournemouth and Poole area: in theory, suitably modified, they could be used in a variety of different contexts. Consideration of the scenarios – wherever located – might then generate other, additional scenarios. These are called hybrids, alloys and amalgams. Likewise it might identify some of them as inappropriate or impossible.
Most likely, careful consideration of the scenarios will suggest hybrid scenarios, in which features from different scenarios are combined to produce alternative or additional futures for consideration. The real issue then becomes how best to fashion such a future for the particular area under consideration.

Relevance:

10.00%

Publisher:

Abstract:

This article surveys the fiercely contested posthumous assessments of John Stuart Mill in the newspaper and periodical press in the months following his death in May 1873, and elicits the broader intellectual context. Judgements made in the immediate wake of Mill's death influence biographers and historians to this day and provide an illuminating aperture into the politics and shifting ideological forces of the period. The article considers how Mill's failure to control his posthumous reputation demonstrates both the inextricable intertwining of politics and character in the 1870s and the difficulties his allies faced. In particular, it shows the sharp division between Mill's middle- and working-class admirers; the use of James Mill's name as a rebuke to his son; the redefinition of Malthusianism in the 1870s; and how publication of Mill's Autobiography damaged his reputation. Finally, the article considers the relative absence of both theological and Darwinian critiques of Mill.

Relevance:

10.00%

Publisher:

Abstract:

Coded orthogonal frequency division multiplexing (COFDM) has existed for many years, but it was not until 1997 that the European Telecommunications Standards Institute proposed its use for the transmission of digital television over a terrestrial channel. To date, an assumption has been made concerning the resilience of COFDM in a multipath environment. This paper discusses this assumption, gives results from a DVB-T-compliant simulation and assesses the validity of the assumption in light of the results obtained.
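The multipath resilience usually attributed to COFDM rests on the cyclic prefix: it turns the channel's linear convolution into a circular one, so each subcarrier can be equalised by a single complex division. The following is a minimal pure-Python sketch of that mechanism only – a toy 8-subcarrier symbol and a hypothetical two-tap channel, with none of the coding, interleaving or pilot structure that actual DVB-T COFDM adds:

```python
import cmath

def dft(x):
    """Naive DFT; adequate for a toy example."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# QPSK-like symbols on N = 8 subcarriers (invented toy data)
N, CP = 8, 3
data = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j] * 2

# Transmitter: IDFT, then prepend a cyclic prefix longer than the channel
body = idft(data)
tx = body[-CP:] + body

# Two-tap multipath channel: direct path plus a half-amplitude echo
h = [1.0, 0.5]
rx = [sum(h[m] * tx[n - m] for m in range(len(h)) if n - m >= 0)
      for n in range(len(tx))]

# Receiver: drop the prefix; the linear convolution now looks circular,
# so each subcarrier is recovered by one division by the channel response
H = dft(h + [0.0] * (N - len(h)))           # channel frequency response
est = [Rk / Hk for Rk, Hk in zip(dft(rx[CP:CP + N]), H)]
```

Because the prefix (3 samples) exceeds the channel memory (1 sample), `est` matches `data` to floating-point precision, which is the idealised behaviour whose limits the paper's simulation examines.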

Relevance:

10.00%

Publisher:

Abstract:

This paper describes a simulation program that uses Trengenza’s average room illuminance method, in conjunction with hourly solar irradiance and luminous efficacy, to predict the potential lighting energy saving for a side-lit room. Two lighting control algorithms, photoelectric switching (on/off) and photoelectric dimming (top-up), have been coded in the program. A simulation for a typical UK office room has been conducted, and the results show that the energy saving due to sunlight depends on various factors such as orientation, control method, building depth, glazing area and shading type. This simple tool can be used to estimate the potential lighting energy saving of windows with various shading devices at the early design stage.
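The two control algorithms can be sketched in a few lines. The function names, the 500 lx setpoint and the 10% minimum dimming level below are illustrative assumptions, not details taken from the paper:

```python
def on_off_control(daylight_lux, setpoint_lux, lamp_power_w):
    """Photoelectric switching: lamps are fully on or fully off,
    depending on whether daylight alone meets the design illuminance."""
    return 0.0 if daylight_lux >= setpoint_lux else lamp_power_w

def top_up_control(daylight_lux, setpoint_lux, lamp_power_w, min_frac=0.1):
    """Photoelectric dimming: electric light tops up daylight to the
    setpoint, subject to an assumed 10% minimum dimming level."""
    shortfall = max(0.0, 1.0 - daylight_lux / setpoint_lux)
    return lamp_power_w * max(shortfall, min_frac)

# With 300 lx of daylight against a 500 lx setpoint, the dimming system
# draws only the 40% shortfall while on/off control still draws full power.
dim_w = top_up_control(300.0, 500.0, 100.0)
switch_w = on_off_control(300.0, 500.0, 100.0)
```

Run hour by hour against daylight illuminances derived from irradiance and luminous efficacy, the difference between the two functions is what produces the control-method sensitivity the results describe.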

Relevance:

10.00%

Publisher:

Abstract:

Cladistic analyses begin with an assessment of variation for a group of organisms and the subsequent representation of that variation as a data matrix. The step of converting observed organismal variation into a data matrix has been considered subjective, contentious, under-investigated, imprecise, unquantifiable, intuitive and a black box – and at the same time ultimately the most influential phase of any cladistic analysis (Pimentel and Riggins, 1987; Bryant, 1989; Pogue and Mickevich, 1990; de Pinna, 1991; Stevens, 1991; Bateman et al., 1992; Smith, 1994; Pleijel, 1995; Wilkinson, 1995; Patterson and Johnson, 1997). Despite the concerns of these authors, primary homology assessment is often perceived as reproducible. In a recent paper, Hawkins et al. (1997) reiterated two points made by a number of these authors: that different interpretations of characters and coding are possible, and that different workers will perceive and define characters in different ways. One reviewer challenged us: did we really think that two people working on the same group would come up with different data sets? The conflicting views regarding the reproducibility of the cladistic character matrix provoke a number of questions. Do the majority of workers consistently follow the same guidelines? Has the theoretical framework informing primary homology assessment been adequately explored? The objective of this study is to classify approaches to primary homology assessment, and to quantify the extent to which different approaches are found in the literature, by examining variation in the way characters are defined and coded in a data matrix.

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: Since its introduction in 2006, messages posted to the microblogging system Twitter have provided a rich dataset for researchers, leading to the publication of over a thousand academic papers. This paper aims to identify this published work and to classify it in order to understand Twitter-based research. DESIGN/METHODOLOGY/APPROACH: Firstly, the papers on Twitter were identified. Secondly, following a review of the literature, a classification of the dimensions of microblogging research was established. Thirdly, papers were qualitatively classified using open-coded content analysis, based on each paper’s title and abstract, in order to analyze method, subject, and approach. FINDINGS: The majority of published work relating to Twitter concentrates on aspects of the messages sent and details of the users. A variety of methodological approaches are used across a range of identified domains. RESEARCH LIMITATIONS/IMPLICATIONS: This work reviewed the abstracts of all papers available via database search on the term “Twitter”, which has two major implications: 1) the full papers were not considered, so works may be misclassified if their abstracts are unclear; 2) publications not indexed by the databases, such as book chapters, are not included. ORIGINALITY/VALUE: To date there has been no overarching study of the methods and purposes of those using Twitter as a research subject. Our major contribution is to scope out the papers published on Twitter up to the close of 2011. The classification derived here will provide a framework within which researchers studying Twitter-related topics will be able to position and ground their work.

Relevance:

10.00%

Publisher:

Abstract:

The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. The method is independent of the boundary condition and is applicable to limited-aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data are collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches to sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method in the rough-surface scattering case and provide numerical simulations and examples.
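As a sketch of the causality principle the method exploits, the scattered field of a sound-soft obstacle can be written as a retarded single-layer potential over the boundary Γ (this is the standard form of such representations; the notation here is generic and not necessarily the paper's):

```latex
% Density \varphi on the boundary \Gamma, wave speed c: a boundary point y
% can influence the field at x only after the travel time |x-y|/c.
u^{s}(x,t) \;=\; \int_{\Gamma} \frac{\varphi\bigl(y,\; t - |x-y|/c\bigr)}{4\pi\,|x-y|}\, \mathrm{d}s(y),
\qquad x \in \mathbb{R}^{3} \setminus \Gamma .
```

The retardation t − |x−y|/c is precisely the causal structure that lets time-domain pulse responses carry information about the location of the unknown boundary.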

Relevance:

10.00%

Publisher:

Abstract:

Very high-resolution Synthetic Aperture Radar sensors represent an alternative to aerial photography for delineating floods in built-up environments where flood risk is highest. However, even with currently available SAR image resolutions of 3 m and higher, signal returns from man-made structures hamper the accurate mapping of flooded areas. Enhanced image processing algorithms and a better exploitation of image archives are required to facilitate the use of microwave remote sensing data for monitoring flood dynamics in urban areas. In this study a hybrid methodology combining radiometric thresholding, region growing and change detection is introduced as an approach enabling the automated, objective and reliable extraction of flood extent from very high-resolution urban SAR images. The method is based on the calibration of a statistical distribution of “open water” backscatter values inferred from SAR images of floods. SAR images acquired during dry conditions enable the identification of areas i) that are not “visible” to the sensor (i.e. regions affected by ‘layover’ and ‘shadow’) and ii) that systematically behave as specular reflectors (e.g. smooth tarmac, permanent water bodies). Change detection with respect to a pre- or post-flood reference image thereby reduces over-detection of inundated areas. A case study of the July 2007 Severn River flood (UK), observed by the very high-resolution SAR sensor on board TerraSAR-X as well as by airborne photography, highlights advantages and limitations of the proposed method. We conclude that even though the fully automated SAR-based flood mapping technique overcomes some limitations of previous methods, further technological and methodological improvements are necessary for SAR-based flood detection in urban areas to match the flood mapping capability of high quality aerial photography.
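The thresholding / region-growing / change-detection pipeline can be illustrated on a toy grid. The dB thresholds and the 4×4 "images" below are invented for illustration and are not calibrated values from the study:

```python
from collections import deque

def region_grow(img, seed_thr, grow_thr):
    """Seeded region growing: start from confident water pixels (at or
    below seed_thr) and grow into 4-neighbours at or below grow_thr."""
    rows, cols = len(img), len(img[0])
    water = [[False] * cols for _ in range(rows)]
    queue = deque((r, c) for r in range(rows) for c in range(cols)
                  if img[r][c] <= seed_thr)
    for r, c in queue:
        water[r][c] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols and not water[nr][nc]
                    and img[nr][nc] <= grow_thr):
                water[nr][nc] = True
                queue.append((nr, nc))
    return water

# Hypothetical 4x4 backscatter grids in dB; low values = smooth open water
flood_img = [[-20, -17, -5, -4],
             [-19, -16, -6, -3],
             [ -7,  -6, -5, -21],
             [ -8,  -7, -6, -20]]
dry_img   = [[ -6,  -5, -5, -4],
             [ -7,  -6, -6, -3],
             [ -7,  -6, -5, -21],
             [ -8,  -7, -6, -20]]

SEED_DB, GROW_DB = -18.0, -15.0          # assumed "open water" calibration
flood_water = region_grow(flood_img, SEED_DB, GROW_DB)
dry_water = region_grow(dry_img, SEED_DB, GROW_DB)

# Change detection: discard pixels that look like water even when dry
# (permanent water bodies, smooth tarmac, layover/shadow artefacts)
flood_extent = [[flood_water[r][c] and not dry_water[r][c]
                 for c in range(4)] for r in range(4)]
```

Here the low-backscatter patch in the bottom-right corner appears in both images, so the change-detection step correctly classifies it as permanent water rather than inundation.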

Relevance:

10.00%

Publisher:

Abstract:

Robotic multiwell planar patch-clamp has become common in drug development and safety programs because it enables efficient and systematic testing of compounds against ion channels during voltage-clamp. It has not, however, been adopted significantly in other important areas of ion channel research, where conventional patch-clamp remains the favored method. Here, we show the wider potential of the multiwell approach with the ability for efficient intracellular solution exchange, describing protocols and success rates for recording from a range of native and primary mammalian cells derived from blood vessels, arthritic joints and the immune and central nervous systems. The protocol involves preparing a suspension of single cells to be dispensed robotically into 4-8 microfluidic chambers each containing a glass chip with a small aperture. Under automated control, giga-seals and whole-cell access are achieved followed by preprogrammed routines of voltage paradigms and fast extracellular or intracellular solution exchange. Recording from 48 chambers usually takes 1-6 h depending on the experimental design and yields 16-33 cell recordings.

Relevance:

10.00%

Publisher:

Abstract:

'The Prophetic Sound: a day and night of noise cabaret' was the first event hosted by Agency of Noise. This all-day event brought together artists and academics whose subject of focus is noise (in creative practice). Artists from across the UK were invited to consider a future post-digital era in which everything with a microchip has malfunctioned, as a thought exercise and starting point for a response through sound. In response to Jacques Attali’s claim that music is prophecy, The Prophetic Sound asks us to consider whether noise can communicate in an unbridled, unfiltered way that is somehow not culturally coded – before it becomes sound that is recognised, refined, manipulated and exploited for musical or other cultured purposes. Featuring students from Reading, Brighton, LCC and Goldsmiths alongside more established artists and academics from across the UK, the event brings into focus locations where pattern, timbre, pitch, organisation and sequencing of sounds become distinguishable from noise, and asks us to consider, through diversion within such locations, new origins for future communication systems. The Prophetic Sound included talks, films, presentations and performances from: Ryo Ikeshiro / Inigo Wilkins / Neal Spowage / Dane Sutherland / Poulomi Desai / Benedict Drew / AAS / Polly Fibre / Steven Dickie. As part of The Prophetic Sound, POLLYFIBRE (Ellison, C.) performed LIVE RECORDING with amplified scissors. This industrial activity by POLLYFIBRE short-circuits the complicated chain that is music production. The distinctive roles of consumer, producer, composer and performer collapse in a series of live ‘cuts’ in which vinyl discs are produced with amplified scissors. Production happens through action, and action becomes production. A limited edition of 9 flexi discs was produced and made available to collectors at the event.

Relevance:

10.00%

Publisher:

Abstract:

Purpose – This paper aims to explore the nature of the emerging discourse of private climate change reporting, which takes place in one-on-one meetings between institutional investors and their investee companies. Design/methodology/approach – Semi-structured interviews were conducted with representatives of 20 UK investment institutions to generate data, which were then coded and analysed using an interpretive methodological approach, supplemented by explorative analysis using NVivo software, in order to build a picture of the emerging discourse of private climate change reporting. Findings – The authors find that private climate change reporting is dominated by a discourse of risk and risk management. This emerging risk discourse derives from institutional investors' belief that climate change represents a material risk, that it is the most salient sustainability issue, and that their clients require them to manage climate change-related risk within their portfolio investment. It is found that institutional investors are using the private reporting process to compensate for the acknowledged inadequacies of public climate change reporting. Contrary to evidence indicating corporate capture of public sustainability reporting, these findings suggest that the emerging private climate change reporting discourse is being captured by the institutional investment community. There is also evidence of an emerging discourse of opportunity in private climate change reporting, as the institutional investors are increasingly aware of a range of ways in which climate change presents material opportunities for their investee companies to exploit. Lastly, the authors find an absence of any ethical discourse, such that private climate change reporting reinforces rather than challenges the “business case” status quo. Originality/value – Although there is a wealth of sustainability reporting research, there is no academic research on private climate change reporting.
This paper attempts to fill that gap by providing rich interview evidence regarding the nature of the emerging private climate change reporting discourse.

Relevance:

10.00%

Publisher:

Abstract:

Purpose – The purpose of the research was to discover the process of social and environmental report assurance (SERA) and thereby evaluate the benefits, extent of stakeholder inclusivity and/or managerial capture of SERA processes and the dynamics of SERA as it matures. Design/methodology/approach – This paper used semi-structured interviews with 20 accountant and consultant assurors to derive data, which were then coded and analysed, resulting in the identification of four themes. Findings – This paper provides interview evidence on the process of SERA, suggesting that, although there is still managerial capture of SERA, stakeholders are being increasingly included in the process as it matures. SERA is beginning to provide dual-pronged benefits, adding value to management and stakeholders simultaneously. Through the lens of Freirian dialogic theory, it is found that SERA is starting to display some characteristics of a dialogical process, being stakeholder inclusive, demythologising and transformative, with assurors perceiving themselves as a “voice” for stakeholders. Consequently, SERA is becoming an important mechanism for driving forward more stakeholder-inclusive SER, with the SERA process beginning to transform attitudes of management towards their stakeholders through more stakeholder-led SER. However, there remain significant obstacles to dialogic SERA. The paper suggests these could be removed through educative and transformative processes driven by assurors. Originality/value – Previous work on SERA has involved predominantly content-based analysis on assurance statements. However, this paper investigates the details of the SERA process, for the first time using qualitative interview data.

Relevance:

10.00%

Publisher:

Abstract:

Satellite-based Synthetic Aperture Radar (SAR) has proved useful for obtaining information on flood extent, which, when intersected with a Digital Elevation Model (DEM) of the floodplain, provides water level observations that can be assimilated into a hydrodynamic model to decrease forecast uncertainty. With an increasing number of operational satellites with SAR capability, information on the relationship between satellite first visit and revisit times and forecast performance is required to optimise the operational scheduling of satellite imagery. By using an Ensemble Transform Kalman Filter (ETKF) and a synthetic analysis with the 2D hydrodynamic model LISFLOOD-FP based on a real flooding case affecting an urban area (summer 2007, Tewkesbury, Southwest UK), we evaluate the sensitivity of the forecast performance to visit parameters. We emulate a generic hydrologic-hydrodynamic modelling cascade by imposing a bias and spatiotemporal correlations on the inflow error ensemble entering the hydrodynamic domain. First, in agreement with previous research, estimation and correction of this bias leads to a clear improvement in keeping the forecast on track. Second, imagery obtained early in the flood is shown to have a large influence on forecast statistics. Revisit interval is most influential for early observations. The results are promising for the future of remote sensing-based water level observations for real-time flood forecasting in complex scenarios.
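For context, the ETKF analysis step used in such assimilation schemes is commonly written in the following square-root form (generic notation, not necessarily the paper's; H is the observation operator mapping model state to the SAR-derived water levels y, R the observation-error covariance):

```latex
% Forecast ensemble of k members: mean \bar{x}^{f}, perturbation matrix X^{f}.
\tilde{P}^{a} = \left[(k-1)\,I + (H X^{f})^{\mathsf{T}} R^{-1} H X^{f}\right]^{-1},
\\[4pt]
\bar{x}^{a} = \bar{x}^{f} + X^{f}\,\tilde{P}^{a}\,(H X^{f})^{\mathsf{T}} R^{-1}\bigl(y - H\bar{x}^{f}\bigr),
\\[4pt]
X^{a} = X^{f}\bigl[(k-1)\,\tilde{P}^{a}\bigr]^{1/2}.
```

Each satellite overpass supplies a new y, which is why the timing of the first visit and the revisit interval directly shape how quickly the analysis ensemble is pulled back on track.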

Relevance:

10.00%

Publisher:

Abstract:

This chapter presents techniques used for the generation of 3D digital elevation models (DEMs) from remotely sensed data. Three methods are explored and discussed – optical stereoscopic imagery, Interferometric Synthetic Aperture Radar (InSAR), and Light Detection and Ranging (LIDAR). For each approach, the state of the art presented in the literature is reviewed. Techniques involved in DEM generation are presented together with an accuracy evaluation. Results of DEMs reconstructed from remotely sensed data are illustrated. While the process of DEM generation from satellite stereoscopic imagery represents a good example of the passive, multi-view imaging technology discussed in Chap. 2 of this book, InSAR and LIDAR use different principles to acquire 3D information. With regard to InSAR and LIDAR, detailed discussions are conducted in order to convey the fundamentals of both technologies.
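As an indication of how InSAR converts phase to height (a standard textbook relation, not taken from the chapter): for repeat-pass InSAR with wavelength λ, perpendicular baseline B⊥, slant range R and look angle θ, the topographic contribution to the interferometric phase of a point at height h above the reference surface is approximately

```latex
\Delta\varphi_{\mathrm{topo}} \;\approx\; \frac{4\pi}{\lambda}\,\frac{B_{\perp}\, h}{R\,\sin\theta},
\qquad
h_{a} \;=\; \frac{\lambda\, R\,\sin\theta}{2\, B_{\perp}},
```

where the altitude of ambiguity h_a is the height change corresponding to one full 2π fringe; it sets the height sensitivity achievable for a given baseline.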

Relevance:

10.00%

Publisher:

Abstract:

Background: Since their inception, Twitter and related microblogging systems have provided a rich source of information for researchers and have attracted interest in their affordances and use. Since 2009 PubMed has included 123 journal articles on medicine and Twitter, but no overview exists of how the field uses Twitter in research. // Objective: This paper aims to identify published work relating to Twitter indexed by PubMed, and then to classify it. This classification will provide a framework in which future researchers will be able to position their work, and an understanding of the current reach of research using Twitter in medical disciplines. Limiting the study to papers indexed by PubMed ensures the work provides a reproducible benchmark. // Methods: Papers on Twitter and related topics indexed by PubMed were identified and reviewed. The papers were then qualitatively classified, based on each paper’s title and abstract, to determine their focus. The work that was Twitter-focussed was studied in detail to determine what data, if any, it was based on, and from this a categorization of the data set sizes used in the studies was developed. Using open-coded content analysis, additional important categories were identified, relating to the primary methodology, domain and aspect. // Results: As of 2012, PubMed comprises more than 21 million citations from the biomedical literature, and from these a corpus of 134 potentially Twitter-related papers was identified, eleven of which were subsequently found not to be relevant. There were no papers prior to 2009 relating to microblogging, a term first used in 2006. Of the remaining 123 papers that mentioned Twitter, thirty were focussed on Twitter (the others referring to it tangentially). The early Twitter-focussed papers introduced the topic and highlighted its potential, without carrying out any form of data analysis.
The majority of published papers used analytic techniques to sort through thousands, if not millions, of individual tweets, often depending on automated tools to do so. Our analysis demonstrates that researchers are starting to use knowledge discovery methods and data mining techniques to understand vast quantities of tweets: the study of Twitter is becoming quantitative research. // Conclusions: This work is, to the best of our knowledge, the first overview study of medicine-related research based on Twitter and related microblogging. We have used five dimensions to categorise published medicine-related research on Twitter. This classification provides a framework within which researchers studying the development and use of Twitter within medicine-related research, and those undertaking comparative studies of research relating to Twitter in the area of medicine and beyond, can position and ground their work.