949 results for LEVEL SET
Abstract:
The relationship between monthly sea-level data measured at stations located along the Chinese coast and concurrent large-scale atmospheric forcing in the period 1960-1990 is examined. It is found that sea-level varies quite coherently along the whole coast, despite the geographical extent of the station set. A canonical correlation analysis between sea-level and sea-level pressure (SLP) indicates that a great part of the sea-level variability can be explained by the action of the wind stress on the ocean surface. The relationship between sea-level and SLP is analyzed separately for the summer and winter half-years. In winter, one factor affecting sea-level variability at all stations is the SLP contrast between the continent and the Pacific Ocean, and hence the intensity of the winter monsoon circulation. Another factor that coherently affects all stations is the intensity of the zonal circulation at mid-latitudes. In the summer half-year, on the other hand, the influence of SLP on sea-level is spatially less coherent: the stations in the Yellow Sea are affected by a more localized circulation anomaly pattern, whereas the remaining stations are more directly connected to the intensity of the zonal circulation. Based on this analysis, statistical models (different for summer and winter) to hindcast coastal sea-level anomalies from the large-scale SLP field are formulated. These models have been tested by fitting their internal parameters in a test period and reproducing reasonably well the sea-level evolution in an independent period. The statistical models are also used to estimate the contribution of changes in the atmospheric circulation to sea-level along the Chinese coast in an altered climate. For this purpose, the output of a 150-year-long experiment with the coupled ocean-atmosphere model ECHAM1-LSG has been analyzed, in which the atmospheric concentration of greenhouse gases was continuously increased from 1940 until 2090, according to the Scenario A projection of the Intergovernmental Panel on Climate Change. In this experiment the meridional (zonal) circulation relevant for sea-level tends to become weaker (stronger) in the winter half-year and stronger (weaker) in summer. The estimated contribution of these atmospheric circulation changes to coastal sea-level is of the order of a few centimeters at the end of the integration, negative in the Yellow Sea and positive in the China Sea in winter, with opposite signs in the summer half-year.
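As an editorial illustration of the CCA step described above, the following minimal Python sketch pairs a gridded SLP field with station sea-level records; the data, dimensions, and the EOF pre-truncation are assumptions, not details taken from the paper.

```python
# Minimal sketch of a sea-level/SLP canonical correlation analysis with
# synthetic stand-ins: monthly SLP anomalies on a grid and sea-level
# anomalies at 12 coastal stations.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
months, grid, stations = 372, 500, 12          # 1960-1990 monthly record
slp = rng.standard_normal((months, grid))
sea_level = rng.standard_normal((months, stations))

# Truncating the SLP field to its leading EOFs before the CCA is a common
# stabilising step; the paper's exact setup is not specified here.
slp_pcs = PCA(n_components=10).fit_transform(slp)

cca = CCA(n_components=2)
slp_scores, sl_scores = cca.fit_transform(slp_pcs, sea_level)

# The leading canonical correlation measures how much coastal sea-level
# variability the large-scale SLP field can explain.
r = np.corrcoef(slp_scores[:, 0], sl_scores[:, 0])[0, 1]
print(f"leading canonical correlation: {r:.2f}")
```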
Abstract:
A major problem envisaged in the course of man-made climate change is sea-level rise. The global aspect of the thermal expansion of sea water is likely simulated reasonably well by present-day climate models; the variation of sea level due to variations of the regional atmospheric forcing and of the large-scale oceanic circulation, however, is not adequately simulated by a global climate model because of insufficient spatial resolution. A method to infer the coastal aspects of sea-level change is to use a statistical 'downscaling' strategy: a linear statistical model is built from a multi-year data set of local sea-level data and of large-scale oceanic and/or atmospheric data, such as sea-surface temperature or sea-level air pressure. We apply this idea to sea level along the Japanese coast. The sea level is related to regional and North Pacific sea-surface temperature and sea-level air pressure. Two relevant processes are identified. One is the local wind set-up of water due to regional low-frequency wind anomalies; the other is a planetary-scale atmosphere-ocean interaction which takes place in the eastern North Pacific.
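The downscaling strategy described here lends itself to a compact sketch: fit a linear model between large-scale predictors and local sea level in one period, then hindcast an independent period. The predictor choice and all data below are illustrative assumptions, not the paper's.

```python
# Sketch of linear statistical downscaling: large-scale predictor scores
# (e.g. SST/SLP principal components) regressed onto local sea level.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
predictors = rng.standard_normal((240, 5))    # 20 years of monthly scores
sea_level = predictors @ rng.standard_normal(5) + 0.3 * rng.standard_normal(240)

train, test = slice(0, 180), slice(180, 240)  # fitting vs independent period
model = LinearRegression().fit(predictors[train], sea_level[train])
print("hindcast skill (R^2) on the independent period:",
      round(model.score(predictors[test], sea_level[test]), 2))
```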
Abstract:
This report presents a system for generating a stable, feasible, and reachable grasp of a polyhedral object. A set of contact points on the object is found that can result in a stable grasp; a feasible grasp is found in which the robot contacts the object at those contact points; and a path is constructed from the initial configuration of the robot to the stable, feasible final grasp configuration. The algorithm described in the report is designed for the Salisbury hand mounted on a Puma 560 arm, but a similar approach could be used to develop grasping systems for other robots.
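The report's own grasp-synthesis algorithm is not reproduced here, but the notion of a contact set that "can result in a stable grasp" can be illustrated with a standard planar force-closure test: a grasp is force-closure if the contact wrenches positively span the wrench space, i.e. the origin lies strictly inside their convex hull. The geometry and friction coefficient below are illustrative assumptions.

```python
# Standard planar force-closure test (an illustration, not the report's
# algorithm): wrenches are (fx, fy, torque) of the friction-cone edges.
import numpy as np
from scipy.spatial import ConvexHull

def contact_wrenches(p, n, mu):
    """Wrenches of the two friction-cone edge forces at contact point p."""
    t = np.array([-n[1], n[0]])                 # tangent direction
    return [np.array([f[0], f[1], p[0] * f[1] - p[1] * f[0]])
            for f in (n + mu * t, n - mu * t)]

def force_closure(points, normals, mu=0.3):
    w = np.array([wr for p, n in zip(points, normals)
                  for wr in contact_wrenches(p, n, mu)])
    hull = ConvexHull(w)                        # assumes general position
    # hull.equations rows [a, b, c, d] satisfy ax+by+cz+d <= 0 inside the
    # hull; the origin is strictly interior iff every offset d is negative.
    return bool(np.all(hull.equations[:, -1] < -1e-9))

# Two antipodal frictional contacts: a classic force-closure grasp.
pts = [np.array([-1.0, 0.0]), np.array([1.0, 0.0])]
nrm = [np.array([1.0, 0.0]), np.array([-1.0, 0.0])]
print(force_closure(pts, nrm))                  # True
```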
Abstract:
The conventional meaning of culture is 'widely shared and strongly held values' of a particular group or society (Bradley and Parker, 2006: 89). Culture is not a rigid concept; it can be influenced or altered by new ideas or forces. This research examines the ways in which one set of ideas in particular, namely those associated with New Public Management (NPM), has impacted upon the administrative culture of 'street-level' bureaucrats and professionals within Irish social policy. Lipsky (1980: 3) defined 'street-level' bureaucrats as 'public service workers who interact directly with citizens in the course of their jobs, and who have substantial discretion in the execution of their work'. Utilising the Competing Values Framework (CVF) in the analysis of eighty-three semi-structured interviews with 'street-level' bureaucrats and professionals, an evaluation is made of the impact of NPM ideas on both visible and invisible aspects of administrative culture. Overall, the influence of NPM is confined to superficial aspects of administrative culture, such as increased flexibility in working hours and, to some degree, job contracts; increased time commitment; and a customer-service focus. However, the extent of these changes varies depending on policy sector and occupational group. Aspects of consensual and hierarchical cultures remain firmly in place, coinciding with features of developmental and market cultures. Contrary to the view that members of hierarchical and consensual cultures would resist change, this research clearly illustrates that a very large appetite for change exists in the attitudes of 'street-level' bureaucrats and professionals within Irish social policy, with many of them suggesting changes that correspond to NPM ideas. This study demonstrates the relevance of employing the CVF model, as it is clear that administrative culture is very much a dynamic system of competing and co-existing cultures.
Abstract:
Cytokines such as interleukin (IL)-12p70 ("IL-12") and IL-23 can influence tumor progression. We tested the hypothesis that blood levels of IL-12p40, the common subunit of both cytokines, are associated with melanoma progression. Blood from 2,048 white melanoma patients was collected at a single institution between March 1998 and March 2011. Plasma levels of IL-12p40 were determined for 573 patients (discovery), 249 patients (Validation 1) and 244 patients (Validation 2). A per-10-unit change in IL-12p40 level was used to investigate associations with melanoma patient outcome among all patients and among patients with early or advanced stage disease. Among stage I/II melanoma patients in the pooled data set, after adjustment for sex, age, stage and blood draw time from diagnosis, elevated IL-12p40 was associated with melanoma recurrence [hazard ratio (HR) = 1.04 per 10-unit increase in IL-12p40, 95% CI 1.02-1.06, p = 8.48 × 10^-5]. Elevated IL-12p40 was also associated with poorer melanoma-specific survival (HR = 1.06, 95% CI 1.03-1.09, p = 3.35 × 10^-5) and overall survival (HR = 1.05, 95% CI 1.03-1.08, p = 8.78 × 10^-7) in multivariate analysis. Among stage III/IV melanoma patients in the pooled data set, no significant association was detected between elevated IL-12p40 and overall survival or melanoma-specific survival, with or without adjustment for the above covariates. Early stage melanoma patients with elevated IL-12p40 levels are more likely to develop disease recurrence and have a poorer survival. Further investigation with a larger sample size will be needed to determine the role of IL-12p40 in advanced stage melanoma patients.
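The hazard ratios reported above suggest a proportional-hazards regression; a minimal sketch with the lifelines library follows. Column names, covariates, and all values are synthetic stand-ins, not the study's data.

```python
# Sketch of a Cox proportional-hazards model relating IL-12p40 level
# (per 10 units) to recurrence, adjusted for simple covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "il12p40_per10": rng.gamma(2.0, 5.0, n),   # plasma level / 10 units
    "age": rng.normal(55, 12, n),
    "male": rng.integers(0, 2, n),
    "time_months": rng.exponential(60, n),     # follow-up time
    "recurrence": rng.integers(0, 2, n),       # event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="recurrence")
# exp(coef) for il12p40_per10 is the hazard ratio per 10-unit increase.
cph.print_summary()
```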
Abstract:
This paper presents a simple approach to the so-called frame problem based on ordinary set operations, which does not require non-monotonic reasoning. Following the notion of the situation calculus, a state of the world is represented as a set of fluents, where a fluent is simply a Boolean-valued property whose truth-value depends on time. High-level causal laws are characterised in terms of relationships between actions and the involved world states. An effect completion axiom is imposed on each causal law, which guarantees that all the fluents that can be affected by the performance of the corresponding action are always totally governed. It is shown that, compared with other techniques, such a set-operation-based approach provides a simpler and more effective treatment of the frame problem.
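The core idea can be shown with a STRIPS-style sketch (a close relative of the paper's formulation, used here as an assumption): a state is a set of fluents, each action carries add and delete sets covering everything it can affect, and all other fluents persist through plain set operations, with no frame axioms.

```python
# A state is a set of fluents; an action's add/delete sets totally govern
# the fluents it can affect, so untouched fluents simply carry over.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Action:
    name: str
    adds: frozenset = field(default_factory=frozenset)
    deletes: frozenset = field(default_factory=frozenset)

def result(state: frozenset, action: Action) -> frozenset:
    # The frame problem is handled by set operations alone.
    return (state - action.deletes) | action.adds

state = frozenset({"door_closed", "light_off"})
open_door = Action("open_door",
                   adds=frozenset({"door_open"}),
                   deletes=frozenset({"door_closed"}))
print(sorted(result(state, open_door)))   # ['door_open', 'light_off']
```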
Abstract:
Human activities within the marine environment give rise to a number of pressures on seabed habitats. Improved understanding of the sensitivity of subtidal sedimentary habitats is required to underpin the management advice provided for Marine Protected Areas, as well as to support other UK marine monitoring and assessment work. The sensitivity of marine sedimentary habitats to a range of pressures induced by human activities has previously been systematically assessed using approaches based on expert judgement for Defra Project MB0102 (Tillin et al. 2010). This previous work assessed sensitivity at the level of the broadscale habitat, and the scores were therefore typically expressed as a range, due to underlying variation in the sensitivity of the constituent biotopes. The objective of this project was to reduce the uncertainty around identifying the sensitivity of selected subtidal sedimentary habitats by assessing sensitivity at a finer scale, incorporating information on the biological assemblage, for 33 Level 5 circalittoral and offshore biotopes taken from the Marine Habitat Classification of Britain and Ireland (Connor et al. 2004). Two Level 6 sub-biotopes were also included, as these contain distinctive characterising species that differentiate them from the Level 5 parent biotope. Littoral, infralittoral, and reduced and variable salinity sedimentary habitats were excluded, as the scope was set for assessment of circalittoral and offshore sedimentary communities. The project consisted of three phases:
• Phase 1 - define ecological groups based on similarities in the sensitivity of characterising species from the Level 5 and two Level 6 biotopes described above.
• Phase 2 - produce a literature review of information on the resilience and resistance of characterising species of the ecological groups to pressures associated with activities in the marine environment.
• Phase 3 - produce sensitivity assessment 'proformas' based on the findings of Phase 2 for each ecological group.
This report outlines the results of Phase 2. The Tillin et al. (2010) sensitivity assessment methodology was modified to use the best available scientific evidence that could be collated within the project timescale. An extensive literature review of peer-reviewed and grey literature was compiled to examine current understanding of the effects of pressures from human activities on circalittoral and offshore sedimentary communities in UK continental shelf waters, together with information on factors that contribute to the resilience (recovery) of marine species. This review formed the basis of an assessment of the sensitivity of the 16 ecological groups identified in Phase 1 of the project (Tillin & Tyler-Walters 2014).
As a result:
• the state of knowledge on the effects of each pressure on circalittoral and offshore benthos was reviewed;
• the resistance, resilience and, hence, sensitivity of sixteen ecological groups, representing 96 characteristic species, were assessed for eight separate pressures;
• each assessment was accompanied by a detailed review of the relevant evidence;
• knowledge gaps and sources of uncertainty were identified for each group;
• each assessment was accompanied by an assessment of the quality of the evidence, its applicability to the assessment, and the degree of concordance (agreement) between the evidence, to highlight sources of uncertainty as an assessment of the overall confidence in the sensitivity assessment; and finally
• limitations in the methodology and the application of sensitivity assessments were outlined.
This process demonstrated that the ecological groups identified in Phase 1 (Tillin & Tyler-Walters 2014) were viable groups for sensitivity assessment and could be used to represent the 33 circalittoral and offshore sediment biotopes identified at the beginning of the project. The results of the sensitivity assessments show that:
• the majority of species, and hence ecological groups, in sedimentary habitats are sensitive to physical change, especially loss of habitat and sediment extraction, and change in sediment type;
• most sedimentary species are sensitive to physical damage, e.g. abrasion and penetration, although deep-burrowing species (e.g. the Dublin Bay prawn Nephrops norvegicus and the sea cucumber Neopentadactyla mixta) are able to avoid damaging effects to varying degrees, depending on the depth of penetration and the time of year;
• changes in hydrography (wave climate, tidal streams and currents) can significantly affect sedimentary communities, depending on whether they are dominated by deposit feeders, infaunal feeders or suspension feeders, and on the nature of the sediment, which is itself modified by hydrography and depth;
• sedentary species and ecological groups that dominate the top layer of the sediment (either shallow-burrowing or epifaunal) remain the most sensitive to physical damage;
• mobile species (e.g. interstitial and burrowing amphipods, and perhaps cumaceans) are the least sensitive to physical change or damage, and to hydrological change, as they are already adapted to unstable, mobile substrata;
• sensitivity to changes in organic enrichment, and hence oxygen levels, varies between species and ecological groups, depending on the exact habitat preferences of the species in question, although most species have at least a medium sensitivity to acute deoxygenation;
• there is considerable evidence on the effects of bottom-contact fishing practices and aggregate dredging on sedimentary communities, although not all evidence is directly applicable to every ecological group;
• there is a lack of detailed information on the physiological tolerances (e.g. to oxygenation, salinity and temperature), habitat preferences, life history and population dynamics of many species, so that inferences have been made from related species, families, or even the same phylum;
• there was inadequate evidence to assess the effects of non-indigenous species on most ecological groups; and
• there was inadequate evidence to assess the effects of electromagnetic fields and litter on any ecological group.
The resultant report provides an up-to-date review of current knowledge about the effects of pressures resulting from human activities on circalittoral and offshore sedimentary communities. It provides an evidence base to facilitate and support the provision of management advice for Marine Protected Areas, the development of UK marine monitoring and assessment, and conservation advice to offshore marine industries. However, such a review will require at least annual updates to take advantage of new evidence and new research as they become available. Further work is also required to test how ecological group assessments are best combined in practice to advise on the sensitivity of a range of sedimentary biotopes, including the 33 that were originally examined.
Abstract:
The article investigates the relationship between technological regimes and firm-level productivity performance, and explores how this relationship differs across Schumpeterian patterns of innovation. The analysis makes use of a rich dataset containing data on innovation and other economic characteristics of a large representative sample of Norwegian firms in manufacturing and service industries for the period 1998-2004. First, we decompose TFP growth into technical progress and efficiency changes by means of data envelopment analysis. We then estimate an empirical model that relates these two productivity components to the characteristics of technological regimes and a set of other firm-specific factors. The results indicate that: (i) TFP growth has mainly been achieved through technical progress, while technical efficiency has on average decreased; (ii) the characteristics of technological regimes are important determinants of firm-level productivity growth, but their impacts on technical progress differ from their effects on efficiency change; and (iii) the estimated model works differently in the two Schumpeterian regimes: technical progress has been more dynamic in Schumpeter Mark II industries, while efficiency change has been more important in Schumpeter Mark I markets.
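The abstract does not spell out the index used; a common DEA-based choice that yields exactly this two-way split, shown here as an assumed illustration, is the output-oriented Malmquist index of Färe et al. (1994), with output distance functions D^t:

```latex
% Malmquist TFP index between periods t and t+1, factored into
% efficiency change (EC) and technical change (TC):
\[
M_{t,t+1}
  = \underbrace{\frac{D^{t+1}(x_{t+1},y_{t+1})}{D^{t}(x_{t},y_{t})}}_{\text{EC}}
    \;\times\;
    \underbrace{\left[
      \frac{D^{t}(x_{t+1},y_{t+1})}{D^{t+1}(x_{t+1},y_{t+1})}\,
      \frac{D^{t}(x_{t},y_{t})}{D^{t+1}(x_{t},y_{t})}
    \right]^{1/2}}_{\text{TC}}
\]
```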
Abstract:
Bit-level systolic-array structures for computing sums of products are studied in detail. It is shown that these can be subdivided into two classes and that, within each class, architectures can be described in terms of a set of constraint equations. It is further demonstrated that high-performance system-level functions with attractive VLSI properties can be constructed by matching data-flow geometries in bit-level and word-level architectures.
Abstract:
The use of bit-level systolic array circuits as building blocks in the construction of larger word-level systolic systems is investigated. It is shown that the overall structure and detailed timing of such systems may be derived quite simply using the dependence-graph and cut-set procedure developed by S. Y. Kung (1988). This provides an attractive and intuitive approach to the bit-level design of many VLSI signal processing components. The technique can be applied to ripple-through and partly pipelined circuits as well as fully systolic designs, and it therefore provides a means of examining the trade-offs between levels of pipelining, chip area, power consumption, and throughput rate within a given VLSI design.
Abstract:
We present BDDT, a task-parallel runtime system that dynamically discovers and resolves dependencies among parallel tasks. BDDT allows the programmer to specify detailed task footprints on any memory address range, multidimensional array tile or dynamic region. BDDT uses a block-based dependence analysis with arbitrary granularity. The analysis is applicable to existing C programs without having to restructure object or array allocation, and provides flexibility in array layouts and tile dimensions.
We evaluate BDDT using a representative set of benchmarks and compare it to SMPSs (the equivalent runtime system in StarSs) and OpenMP. BDDT performs comparably to or better than SMPSs and is able to cope with task granularity as much as one order of magnitude finer than SMPSs. Compared to OpenMP, BDDT performs up to 3.9× better for benchmarks that benefit from dynamic dependence analysis. BDDT also provides data annotations to bypass dependence analysis; using these annotations, BDDT outperforms OpenMP even in benchmarks where dependence analysis does not discover additional parallelism, thanks to a more efficient implementation of the runtime system.
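A toy sketch of block-based dependence analysis follows; it illustrates the general technique, not BDDT's implementation, and it is simplified to flow (read-after-write) dependences only.

```python
# Task footprints are rasterised into fixed-size blocks; a task depends
# on the last writer of any block it reads or writes.
from collections import defaultdict

BLOCK = 64                                 # bytes per block (tunable)

def blocks(addr, size):
    return range(addr // BLOCK, (addr + size - 1) // BLOCK + 1)

class Scheduler:
    def __init__(self):
        self.last_writer = {}              # block id -> task id
        self.deps = defaultdict(set)       # task id -> predecessor tasks

    def submit(self, task, reads=(), writes=()):
        for addr, size in list(reads) + list(writes):
            for b in blocks(addr, size):
                if b in self.last_writer:
                    self.deps[task].add(self.last_writer[b])
        for addr, size in writes:
            for b in blocks(addr, size):
                self.last_writer[b] = task

s = Scheduler()
s.submit("t1", writes=[(0, 256)])
s.submit("t2", reads=[(128, 64)])          # footprint overlaps t1's
print(s.deps["t2"])                        # {'t1'}
```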
Abstract:
Purpose: The purpose of this paper is to present an artificial neural network (ANN) model that predicts earthmoving trucks' condition level using simple predictors; the model's performance is compared to the predictive accuracy of the statistical method of discriminant analysis (DA).
Design/methodology/approach: An ANN-based predictive model is developed. The condition level predictors selected are the capacity, age, kilometers travelled and maintenance level. The relevant data set was provided by two Greek construction companies and includes the characteristics of 126 earthmoving trucks.
Findings: Data processing identifies a particularly strong connection of kilometers travelled and maintenance level with the earthmoving trucks' condition level. Moreover, the validation process reveals that the predictive efficiency of the proposed ANN model is very high. Similar findings emerge from the application of DA to the same data set using the same predictors.
Originality/value: Sound prediction of earthmoving trucks' condition level reduces downtime and its adverse impact on earthmoving duration and cost, while also enhancing the effectiveness of maintenance and replacement policies. This research shows that a sound condition level prediction for earthmoving trucks is achievable through the use of easy-to-collect data, and it provides a comparative evaluation of the results of two widely applied predictive methods.
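A minimal sketch of such an ANN classifier is given below, using the four predictors named in the paper; the synthetic data and network settings are assumptions, not the study's 126-truck data set.

```python
# Condition-level classifier from capacity, age, kilometers travelled
# and maintenance level (all values synthetic).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 126
X = np.column_stack([
    rng.uniform(10, 40, n),                # capacity (t)
    rng.uniform(0, 20, n),                 # age (years)
    rng.uniform(0, 600_000, n),            # kilometers travelled
    rng.integers(0, 3, n),                 # maintenance level (ordinal)
])
y = (X[:, 2] > 300_000).astype(int) + (X[:, 3] == 0)   # synthetic label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,),
                                    max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("hold-out accuracy:", round(model.score(X_te, y_te), 2))
```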
Abstract:
The A-level Mathematics qualification is based on a compulsory set of pure maths modules and a selection of applied maths modules. The flexibility in the choice of applied modules has led to concerns that many students proceed to study engineering at university with little background in mechanics. A survey of aerospace and mechanical engineering students in our university revealed that a combination of mechanics and statistics (the basic module in each) was by far the most popular choice of optional modules in A-level Mathematics, meaning that only about one-quarter of the class had studied mechanics beyond the basic module within school mathematics. Investigation of student performance in two core, first-year engineering courses, which build on a mechanics foundation, indicated that any benefits for students who had studied the extra mechanics at school were small. These results raise concerns about the depth of understanding in mechanics gained during A-level Mathematics.
Abstract:
This paper introduces hybrid address spaces as a fundamental design methodology for implementing scalable runtime systems on many-core architectures without hardware support for cache coherence. We use hybrid address spaces for an implementation of MapReduce, a programming model for large-scale data processing, and for an implementation of a remote memory access (RMA) model. Both implementations are available on the Intel SCC and are portable to similar architectures. We present the design and implementation of HyMR, a MapReduce runtime system in which different stages, and the synchronization operations between them, alternate between a distributed-memory address space and a shared-memory address space to improve performance and scalability. We compare HyMR to a reference implementation and find that HyMR improves performance by a factor of 1.71× over a set of representative MapReduce benchmarks. We also compare HyMR with Phoenix++, a state-of-the-art implementation for systems with hardware-managed cache coherence, in terms of scalability and sustained-to-peak data-processing bandwidth, where HyMR demonstrates improvements by factors of 3.1× and 3.2×, respectively. We further evaluate our hybrid remote memory access (HyRMA) programming model and assess its performance to be superior to that of message passing.
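For readers unfamiliar with the programming model HyMR implements, a toy MapReduce word count follows; it illustrates the map/shuffle/reduce structure only and bears no relation to HyMR's SCC runtime.

```python
# Toy MapReduce: map emits key-value pairs, the runtime groups values
# by key (shuffle), and reduce folds each group.
from collections import defaultdict

def map_phase(docs):
    for doc in docs:
        for word in doc.split():
            yield word, 1

def reduce_phase(groups):
    return {key: sum(vals) for key, vals in groups.items()}

docs = ["the quick brown fox", "the lazy dog"]
groups = defaultdict(list)
for k, v in map_phase(docs):               # shuffle
    groups[k].append(v)
print(reduce_phase(groups))                # {'the': 2, 'quick': 1, ...}
```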