829 results for reliability of supply


Relevance:

90.00%

Publisher:

Abstract:

Mathematics in Defence 2011. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers encoded by floating-point bits is doubled when all of the Not-a-Number (NaN) states in IEEE 754 arithmetic are replaced with real numbers. The task of programming such systems is simplified and made safer by discarding the unordered relational operator, leaving only the operators less-than, equal-to, and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with on the order of one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
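
The simplification claimed here is easy to see in any IEEE 754 environment. A minimal Python sketch of the unordered comparisons that a transarithmetic would discard (standard library only; the transreal behaviour is paraphrased from the abstract, not implemented):

```python
import math

x = float("nan")  # an IEEE 754 quiet NaN

# Under IEEE 754, NaN is unordered: every ordered comparison involving it
# is False, so trichotomy (exactly one of <, ==, > holds) breaks down.
print(x < 1.0)        # False
print(x > 1.0)        # False
print(x == x)         # False: NaN is not even equal to itself
print(math.isnan(x))  # True: programs must test for a fourth, unordered case
```

Replacing the NaN bit patterns with (trans)real values, as the abstract proposes, removes this fourth relational outcome and frees those encodings for ordinary numbers.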

Relevance:

90.00%

Publisher:

Abstract:

It is increasingly important to know when energy is used in the home, at work and on the move. Issues of time and timing have not featured strongly in energy policy analysis and modelling, much of which has focused on estimating and reducing total average annual demand per capita. If smarter ways of balancing supply and demand are to take hold, and if we are to make better use of decarbonised forms of supply, it is essential to understand and intervene in patterns of societal synchronisation. This calls for detailed knowledge of when, and on what occasions, many people engage in the same activities at the same time, of how such patterns are changing, and of how they might be shaped. In addition, the impact of smart meters and controls partly depends on whether there is, in fact, scope for shifting the timing of what people do, and for changing the rhythm of the day. Is the scheduling of daily life an arena that policy can influence, and if so, how? The DEMAND Centre has been linking time use, energy consumption and travel diary data as a means of addressing these questions, and in this working paper we present some of the issues and results arising from that exercise.
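
As a concrete illustration of "societal synchronisation", one simple index is the share of diarists engaged in the most common activity in each time slot. The sketch below assumes a hypothetical diary table; the column names are illustrative, not the DEMAND Centre's actual schema:

```python
import pandas as pd

# Hypothetical time-use diary: one row per person per 10-minute slot.
diary = pd.DataFrame({
    "person":   [1, 1, 2, 2, 3, 3],
    "slot":     ["07:00", "07:10", "07:00", "07:10", "07:00", "07:10"],
    "activity": ["cooking", "eating", "cooking", "cooking", "sleep", "cooking"],
})

# For each slot, the share of people doing the single most common activity;
# peaks in this index mark moments when daily life is synchronised.
sync = (diary.groupby("slot")["activity"]
             .agg(lambda s: s.value_counts(normalize=True).iloc[0]))
print(sync)
```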

Relevance:

90.00%

Publisher:

Abstract:

Palaeodata in synthesis form are needed as benchmarks for the Palaeoclimate Modelling Intercomparison Project (PMIP). Advances since the last synthesis of terrestrial palaeodata from the last glacial maximum (LGM) call for a new evaluation, especially of data from the tropics. Here pollen, plant-macrofossil, lake-level, noble gas (from groundwater) and δ¹⁸O (from speleothems) data are compiled for 18 ± 2 ka (¹⁴C), 32°N–33°S. The reliability of the data was evaluated using explicit criteria, and some types of data were re-analysed using consistent methods, in order to derive a set of mutually consistent palaeoclimate estimates of mean temperature of the coldest month (MTCO), mean annual temperature (MAT), plant-available moisture (PAM) and runoff (P−E). Cold-month temperature (MTCO) anomalies from plant data range from −1 to −2 K near sea level in Indonesia and the S Pacific, through −6 to −8 K at many high-elevation sites, to −8 to −15 K in S China and the SE USA. MAT anomalies from groundwater or speleothems seem more uniform (−4 to −6 K), but the data are as yet sparse; a clear divergence between MAT and cold-month estimates from the same region is seen only in the SE USA, where cold-air advection is expected to have enhanced cooling in winter. Regression of all cold-month anomalies against site elevation yielded an estimated average cooling of −2.5 to −3 K at modern sea level, increasing to ≈ −6 K by 3000 m. However, Neotropical sites showed larger-than-average sea-level cooling (−5 to −6 K) and a non-significant elevation effect, whereas W and S Pacific sites showed much less sea-level cooling (−1 K) and a stronger elevation effect. These findings support the inference that tropical sea-surface temperatures (SSTs) were lower than the CLIMAP estimates, but they limit the plausible average tropical sea-surface cooling, and they support the existence of CLIMAP-like geographic patterns in SST anomalies. Trends of PAM and lake levels indicate wet LGM conditions in the W USA and at the highest elevations, with generally dry conditions elsewhere. These results suggest a colder-than-present ocean surface producing a weaker hydrological cycle, more arid continents, and arguably steeper-than-present terrestrial lapse rates. Such linkages are supported by recent observations on freezing-level height and tropical SSTs; moreover, simulations of "greenhouse" and LGM climates point to several possible feedback processes by which low-level temperature anomalies might be amplified aloft.
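
The elevation regression described above can be reproduced in outline with any least-squares routine: the intercept estimates cooling at modern sea level, and the slope its change with elevation. A hedged sketch with invented values (the compiled site data are not reproduced here):

```python
import numpy as np
from scipy import stats

elev    = np.array([  50,  300, 1200, 2000, 2500, 3000])  # site elevation (m)
anomaly = np.array([-2.6, -3.0, -4.2, -5.0, -5.6, -6.1])  # cold-month anomaly (K)

fit = stats.linregress(elev, anomaly)
print(f"sea-level cooling (intercept): {fit.intercept:.1f} K")
print(f"cooling at 3000 m:             {fit.intercept + fit.slope * 3000:.1f} K")
# The abstract reports roughly -2.5 to -3 K at sea level, increasing to
# about -6 K by 3000 m, i.e. a steeper-than-present terrestrial lapse rate.
```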

Relevance:

90.00%

Publisher:

Abstract:

Background: Event-related desynchronization/synchronization (ERD/ERS) is a relative power decrease/increase of the electroencephalogram (EEG) in a specific frequency band during physical motor execution and mental motor imagery, and it is therefore widely used for brain-computer interface (BCI) purposes. However, what the ERD really reflects, and the specific roles of its frequency bands, are not yet agreed and remain under investigation. Understanding the underlying mechanism that causes a significant ERD would be crucial to improving the reliability of ERD-based BCIs. We systematically investigated the relationship between the conditions of actual repetitive hand movements and the resulting ERD. Methods: Eleven healthy young participants were asked to close/open their right hand repetitively at three different speeds (Hold, 1/3 Hz, and 1 Hz) and four distinct motor loads (0, 2, 10, and 15 kgf). In each condition, participants repeated 20 experimental trials, each of which consisted of rest (8–10 s), preparation (1 s) and task (6 s) periods. Under the Hold condition, participants were instructed to keep clenching their hand (i.e., isometric contraction) during the task period. Throughout the experiment, EEG signals were recorded from the left and right motor areas for offline data analysis. We obtained time courses of the EEG power spectrum to examine the modulation of mu- and beta-ERD/ERS by the task conditions. Results: We confirmed salient mu-ERD (8–13 Hz) and slightly weaker beta-ERD (14–30 Hz) on both hemispheres during repetitive hand grasping movements. According to a 3 × 4 ANOVA (speed × motor load), both mu- and beta-ERD during the task period were significantly weakened under the Hold condition, whereas no significant effect of motor load and no interaction effect were observed. Conclusions: This study investigated the effect of changes in kinematics and kinetics on the ERD during repetitive hand grasping movements. The experimental results suggest that the strength of the ERD may reflect the time differentiation of hand postures in the motor planning process, or the variation in proprioception resulting from hand movements, rather than the motor command generated downstream, which recruits a group of motor neurons.
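
The quantity at the heart of this abstract has a standard definition: relative ERD/ERS is the percentage power change of the task period with respect to the rest (reference) period in a given band, with negative values indicating ERD. A minimal sketch with synthetic signals (the study's recordings and exact processing chain are not reproduced):

```python
import numpy as np
from scipy.signal import welch

fs = 250                        # sampling rate (Hz), illustrative
rest = np.random.randn(8 * fs)  # stand-in for a rest-period EEG segment
task = np.random.randn(6 * fs)  # stand-in for a task-period EEG segment

def band_power(x, fs, lo, hi):
    """Mean power spectral density in [lo, hi] Hz via Welch's method."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    return pxx[(f >= lo) & (f <= hi)].mean()

# Relative ERD/ERS in the mu band (8-13 Hz); negative = desynchronization.
mu = 100 * (band_power(task, fs, 8, 13) - band_power(rest, fs, 8, 13)) \
         / band_power(rest, fs, 8, 13)
print(f"mu-band ERD/ERS: {mu:+.1f} %")
```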

Relevance:

90.00%

Publisher:

Abstract:

Integrating renewable energy into built environments requires additional attention to the balancing of supply and demand, due to its intermittent nature. Demand Side Response (DSR) has the potential to make money for organisations as well as to support the System Operator as the generation mix changes. There is an opportunity to increase the use of existing technologies in order to manage demand. Company-owned standby generators are a rarely used resource; their maintenance schedule often accounts for the majority of their running hours. DSR encompasses a range of technologies and organisations; Sustainability First (2012) suggest that the System Operator (SO), energy supply companies, Distribution Network Operators (DNOs), Aggregators and Customers all stand to benefit from DSR. It is therefore important to consider the impact of DSR measures on each of these stakeholders. This paper assesses the financial implications of organisations using existing standby generation equipment for DSR in order to avoid peak electricity charges. It concludes that, under the current GB electricity pricing structure, there are several regions where running diesel generators at peak times is financially beneficial to organisations. Issues such as fuel costs, Carbon Reduction Commitment (CRC) charges, maintenance costs and electricity prices are discussed.
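
The core financial comparison reduces to a cost per kilowatt-hour on each side. A back-of-envelope sketch, with every figure a hypothetical placeholder rather than the paper's data:

```python
# Cost of self-generating at peak versus the avoided peak charge (GBP/kWh).
fuel_price  = 1.20   # per litre of diesel (placeholder)
fuel_burn   = 0.30   # litres per kWh generated (set-specific, placeholder)
maintenance = 0.02   # per kWh of extra wear from DSR running (placeholder)
crc_charge  = 0.01   # per kWh, Carbon Reduction Commitment proxy (placeholder)

gen_cost  = fuel_price * fuel_burn + maintenance + crc_charge
peak_rate = 0.45     # avoided peak charge, varies by region (placeholder)

print(f"generation cost: {gen_cost:.2f} GBP/kWh")
print(f"avoided charge : {peak_rate:.2f} GBP/kWh")
print("run generator" if gen_cost < peak_rate else "buy from grid")
```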

Relevance:

90.00%

Publisher:

Abstract:

The aim of this paper is to develop a comprehensive taxonomy of green supply chain management (GSCM) practices, and to develop a structural equation modelling-driven decision support system, following the GSCM taxonomy, that provides managers with a better understanding of the complex relationships between external and internal factors and GSCM operational practices. Typologies and taxonomies play a key role in the development of social science theories. Current taxonomies focus on a single or limited component of the supply chain. Furthermore, they have not been tested using different sample compositions and contexts, yet replication is a prerequisite for developing robust concepts and theories. In this paper, we empirically replicate one such taxonomy, extending the original study by (a) developing a broad taxonomy containing the key components of the supply chain; (b) broadening the sample by including a wider range of sectors and organisational sizes; and (c) broadening the geographic scope of the previous studies. Moreover, we include both objective measures and subjective attitudinal measurements. We use a robust two-stage cluster analysis to develop our GSCM taxonomy. The main finding validates the taxonomy previously proposed, and identifies size, attitude, and the level of environmental risk and impact as key mediators between internal drivers, external drivers and GSCM operational practices.
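
"Two-stage cluster analysis" commonly means a hierarchical pass to suggest the number of groups followed by k-means to refine the partition. The sketch below shows that generic procedure on stand-in survey data; it is not the authors' exact robust method:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))  # stand-in scores: drivers, attitudes, practices

# Stage 1: Ward hierarchical clustering suggests the number of clusters.
Z = linkage(X, method="ward")
k = len(np.unique(fcluster(Z, t=15.0, criterion="distance")))

# Stage 2: k-means refines the partition using the suggested k.
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
print(f"{k} clusters; sizes: {np.bincount(labels)}")
```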

Relevance:

90.00%

Publisher:

Abstract:

The UK new-build housing sector is facing dual pressures: to expand supply while delivering against tougher planning and Building Regulation requirements, predominantly in the area of sustainability. The sector is currently responding by significantly scaling up production and incorporating new technical solutions into new homes. This trajectory of up-scaling and technical innovation has been of research interest, but the research has primarily focused on the 'upstream' implications for house builders' business models and standardised design templates. Little attention has been paid, though, to the potential 'downstream' implications of the ramping up of supply and the introduction of new technologies for build quality and defects. This paper contributes to our understanding of the 'downstream' implications through a synthesis of the current UK defect literature on new-build housing. It is found that the prevailing emphasis in the literature is limited to the responsibility, pathology and statistical analysis of defects (and failures). The literature does not extend to how house builders, individually and collectively, collect and learn from defects information in practice. The paper concludes by describing an ongoing collaborative research programme with the National House Building Council (NHBC) to: (a) understand house builders' localised defects analysis procedures and their current knowledge feedback loops to inform risk management strategies; and (b) building on this understanding, design and test action research interventions to develop new data capture, learning processes and systems to reduce targeted defects.

Relevance:

90.00%

Publisher:

Abstract:

Despite the generally positive contribution of supply management capabilities to firm performance, their constituent routines require more depth of assessment. Using the resource-based view, we examine four routine bundles comprising ostensive and performative aspects of supply management capability: supply management integration, coordinated sourcing, collaboration management and performance assessment. Using structural equation modelling, we measure supply management capability empirically as a second-order latent variable and estimate its effect on a series of financial and operational performance measures. The routines-based approach allows us to demonstrate a different, more fine-grained approach to assessing consistent bundles of homogeneous patterns of activity across firms. The results suggest that supply management capability is formed of internally consistent routine bundles, which are significantly related to financial performance, mediated by operational performance. Our results confirm an indirect effect on financial performance for the 'core' routines forming the architecture of a supply management capability. Supply management capability primarily improves the operational performance of the business, which is subsequently translated into improved financial performance. The study is significant for practice as it offers a different view of the face-valid rationale that supply management directly influences firm financial performance. We confound this assumption, prompting caution when placing too much importance on directly assessing supply management capability using the financial performance of the business.
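
The measurement and mediation structure described here can be written down compactly in lavaan-style syntax. A hedged sketch using the semopy package, with placeholder indicator names (x1..x8, op1..op3, fp1..fp3) and a hypothetical data file:

```python
import pandas as pd
from semopy import Model  # assumes the semopy SEM package is available

# Second-order capability (SMC) over four routine bundles; operational
# performance (OP) mediates the effect on financial performance (FP).
desc = """
SMI =~ x1 + x2
CS  =~ x3 + x4
CM  =~ x5 + x6
PA  =~ x7 + x8
SMC =~ SMI + CS + CM + PA
OP  =~ op1 + op2 + op3
FP  =~ fp1 + fp2 + fp3
OP ~ SMC
FP ~ OP + SMC
"""

df = pd.read_csv("survey.csv")  # hypothetical indicator data
model = Model(desc)
model.fit(df)
print(model.inspect())  # indirect effect = (SMC -> OP) * (OP -> FP)
```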

Relevance:

90.00%

Publisher:

Abstract:

This study investigates the relationship between the wind-wave climate and the main climate modes of atmospheric variability in the North Atlantic Ocean. The modes considered are the North Atlantic Oscillation (NAO), the East Atlantic (EA) pattern, the East Atlantic/Western Russia (EA/WR) pattern and the Scandinavian (SCAN) pattern. The wave dataset consists of buoy records, remote-sensing altimetry observations and a numerical hindcast providing significant wave height (SWH), mean wave period (MWP) and mean wave direction (MWD) for the period 1989–2009. After evaluating the reliability of the hindcast, we focus on the impact of each mode on seasonal wave parameters and on the relative importance of the wind-sea and swell components. Results demonstrate that the NAO and EA patterns are the most relevant, whereas the EA/WR and SCAN patterns have a weaker impact on North Atlantic wave climate variability. During their positive phases, both the NAO and EA patterns are related to winter SWH at a rate that reaches 1 m per unit index along the Scottish coast (NAO) and the Iberian coast (EA). In terms of winter MWD, the two modes induce a counterclockwise shift of up to 65° per negative NAO (positive EA) unit over western European coasts. They also increase the winter MWP in the North Sea and in the Bay of Biscay (up to 1 s per unit NAO) and along the western coasts of Europe and North Africa (1 s per unit EA). The impact of the winter EA pattern on all wave parameters is exerted mostly through the swell wave component.
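
The "1 m per unit index" figures are regression coefficients of seasonal wave parameters on the mode indices. A minimal sketch with synthetic series (the hindcast and index data themselves are not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1989, 2010)
nao = rng.standard_normal(years.size)                          # winter NAO index
swh = 3.0 + 1.0 * nao + 0.3 * rng.standard_normal(years.size)  # winter SWH (m)

fit = stats.linregress(nao, swh)
print(f"SWH response: {fit.slope:.2f} m per unit NAO (r = {fit.rvalue:.2f})")
```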

Relevance:

90.00%

Publisher:

Abstract:

Using lessons from idealised predictability experiments, we discuss some issues and perspectives on the design of operational seasonal to inter-annual Arctic sea-ice prediction systems. We first review the opportunities to use a hierarchy of different types of experiment to learn about the predictability of Arctic climate. We also examine key issues for ensemble system design, such as measuring skill, the role of ensemble size, and the generation of ensemble members. When assessing the potential skill of a set of prediction experiments, using more than one metric is essential, as different choices can significantly alter conclusions about the presence or lack of skill. We find that increasing both the number of hindcasts and the ensemble size is important for reliably assessing the correlation and expected error in forecasts. For other metrics, such as dispersion, increasing ensemble size is most important. Probabilistic measures of skill can also provide useful information about the reliability of forecasts. In addition, various methods for generating the different ensemble members are tested. The range of techniques can produce surprisingly different ensemble spread characteristics. The lessons learnt should help inform the design of future operational prediction systems.
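
The point about metrics can be made with a toy hindcast set: correlation, expected error and dispersion are computed differently and need not agree. A minimal sketch under invented statistics:

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_members = 30, 10                     # hindcasts x ensemble size
truth = rng.standard_normal(n_years)            # observed sea-ice anomaly
ens = truth[:, None] + rng.normal(scale=0.8, size=(n_years, n_members))

mean   = ens.mean(axis=1)
acc    = np.corrcoef(mean, truth)[0, 1]         # anomaly correlation
rmse   = np.sqrt(np.mean((mean - truth) ** 2))  # expected error
spread = ens.std(axis=1, ddof=1).mean()         # dispersion

# A common reliability check: spread ~ RMSE for a well-calibrated ensemble.
print(f"ACC = {acc:.2f}, RMSE = {rmse:.2f}, spread = {spread:.2f}")
```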

Relevance:

90.00%

Publisher:

Abstract:

Quantifying the effect of seawater density changes on sea level variability is of crucial importance for climate change studies, as the cumulative rise in sea level can be regarded both as an important climate change indicator and as a possible danger for human activities in coastal areas. In this work, as part of the Ocean Reanalysis Intercomparison Project, global and regional steric sea level changes are estimated and compared across an ensemble of 16 ocean reanalyses and 4 objective analyses. These estimates are first compared with a satellite-derived (altimetry minus gravimetry) dataset for a short period (2003–2010). The ensemble mean exhibits a significantly high correlation at both global and regional scales, and the ensemble of ocean reanalyses outperforms that of the objective analyses, in particular in the Southern Ocean. The reanalysis ensemble mean thus represents a valuable tool for further analyses, although large uncertainties remain for the inter-annual trends. Within the extended intercomparison period spanning the altimetry era (1993–2010), we find that the ensembles of reanalyses and objective analyses are in good agreement, detecting global steric sea level trends of 1.0 and 1.1 ± 0.05 mm/year, respectively. However, the spread among the products in the trend of the halosteric component exceeds the mean trend itself, calling the reliability of this estimate into question. This is related to the scarcity of salinity observations before the Argo era. Furthermore, the impact of the deep ocean layers on steric sea level variability is non-negligible (22% and 12% for the layers below 700 and 1500 m depth, respectively), although the small deep-ocean trends are not significant with respect to the product spread.
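
Steric sea level follows from vertically integrating the density anomaly, which also makes the deep-layer contributions discussed above explicit. A sketch with an invented density profile, using eta = -(1/rho0) * integral of rho' dz:

```python
import numpy as np

rho0  = 1025.0                         # reference density (kg m^-3)
z     = np.linspace(0.0, 2000.0, 201)  # depth (m)
rho_a = -0.02 * np.exp(-z / 500.0)     # density anomaly (kg m^-3), invented

eta     = -np.trapz(rho_a, z) / rho0                      # full column
eta_700 = -np.trapz(rho_a[z <= 700], z[z <= 700]) / rho0  # upper 700 m only
print(f"steric anomaly: {eta * 1000:.1f} mm "
      f"(contribution below 700 m: {(eta - eta_700) * 1000:.1f} mm)")
```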

Relevance:

90.00%

Publisher:

Abstract:

This study investigated the contribution of stereoscopic depth cues to the reliability of ordinal depth judgments in complex natural scenes. Participants viewed photographs of cluttered natural scenes, either monocularly or stereoscopically. On each trial, they judged which of two indicated points in the scene was closer in depth. We assessed the reliability of these judgments over repeated trials, and how well they correlated with the actual disparities of the points between the left and right eyes' views. The reliability of judgments increased as their depth separation increased, was higher when the points were on separate objects, and deteriorated for point pairs that were more widely separated in the image plane. Stereoscopic viewing improved sensitivity to depth for points on the same surface, but not for points on separate objects. Stereoscopic viewing thus provides depth information that is complementary to that available from monocular occlusion cues.
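
One way to quantify the reliability of such ordinal judgments is the within-observer consistency of the "which is closer" response over repeated presentations, alongside agreement with the true depth order. A sketch on simulated responses (the sensitivity value is a placeholder, not the study's result):

```python
import numpy as np

rng = np.random.default_rng(2)
n_pairs, n_reps = 40, 4
true_order = rng.choice([-1, 1], size=n_pairs)  # which point is truly closer

# Simulated observer: correct with probability 0.8 (placeholder).
correct = rng.random((n_pairs, n_reps)) < 0.8
resp = np.where(correct, true_order[:, None], -true_order[:, None])

consistency = np.mean(np.abs(resp.mean(axis=1)))  # 1 = perfectly repeatable
accuracy = np.mean(resp == true_order[:, None])   # agreement with truth
print(f"consistency = {consistency:.2f}, accuracy = {accuracy:.2f}")
```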

Relevance:

90.00%

Publisher:

Abstract:

Experimental philosophy of language uses experimental methods developed in the cognitive sciences to investigate topics of interest to philosophers of language. This article describes the methodological background for the development of experimental approaches to topics in philosophy of language, distinguishes negative and positive projects in experimental philosophy of language, and evaluates experimental work on the reference of proper names and natural kind terms. The reliability of expert judgments vs. the judgments of ordinary speakers, the role that ambiguity plays in influencing responses to experiments, and the reliability of meta-linguistic judgments are also assessed.

Relevance:

90.00%

Publisher:

Abstract:

We studied superclusters of galaxies in a volume-limited sample extracted from the Sloan Digital Sky Survey Data Release 7 and from mock catalogues based on a semi-analytical model of galaxy evolution in the Millennium Simulation. A density field method was applied to a sample of galaxies brighter than M_r = −21 + 5 log h_100 to identify superclusters, taking into account selection and boundary effects. In order to evaluate the influence of the threshold density, we chose two thresholds: the first maximizes the number of objects (D1), and the second constrains the maximum supercluster size to ∼120 h⁻¹ Mpc (D2). We performed a morphological analysis, using Minkowski Functionals, based on a parameter that increases monotonically from filaments to pancakes. An anticorrelation was found between supercluster richness (and total luminosity or size) and the morphological parameter, indicating that filamentary structures tend to be richer, larger and more luminous than pancakes in both the observed and mock catalogues. We also used the mock samples to compare supercluster morphologies identified in position and velocity space, concluding that our morphological classification is not biased by peculiar velocities. Monte Carlo simulations designed to investigate the reliability of our results with respect to random fluctuations show that these results are robust. Our analysis indicates that filaments and pancakes present different luminosity and size distributions.

Relevance:

90.00%

Publisher:

Abstract:

The region of the Toledo River, Paraná, Brazil, is characterized by intense anthropogenic activity. Metal concentrations and physical-chemical parameters of the Toledo River water were therefore determined in order to compile an environmental evaluation catalogue. Samples were collected monthly over a one-year period at seven different sites, from the source down to the river mouth; physical-chemical variables were analyzed and the major metallic ions were measured. Metal analysis was performed using the synchrotron radiation total reflection X-ray fluorescence technique. A statistical analysis was applied to evaluate the reliability of the experimental data. The results showed a strong correlation between the physical-chemical parameters of sites 1 and 7, suggesting that organic pollutants were mainly responsible for the decrease in Toledo River water quality.
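
The site-to-site comparison reported here amounts to correlating monthly series of physical-chemical parameters between sampling sites. A hedged pandas sketch with invented values (not the study's measurements):

```python
import pandas as pd

# Monthly conductivity (uS/cm) at two sampling sites, invented values.
df = pd.DataFrame({
    "site":         [1, 1, 1, 1, 7, 7, 7, 7],
    "month":        [1, 2, 3, 4, 1, 2, 3, 4],
    "conductivity": [180, 172, 190, 165, 185, 178, 195, 170],
})

# Pivot to sites-as-columns and correlate the series between sites.
wide = df.pivot(index="month", columns="site", values="conductivity")
print(wide.corr())  # a strong site-1 vs site-7 correlation mirrors the
                    # pattern the authors attribute to organic pollutants
```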