850 results for Power tool industry
Abstract:
For over 20 years, computer reservation systems (CRSs) have been cited as a source of competitive advantage for the airlines that developed them. This paper reviews the developments that have given rise to such advantage, emphasising the use of CRS-generated information as a competitive tool within the airline industry. New evidence is presented suggesting that the control, dissemination and manipulation of CRS data by owning airlines continued to allow them to capitalise on their investment at the expense of competitors during the 1990s.
Abstract:
Numerous problems can be modeled as traffic through a network in which constraints regulate flow. Vehicular road travel, computer networks, and cloud-based resource distribution, among others, all have natural representations in this manner. As these networks grow in size and/or complexity, analysis and certification of the safety invariants become increasingly costly. The NetSketch formalism introduces a lightweight verification framework that allows for greater scalability than traditional analysis methods. The NetSketch tool was developed to provide the power of this formalism in an easy-to-use and intuitive user interface.
Abstract:
NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail to enable future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed, Domain-Specific Language (DSL) to specify network configurations at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we overview NetSketch, highlight its salient features, and illustrate how it could be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity.
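To make the compositional, strongly-typed idea concrete, the following is a minimal Python sketch of interval-typed flow components under stated assumptions: the FlowBlock class, its bounds, and the compose function are hypothetical illustrations of the approach, not NetSketch's actual DSL or API.

```python
# Sketch of the compositional idea behind a typed network-flow DSL.
# FlowBlock, its interval type, and compose() are hypothetical names
# for illustration, not NetSketch's actual API.
from dataclasses import dataclass

@dataclass(frozen=True)
class FlowBlock:
    """A component typed by the interval of flow it can safely carry."""
    name: str
    min_flow: float  # lower bound the block requires
    max_flow: float  # upper bound the block can sustain

def compose(a: FlowBlock, b: FlowBlock) -> FlowBlock:
    """Series composition: valid only where both blocks' intervals overlap."""
    lo, hi = max(a.min_flow, b.min_flow), min(a.max_flow, b.max_flow)
    if lo > hi:
        raise TypeError(f"{a.name} -> {b.name}: no safe flow interval")
    return FlowBlock(f"{a.name}->{b.name}", lo, hi)

# Whole-system safety falls out of the component types:
link = compose(FlowBlock("ingress", 0, 100), FlowBlock("shaper", 10, 80))
print(link)  # FlowBlock(name='ingress->shaper', min_flow=10, max_flow=80)
```

Series composition here simply intersects the safe-flow intervals of the two components; a type error raised at composition time plays the role of a failed safety certification.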
Abstract:
Recent empirical studies have shown that Internet topologies exhibit power laws of the form y = x^α for the following relationships: (P1) outdegree of node (domain or router) versus rank; (P2) number of nodes versus outdegree; (P3) number of node pairs within a neighborhood versus neighborhood size (in hops); and (P4) eigenvalues of the adjacency matrix versus rank. However, causes for the appearance of such power laws have not been convincingly given. In this paper, we examine four factors in the formation of Internet topologies. These factors are (F1) preferential connectivity of a new node to existing nodes; (F2) incremental growth of the network; (F3) distribution of nodes in space; and (F4) locality of edge connections. In synthetically generated network topologies, we study the relevance of each factor in causing the aforementioned power laws as well as other properties, namely diameter, average path length and clustering coefficient. Different kinds of network topologies are generated: (T1) topologies generated using our parametrized generator, which we call BRITE; (T2) random topologies generated using the well-known Waxman model; (T3) Transit-Stub topologies generated using the GT-ITM tool; and (T4) regular grid topologies. We observe that some generated topologies may not obey power laws P1 and P2. Thus, the existence of these power laws can be used to validate the accuracy of a given tool in generating representative Internet topologies. Power laws P3 and P4 were observed in nearly all considered topologies, but different topologies showed different values of the power exponent α. Thus, while the presence of power laws P3 and P4 does not give strong evidence for the representativeness of a generated topology, the value of α in P3 and P4 can be used as a litmus test for representativeness. We also find that factors F1 and F2 are the key contributors in our study, providing the resemblance of our generated topologies to the Internet.
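As an illustration of how such a power law can be checked, the sketch below generates a preferential-attachment topology (factors F1 and F2) with networkx's Barabási-Albert generator and fits the rank-outdegree relationship P1 on a log-log scale. This is an assumed stand-in for illustration, not the BRITE generator itself.

```python
# Sketch: check power law P1 (degree vs. rank) on a synthetic
# preferential-attachment topology. Illustrative only -- this uses
# networkx's Barabasi-Albert generator, not the BRITE tool itself.
import numpy as np
import networkx as nx

g = nx.barabasi_albert_graph(n=5000, m=2, seed=42)  # F1 + F2: incremental growth with preferential connectivity

# Sort degrees in decreasing order; rank r = 1..n.
degrees = np.array(sorted((d for _, d in g.degree()), reverse=True))
ranks = np.arange(1, len(degrees) + 1)

# Fit log(degree) = alpha * log(rank) + c; an approximately linear
# log-log relationship (alpha < 0) is the signature of power law P1.
alpha, c = np.polyfit(np.log(ranks), np.log(degrees), 1)
print(f"fitted rank exponent alpha = {alpha:.2f}")
```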
Abstract:
Strong demand for power-optimized devices shows promising economic potential and has drawn considerable attention in industry and research. With continuously shrinking CMOS processes, not only dynamic power but also static power has emerged as a major concern in power reduction. Beyond power optimization, average-case power estimation is significant for power budget allocation, but is also challenging in terms of time and effort. In this thesis, we introduce a methodology to support modular quantitative analysis for estimating the average power of circuits, on the basis of two concepts named Random Bag Preserving and Linear Compositionality. It shortens simulation time while sustaining high accuracy, increasing the feasibility of power estimation for large systems. For power saving, we first take advantage of the low-power characteristics of adiabatic logic and asynchronous logic to achieve ultra-low dynamic and static power. We propose two memory cells that can run in adiabatic and non-adiabatic modes; about 90% of dynamic power can be saved in adiabatic mode compared to other up-to-date designs, and about 90% of leakage power is saved. Secondly, a novel logic family, named Asynchronous Charge Sharing Logic (ACSL), is introduced, which simplifies the realization of completion detection considerably. Beyond the power reduction, ACSL brings another promising feature for average power estimation, data-independence, which makes power estimation effortless and meaningful for modular quantitative average-case analysis. Finally, a new asynchronous Arithmetic Logic Unit (ALU) is presented, with a ripple-carry adder implemented using a logically reversible/bidirectional characteristic and exhibiting ultra-low power dissipation at a sub-threshold operating point. The proposed adder can operate multi-functionally.
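For context on what average-case power estimation computes, the sketch below applies the generic switching-activity Monte Carlo method (P = αCV²f) to a hypothetical two-gate netlist. This is the standard textbook approach, not the thesis's Random Bag Preserving method, and all constants are assumed.

```python
# Sketch: Monte Carlo switching-activity estimate of average dynamic
# power (P = alpha * C * Vdd^2 * f). Generic textbook approach, not
# the thesis's method; the toy netlist below is hypothetical.
import random

def circuit(a, b, c):
    """Toy two-gate netlist: returns the value of each internal node."""
    n1 = a ^ b       # XOR gate
    n2 = n1 & c      # AND gate
    return (n1, n2)

C_NODE, VDD, FREQ = 10e-15, 1.0, 1e9  # 10 fF per node, 1 V, 1 GHz (assumed)
random.seed(0)
prev, toggles, trials = circuit(0, 0, 0), 0, 100_000
for _ in range(trials):
    cur = circuit(random.getrandbits(1), random.getrandbits(1), random.getrandbits(1))
    toggles += sum(p != c for p, c in zip(prev, cur))  # count node transitions
    prev = cur

alpha = toggles / (trials * len(prev))                 # average switching activity per node
p_avg = alpha * C_NODE * VDD**2 * FREQ * len(prev)
print(f"estimated average dynamic power: {p_avg * 1e6:.2f} uW")
```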
Abstract:
Fungal spoilage is the most common type of microbial spoilage in food, leading to significant economic and health problems throughout the world. Fermentation by lactic acid bacteria (LAB) is one of the oldest and most economical methods of producing and preserving food. Thus, LAB can be seen as an interesting tool in the development of novel bio-preservatives for the food industry. The overall objective of this study was to demonstrate that LAB can be used as a natural way to improve the shelf-life and safety of a wide range of food products. In the first part of the thesis, 116 LAB isolates were screened for their antifungal activity against four Aspergillus and Penicillium spp. commonly found in food. Approximately 83% of them showed antifungal activity, but only 1% showed broad-range antifungal activity against all tested fungi. The second approach was to apply antifungal LAB strains in the production of food products with extended shelf-life. The L. reuteri R29 strain was identified as having strong antifungal activity in vitro, as well as in sourdough bread, against Aspergillus niger, Fusarium culmorum and Penicillium expansum. The ability of the strain to produce bread of good quality was also determined using standard baking tests. Another strain, L. amylovorus DSM19280, was also identified as having strong antifungal activity in vitro and in vivo. The strain was used as an adjunct culture in a Cheddar cheese model system and demonstrated inhibition of P. expansum. Significantly, its presence had no detectable negative impact on cheese quality, as determined by analysis of moisture, salt, pH, and primary and secondary proteolysis. L. brevis PS1, a further strain identified during the screening as strongly antifungal, showed activity in vitro against common Fusarium spp. and was used in the production of a novel functional wort-based alcohol-free beverage. Challenge tests performed with F. culmorum confirmed the effectiveness of the antifungal strain in vivo. The shelf-life of the beverage was extended significantly compared to a non-inoculated wort sample. A range of antifungal compounds was identified for the four LAB strains, namely L. reuteri ee1p, L. reuteri R29, L. brevis PS1 and L. amylovorus DSM20531. The identification of the compounds was based on liquid chromatography interfaced with a mass spectrometer and a PDA detector.
Cost savings from relaxation of operational constraints on a power system with high wind penetration
Abstract:
Wind energy is predominantly a nonsynchronous generation source. Large-scale integration of wind generation with existing electricity systems, therefore, presents challenges in maintaining system frequency stability and local voltage stability. Transmission system operators have implemented system operational constraints (SOCs) in order to maintain stability with high wind generation, but imposition of these constraints results in higher operating costs. A mixed integer programming tool was used to simulate generator dispatch in order to assess the impact of various SOCs on generation costs. Interleaved day-ahead scheduling and real-time dispatch models were developed to allow accurate representation of forced outages and wind forecast errors, and were applied to the proposed Irish power system of 2020 with a wind penetration of 32%. Savings of at least 7.8% in generation costs and reductions in wind curtailment of 50% were identified when the most influential SOCs were relaxed. The results also illustrate the need to relax local SOCs together with the system-wide nonsynchronous penetration limit SOC, as savings from increasing the nonsynchronous limit beyond 70% were restricted without relaxation of local SOCs. The methodology and results allow for quantification of the costs of SOCs, allowing the optimal upgrade path for generation and transmission infrastructure to be determined.
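As a toy illustration of how an SOC enters such a dispatch model, the following single-period PuLP sketch caps wind at a 70% nonsynchronous penetration limit. The generators, costs and demand are made-up numbers, and the actual study used far richer interleaved day-ahead and real-time unit-commitment models.

```python
# Toy single-period economic dispatch with PuLP, illustrating how a
# system nonsynchronous penetration (SNSP) limit SOC constrains wind.
# Costs, capacities, and demand are invented; the real study used
# interleaved day-ahead scheduling and real-time dispatch models.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum

demand = 5000.0        # MW
snsp_limit = 0.70      # nonsynchronous penetration limit SOC

gens = {"coal": (2000, 60.0), "gas": (3000, 80.0)}   # (capacity MW, cost EUR/MWh)
wind_avail = 4000.0                                   # MW available at near-zero marginal cost

prob = LpProblem("dispatch", LpMinimize)
p = {g: LpVariable(g, 0, cap) for g, (cap, _) in gens.items()}
wind = LpVariable("wind", 0, wind_avail)

prob += lpSum(cost * p[g] for g, (_, cost) in gens.items())   # minimise fuel cost
prob += lpSum(p.values()) + wind == demand                     # power balance
prob += wind <= snsp_limit * demand                            # the SNSP SOC

prob.solve()
print({v.name: v.value() for v in prob.variables()})
```

Relaxing snsp_limit toward 1.0 lets free wind displace the expensive gas units, which is the cost-saving mechanism the paper quantifies.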
Abstract:
This research communicates new results from empirical investigations into the relationship between the determination to control an acquired firm's capital, assets and brand versus its capability for innovation, and the ex post performance of the rising Vietnamese M&A industry in the 2005-2012 period. The analysis employs a categorical data sample consisting of 212 M&A cases reported by various information sources, and performs a number of logistic regressions, with significant results as follows. Firstly, the overall relationship between the pre-M&A determination to acquire resources and post-M&A performance is found to be significant. A 'size matters' strategy has profound effects on ex post M&A performance; when a 'resources acquiring' strategy is overwhelming, the innovation factor's explanatory power becomes negligible. Secondly, negative post-M&A performance is chiefly explained by an emphasis on capital base, asset size and brand value at the time of the M&A pursuit, as well as by the absence of innovation as a goal in the pre-M&A period. These two insights together are useful for careful M&A planning. Lastly, high pre-M&A expenditures tend to adversely affect post-M&A performance. As a general conclusion, this study shows that innovation can be an important factor to pursue in M&A transitions, together with the need to find capable and willing human capital, rather than a capital base (equity or debt) or the existing value of the acquired brands.
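A minimal sketch of the kind of logistic regression described, using statsmodels on synthetic data; the variable names and effect sizes are hypothetical, with only the sample size of 212 taken from the abstract.

```python
# Sketch: logistic regression of post-M&A performance on categorical
# pre-M&A strategy indicators, in the spirit of the study's analysis.
# Data are synthetic and variable names hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 212  # same sample size as the reported study
df = pd.DataFrame({
    "resources_focus": rng.integers(0, 2, n),  # 'size matters' acquisition goal
    "innovation_goal": rng.integers(0, 2, n),  # innovation as a pre-M&A goal
})
# Synthetic outcome: innovation helps, pure resource focus hurts.
logit_p = -0.2 + 0.9 * df.innovation_goal - 0.7 * df.resources_focus
df["positive_perf"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("positive_perf ~ resources_focus + innovation_goal", data=df).fit()
print(model.params)  # log-odds coefficients for each strategy indicator
```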
Abstract:
What is the relationship between the design of regulations and levels of individual compliance? To answer this question, Crawford and Ostrom's institutional grammar tool is used to deconstruct regulations governing the aquaculture industry in Colorado, USA. Compliance with the deconstructed regulatory components is then assessed based on the perceptions of the appropriateness of the regulations, involvement in designing the regulations, and intrinsic and extrinsic motivations. The findings suggest that levels of compliance with regulations vary across and within individuals regarding various aspects of the regulatory components. As expected, the level of compliance is affected by the perceived appropriateness of regulations, participation in designing the regulations, and feelings of guilt and fear of social disapproval. Furthermore, there is a strong degree of interdependence among the written components, as identified by the institutional grammar tool, in affecting compliance levels. The paper contributes to the regulation and compliance literature by illustrating the utility of the institutional grammar tool in understanding regulatory content, applying a new Q-Sort technique for measuring individual levels of compliance, and providing a rare exploration into feelings of guilt and fear outside of the laboratory setting. © 2012 Blackwell Publishing Asia Pty Ltd.
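Crawford and Ostrom's grammar decomposes institutional statements into ADICO components: Attributes, Deontic, aIm, Conditions and Or else. A minimal sketch of that decomposition follows; the example statement is invented, not drawn from the Colorado aquaculture regulations.

```python
# Sketch: Crawford and Ostrom's institutional grammar decomposes a
# regulatory statement into ADICO components. The example statement
# is hypothetical, not taken from the Colorado aquaculture code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class InstitutionalStatement:
    attributes: str            # A: to whom the statement applies
    deontic: str               # D: must / must not / may
    aim: str                   # I: the action being regulated
    conditions: str            # C: when and where it applies
    or_else: Optional[str]     # O: sanction; None for a norm, set for a rule

stmt = InstitutionalStatement(
    attributes="licensed aquaculture facility operators",
    deontic="must",
    aim="report fish escapes to the state agency",
    conditions="within 48 hours of discovery",
    or_else="license suspension",
)
# Statements with an 'or else' are rules; those without are norms --
# a distinction a component-level compliance analysis can exploit.
print(stmt)
```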
Abstract:
Over a time span of almost a decade, the FUELCON project in nuclear engineering has led to a fully functional expert system and spawned sequel projects. Its task is in-core fuel management, also called 'refueling', i.e., good fuel allocation for reloading the core of a given nuclear reactor for a given operation cycle. The task is crucial for keeping down operation costs at nuclear power plants. Fuel comes in different types and is positioned in a grid representing the core of a reactor. The tool is useful for practitioners but also helps the domain expert to test his or her rules of thumb and to discover new ones.
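As a toy view of the underlying search problem (not FUELCON's rule-based approach), the sketch below places fuel types on a core grid and improves a hypothetical flatness objective by random swaps; all scores and the objective itself are invented for illustration.

```python
# Toy sketch of the in-core fuel management search space: fuel
# assemblies of different types placed on a core grid, improved by
# local swaps. The flatness objective is hypothetical; FUELCON itself
# is a rule-based expert system, not this hill climber.
import random

random.seed(3)
TYPES = {"fresh": 3.2, "once": 2.4, "twice": 1.6}   # assumed reactivity scores
core = [[random.choice(list(TYPES)) for _ in range(6)] for _ in range(6)]

def flatness(core):
    """Lower is flatter: variance of row reactivity (a stand-in objective)."""
    rows = [sum(TYPES[c] for c in row) for row in core]
    mean = sum(rows) / len(rows)
    return sum((r - mean) ** 2 for r in rows)

for _ in range(2000):                                # random-swap hill climbing
    (r1, c1), (r2, c2) = [(random.randrange(6), random.randrange(6)) for _ in range(2)]
    before = flatness(core)
    core[r1][c1], core[r2][c2] = core[r2][c2], core[r1][c1]
    if flatness(core) > before:                      # undo non-improving swaps
        core[r1][c1], core[r2][c2] = core[r2][c2], core[r1][c1]

print(f"final flatness score: {flatness(core):.3f}")
```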
Abstract:
In this book, expert energy economists assess the energy policy of thirty-one countries and the role of nuclear power. For many years the shock of Chernobyl took nuclear power off the agenda in most countries. Intense public relations activity by the industry, increasing evidence of climate change, and failures to effectively reduce greenhouse gas emissions have brought nuclear power issues back to the forefront of policy discussion in the nuclear renaissance countries. But some countries are simply not prepared to go in that direction and, indeed, are still divesting themselves of their nuclear legacy: the nuclear phase-out countries. And how are nuclear issues being approached in the industrializing countries? An in-depth country-by-country analysis is presented within this framework. Out of this analysis emerge thematic discussions on, among other topics, strategy in energy policy; nuclear plant safety; the impacts of nuclear accidents; and the adequacy of nuclear power expertise. [Source: publisher's product description].
Abstract:
Although some countries plan to build new nuclear power plants in the near future, in aggregate the data indicates that nuclear power's influence will continue to dwindle across the globe in coming decades.
Abstract:
Environmental activism has a long history in protest, addressing issues of degradation and segregation that threaten existing ecologies and social and built fabrics. Environmental activism is traditionally understood as a reaction, chiefly by groups of people, against a perceived external threat. In the 1960s and 1970s, an activist stance began to emerge in the work of some artists and architects, who used creative methods such as performances, happenings and temporary spatial interventions to convey their political/aesthetic messages. Some of this work engaged directly with communities, but predominantly it was the production of one individual working 'outside' society. Such actions nevertheless demonstrated not only the power of the visual in conveying a political message but also the potential of conceptual creative approaches to reveal alternative values and hidden potentials. This marked a shift from activism as protestation towards an activism of reconceptualisation. Recently, activist groups have developed a more politically informed process. Whilst their 'tools' may resemble work from the 1960s and 1970s, their methodologies are non-traditional, 'rhizomatic', pedagogical and fluid, working alongside, rather than against, the established power and funding structures. Such creative processes build new, often unexpected, stakeholder networks; offer neutral spaces in which contentious issues can be faced; and create better understanding of values and identities. They can also lead to permanent improvements and development in the physical fabric. This paper discusses a pedagogical example of activism in architectural education. The event (www.fourdaysontheoutside.com) is in its fifth year of existence and as such has revealed a value and impulse beyond its learning and teaching role. The paper discusses how the event contributes to the university's outreach programme and how its structure acts as a seedbed for potential research projects and partnerships. UK universities talk extensively about applied research but have few actual strategies by which to generate it. Fourdaysontheoutside offers some potential ways forward.
Abstract:
Dual-rail encoding, return-to-spacer protocol, and hazard-free logic can be used to resist power analysis attacks by making energy consumed per clock cycle independent of processed data. Standard dual-rail logic uses a protocol with a single spacer, e.g., all-zeros, which gives rise to energy balancing problems. We address these problems by incorporating two spacers; the spacers alternate between adjacent clock cycles. This guarantees that all gates switch in every clock cycle regardless of the transmitted data values. To generate these dual-rail circuits, an automated tool has been developed. It is capable of converting synchronous netlists into dual-rail circuits and it is interfaced to industry CAD tools. Dual-rail and single-rail benchmarks based upon the advanced encryption standard (AES) have been simulated and compared in order to evaluate the method and the tool.
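A minimal sketch of the alternating-spacer protocol described above, showing that the per-cycle toggle count is constant regardless of the data; the encoding follows the abstract's description, while the simulation harness is illustrative.

```python
# Sketch: the alternating-spacer dual-rail protocol. Each bit b is
# encoded as the pair (b, not b); the spacer separating code words
# alternates between all-zeros and all-ones. The toggle count per
# clock cycle is then constant (each rail switches exactly once),
# making energy per cycle independent of the transmitted data.
def cycles(bits):
    """Yield (spacer, code_word) per clock cycle, spacers alternating."""
    spacers = [(0, 0), (1, 1)]
    for i, b in enumerate(bits):
        yield spacers[i % 2], (b, 1 - b)

prev = (1, 0)  # code word left on the rails by the previous cycle
for spacer, word in cycles([1, 1, 0, 1]):
    toggles = sum(p != s for p, s in zip(prev, spacer)) \
            + sum(s != w for s, w in zip(spacer, word))
    print(f"word={word} toggles={toggles}")  # always 2, data-independent
    prev = word
```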
Abstract:
Introduction: Product standardisation involves promoting the prescribing of pre-selected products within a particular category across a healthcare region and is designed to improve patient safety by promoting continuity of medicine use across the primary/secondary care interface, in addition to cost containment without compromising clinical care (i.e. maintaining safety and efficacy).
Objectives: To examine the impact of product standardisation on the prescribing of compound alginate preparations within primary care in Northern Ireland.
Methods: Data were obtained on alginate prescribing from the Northern Ireland Central Services Agency (Prescription Pricing Branch), covering a period of 43 months. Two standardisation promotion interventions were carried out, at months 18 and 33. In addition to conventional statistical analyses, a simple interrupted time series analysis approach, using graphical interpretation, was used to facilitate interpretation of the data.
Results: There was a significant increase in the prescribed share of the preferred alginate product in each of the four health boards in Northern Ireland and a decrease in the cost per Defined Daily Dose for alginate liquid preparations overall. Compliance with the standardisation policy was, however, incomplete and was influenced to a marked degree by the activities of the pharmaceutical industry. The overall economic impact of the prescribing changes during the study was small (3.1%).
Conclusion: The findings suggested that product standardisation significantly influenced the prescribing pattern for compound alginate liquid preparations within primary care across Northern Ireland. © 2012 The Authors. IJPP © 2012 Royal Pharmaceutical Society.
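For readers unfamiliar with the method, a minimal sketch of a segmented (interrupted time series) regression with intervention terms at months 18 and 33 follows, using statsmodels on synthetic data; the study itself relied on graphical interpretation rather than this model, and all numbers are invented.

```python
# Sketch: segmented regression for an interrupted time series with
# interventions at months 18 and 33, mirroring the study design.
# Data are synthetic; the study itself used graphical interpretation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
t = np.arange(1, 44)  # 43 months of prescribing data
share = 40 + 0.1 * t + 8 * (t >= 18) + 5 * (t >= 33) + rng.normal(0, 2, t.size)

df = pd.DataFrame({
    "share": share,                  # % share of the preferred alginate product
    "t": t,
    "post1": (t >= 18).astype(int),  # first standardisation intervention
    "post2": (t >= 33).astype(int),  # second intervention
})
# Level-change terms estimate the jump in prescribed share after each
# intervention; slope-change terms (t * post) could be added similarly.
fit = smf.ols("share ~ t + post1 + post2", data=df).fit()
print(fit.params)
```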