933 results for Functions of real variables
Abstract:
This thesis examines the parameters associated with the failure of triangular steel gusset plates. Eighty-two results are presented investigating the position of the applied load, the gusset plate height-to-length ratio, size, thickness, internal angle and the removal of the inside corner. The thickness of the loaded plate and its influence on the gusset plate's failure is also investigated. Twenty-two similar gusset plates were tested to investigate the welds connecting the gusset plate to the adjacent loaded and support edges. The experimental results are compared with existing methods, none of which cover all the variables tested. Some methods do not consider buckling, and most of those that do are inadequate. Most of the methods do not accurately take account of the load position. An alternative method based on experimental observations is presented for design purposes. The method covers any combination of the variables tested. To test assumptions made in the theoretical work, forty-seven strut tests were conducted to investigate buckling characteristics, and fifteen special gusset plates were also tested. The gusset plates were found to fail in an elastic-plastic buckling manner. A gusset plate has a specific moment of resistance capacity about its inside corner, and the ultimate load that can be applied depends upon the position of the load relative to the supported edge. There is an optimum height-to-length ratio for strength, and any increase in the internal angle from 90 degrees produces little change in moment capacity. The removal of small portions of the inside corner of a gusset plate has little effect upon its moment capacity. The loaded plate does not provide any significant moment of resistance to the applied load at failure. The main function of the loaded and supported edge welds is to prevent the gusset plate from slipping from between the plates.
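A minimal illustration of the moment-based relation described above, using our own symbols rather than the thesis's notation:

```latex
% Illustrative only; M_c and x are assumed notation, not taken from the thesis.
% M_c : moment of resistance capacity about the inside corner
% x   : distance of the applied load from the supported edge
\[
  P_{\mathrm{ult}} \approx \frac{M_c}{x}
\]
```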
Abstract:
A warehouse is an essential component of the supply chain, linking the chain partners and providing them with functions of product storage, inbound and outbound operations, and value-added processes. Allocation of warehouse resources should be efficient and effective to achieve optimum productivity and reduce operational costs. Radio frequency identification (RFID) is a technology capable of providing real-time information about supply chain operations. It has been used by warehousing and logistics enterprises to achieve reduced shrinkage, improved material handling and tracking, and increased accuracy of data collection. However, both academics and practitioners express concerns about challenges to RFID adoption in the supply chain. This paper provides a comprehensive analysis of the problems encountered in RFID implementation at warehouses, discussing the theoretical and practical adoption barriers and the causes of not achieving the full potential of the technology. Lack of a foreseeable return on investment (ROI) and high costs are the most commonly reported obstacles. The variety of standards and radio wave frequencies is identified as a source of concern for decision makers. Inaccurate performance of RFID within the warehouse environment is examined. Integration challenges between warehouse management systems and RFID technology are described. The paper discusses existing solutions to the technological, investment and performance barriers to RFID adoption. Factors to consider when implementing RFID technology are given to help alleviate implementation problems. By illustrating the challenges of RFID in the warehouse environment and discussing possible solutions, the paper aims to help both academics and practitioners to focus on the key areas constituting an obstacle to the technology's growth. As more studies address these challenges, the realisation of RFID benefits for warehouses and the supply chain will become a reality.
Abstract:
Inference and optimisation of real-valued edge variables in sparse graphs are studied using tree-based Bethe-approximation optimisation algorithms. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained for networks in various cases. These include different cost functions, connectivity values, constraints on the edge bandwidth and the case of multiclass optimisation.
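A minimal sketch of the kind of energy function referred to above, in assumed generic notation (x_ij for the real variable on edge (i,j), phi_i for a node cost, b_ij for an edge-bandwidth bound); this is not the paper's own formulation:

```latex
% Assumed notation only: x_{ij} is the real variable on edge (i,j),
% \phi_i a node cost coupling the edges incident on node i (\partial i),
% and b_{ij} an edge-bandwidth bound.
\[
  E(\mathbf{x}) \;=\; \sum_{i \in V} \phi_i\big( \{ x_{ij} : j \in \partial i \} \big),
  \qquad |x_{ij}| \le b_{ij}.
\]
```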
Abstract:
This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment, yet performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesise the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency, and for how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance is at the expense of CPU resources.
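A minimal formalisation of capture latency as described above; the timestamp labels are ours, not those defined in the thesis:

```latex
% Assumed labels: t_{commit} is when a change is committed to the OLTP
% database, t_{capture} is when the CDC mechanism captures that change.
\[
  \ell_{\text{capture}} \;=\; t_{\text{capture}} - t_{\text{commit}}
\]
```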
Abstract:
This thesis begins with a review of the literature on team-based working in organisations, highlighting the variations in research findings, and the need for greater precision in our measurement of teams. It continues with an illustration of the nature and prevalence of real and pseudo team-based working, by presenting results from a large sample of secondary data from the UK National Health Service. Results demonstrate that ‘real teams’ have an important and significant impact on the reduction of many work-related safety outcomes. Based on both theoretical and methodological limitations of existing approaches, the thesis moves on to provide a clarification and extension of the ‘real team’ construct, demarcating this from other (pseudo-like) team typologies on a sliding scale, rather than a simple dichotomy. A conceptual model for defining real teams is presented, providing a theoretical basis for the development of a scale on which teams can be measured for varying extents of ‘realness’. A new twelve-item scale is developed and tested with three samples of data comprising 53 undergraduate teams, 52 postgraduate teams, and 63 public sector teams from a large UK organisation. Evidence for the content, construct and criterion-related validity of the real team scale is examined over seven separate validation studies. Theoretical, methodological and practical implications of the real team scale are then discussed.
Abstract:
This paper contributes to the debate on the role of real options theory in business strategy and organizational decision-making. It analyses and critiques the decision-making and performance implications of real options within the management theories of the (multinational) firm, and reviews and categorizes the organizational, strategic and operational facets of real options management in large business settings. It also presents the views of scholars and practitioners regarding the incorporation and validity of real options in strategy, international management and business processes. The focus is particularly on the decision-making and performance attributes of the real options logic concerning strategic investments, governance modes and multinational operations management. These attributes are examined from both strategic and operating perspectives of decision-making in organizations, together with an overview of the empirical evidence on real options decision-making and performance.
Abstract:
Requirements-aware systems address the need to reason about uncertainty at runtime in order to support adaptation decisions, by representing quality-of-service (QoS) requirements for service-based systems (SBS) with precise values in run-time queryable specification models. However, current approaches do not support updating of the specification to reflect changes in the service market, such as newly available services or improved QoS of existing ones. Thus, even if the specification models reflect acceptable requirements at design time, they may become obsolete and miss opportunities for system improvement by self-adaptation. This article proposes to distinguish "abstract" and "concrete" specification models: the former consists of linguistic variables (e.g. "fast") agreed upon at design time, and the latter consists of precise numeric values (e.g. "2ms") that are dynamically calculated at run-time, thus incorporating up-to-date QoS information. If and when the freshly calculated concrete specifications are no longer satisfied by the current service configuration, an adaptation is triggered. The approach was validated using four simulated SBS that use services from a previously published, real-world dataset; in all cases, the system was able to detect unsatisfied requirements at run-time and trigger suitable adaptations. Ongoing work focuses on policies for determining when to recalculate the specifications. This approach will allow engineers to build SBS that can be protected against market-caused obsolescence of their requirements specifications. © 2012 IEEE.
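A minimal sketch, in Python, of the distinction drawn above between an abstract specification (linguistic variables such as "fast") and a concrete one (numeric thresholds recalculated from current market QoS data). All names and the percentile mapping are hypothetical illustrations, not taken from the article:

```python
# Hypothetical sketch: an "abstract" requirement uses a linguistic variable;
# the "concrete" threshold is recalculated at run-time from current market QoS.

def concretise(linguistic_level: str, market_response_times_ms: list[float]) -> float:
    """Map a linguistic variable to a numeric threshold from current market data."""
    ranked = sorted(market_response_times_ms)
    percentile = {"fast": 0.10, "medium": 0.50, "slow": 0.90}[linguistic_level]
    return ranked[int(percentile * (len(ranked) - 1))]

def needs_adaptation(observed_ms: float, linguistic_level: str,
                     market_response_times_ms: list[float]) -> bool:
    """Trigger adaptation when the currently bound service no longer meets
    the freshly concretised specification."""
    return observed_ms > concretise(linguistic_level, market_response_times_ms)

# Example: the market has improved, so "fast" now means roughly 2 ms and the
# currently bound service (5 ms) no longer satisfies the requirement.
print(needs_adaptation(5.0, "fast", [2.0, 2.1, 2.3, 4.0, 6.0, 9.0]))  # True
```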
Abstract:
The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs. © 2014 Springer International Publishing Switzerland.
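For illustration, a specification of the kind described above might assert that the value returned by a floating-point Riemann-integration routine lies within a small interval around the exact integral. The symbols below are our own, not the paper's:

```latex
% Illustrative specification shape only (our notation): result is the value
% computed by the floating-point routine, f the integrand, [a, b] the
% integration bounds and \varepsilon an error bound.
\[
  \mathit{result} \;\in\;
  \Big[ \int_{a}^{b} f(x)\,dx - \varepsilon,\ \ \int_{a}^{b} f(x)\,dx + \varepsilon \Big]
\]
```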
Abstract:
One of the current research trends in Enterprise Resource Planning (ERP) involves examining the critical factors for its successful implementation. However, such research is limited to system implementation and does not focus on the flexibility of ERP to respond to changes in business. Therefore, this study explores a combination system, made up of an ERP and informality, intended to provide organisations with efficient and flexible performance simultaneously. In addition, this research analyses the benefits and challenges of using such a system. The research was based on socio-technical systems (STS) theory, which contains two dimensions: 1) a technical dimension, which evaluates the performance of the system; and 2) a social dimension, which examines the impact of the system on an organisation. A mixed-methods approach was followed in this research. The qualitative part aims to understand the constraints of using a single ERP system and to define a new system corresponding to these problems. To achieve this goal, four Chinese companies operating in different industries were studied, all of which faced challenges in using an ERP system due to complexity and uncertainty in their business environments. The quantitative part contains a discrete-event simulation study intended to examine the impact on operational performance when a company implements the hybrid system in a real-life situation. Moreover, this research conducts a further qualitative case study in order to better understand the influence of the system in an organisation. The empirical aspect of the study reveals that an ERP with pre-determined business activities cannot react promptly to unanticipated changes in a business. Incorporating informality into an ERP allows it to react to different situations by using different procedures that are based on the practical knowledge of frontline employees. Furthermore, the simulation study shows that the combination system can achieve a balance between efficiency and flexibility. Unlike existing research, which emphasises a continuous improvement in the IT functions of an enterprise system, this research contributes a definition of a new system in theory, which has mixed performance and contains both the formal practices embedded in an ERP and informal activities based on human knowledge. It supports both cost-efficiency in executing business transactions and flexibility in coping with business uncertainty. This research also indicates risks of using the system, such as using an ERP with limited functions, a high cost of performing informally, and low system acceptance owing to a shift in organisational culture. With respect to practical contribution, this research suggests that companies can choose the most suitable enterprise system approach in accordance with their operational strategies. The combination system can be implemented in a company that needs to handle a medium level of volume and variety. By contrast, the traditional ERP system is better suited to a company that operates in a high-volume market, while an informal system is more suitable for a firm requiring a high level of variety.
Abstract:
Analysing the molecular polymorphism and interactions of DNA, RNA and proteins is of fundamental importance in biology. Predicting the functions of polymorphic molecules is important in order to design more effective medicines. Analysing major histocompatibility complex (MHC) polymorphism is important for mate choice, epitope-based vaccine design, transplantation rejection and other applications. Most of the existing exploratory approaches cannot analyse these datasets because of the large number of molecules with a high number of descriptors per molecule. This thesis develops novel methods for data projection in order to explore high-dimensional biological datasets by visualising them in a low-dimensional space. With increasing dimensionality, some existing data visualisation methods, such as generative topographic mapping (GTM), become computationally intractable. We propose variants of these methods, where we use log-transformations at certain steps of the expectation-maximisation (EM) based parameter learning process, to make them tractable for high-dimensional datasets. We demonstrate these proposed variants on both synthetic data and an electrostatic potential dataset of MHC class I. We also propose to extend a latent trait model (LTM), suitable for visualising high-dimensional discrete data, to simultaneously estimate feature saliency as an integrated part of the parameter learning process of a visualisation model. This LTM variant not only gives better visualisation by modifying the projection map based on feature relevance, but also helps users to assess the significance of each feature. Another problem which is not addressed much in the literature is the visualisation of mixed-type data. We propose to combine GTM and LTM in a principled way, where appropriate noise models are used for each type of data, in order to visualise mixed-type data in a single plot. We call this model a generalised GTM (GGTM). We also propose to extend the GGTM model to estimate feature saliencies while training a visualisation model, and call this GGTM with feature saliency (GGTM-FS). We demonstrate the effectiveness of these proposed models on both synthetic and real datasets. We evaluate visualisation quality using quality metrics such as a distance distortion measure and rank-based measures: trustworthiness, continuity, and mean relative rank errors with respect to data space and latent space. In cases where the labels are known, we also use the quality metrics of KL divergence and nearest-neighbour classification error in order to determine the separation between classes. We demonstrate the efficacy of these proposed models on both synthetic and real biological datasets, with a main focus on the MHC class I dataset.
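The log-transformation idea mentioned above is closely related to the standard log-sum-exp device for keeping EM responsibilities numerically stable in high dimensions. A generic sketch under that assumption (not the thesis's actual GTM code) is:

```python
import numpy as np

def responsibilities(log_likelihoods: np.ndarray) -> np.ndarray:
    """Compute EM responsibilities from per-component log-likelihoods
    (shape: n_points x n_components) without underflow, via log-sum-exp."""
    max_ll = log_likelihoods.max(axis=1, keepdims=True)       # subtract the max ...
    shifted = log_likelihoods - max_ll                         # ... so exp() stays finite
    log_norm = max_ll + np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return np.exp(log_likelihoods - log_norm)                  # rows sum to 1

# Example: with very negative log-likelihoods (typical of high-dimensional data),
# a naive exp() would underflow to 0/0; this remains well defined.
print(responsibilities(np.array([[-1000.0, -1001.0]])))        # approx. [[0.73, 0.27]]
```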
Abstract:
This paper was presented at the 12th International Conference on Applications of Computer Algebra, Varna, Bulgaria, June 2006.
Abstract:
In this paper an alternative characterization of the class of functions called k-uniformly convex is found. Various relations concerning connections with other classes of univalent functions are given. Moreover a new class of univalent functions, analogous to the 'Mocanu class' of functions, is introduced. Some results concerning this class are derived.
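For reference, the class of k-uniformly convex functions mentioned above is usually defined, for normalised analytic functions f on the unit disc and k ≥ 0, by the condition below; this is the standard textbook definition, not necessarily the alternative characterization derived in the paper:

```latex
% Standard definition of k-uniform convexity (k >= 0), stated for reference;
% f is analytic and normalised on the unit disc |z| < 1.
\[
  \operatorname{Re}\!\left( 1 + \frac{z f''(z)}{f'(z)} \right)
  \;>\; k \left| \frac{z f''(z)}{f'(z)} \right|, \qquad |z| < 1 .
\]
```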
Abstract:
Let a compact Hausdorff space X contain a non-empty perfect subset. If α < β and β is a countable ordinal, then the Banach space Bα(X) of all bounded real-valued functions of Baire class α on X is a proper subspace of the Banach space Bβ(X). In this paper it is shown that: 1. Bα(X) has a representation as C(bαX), where bαX is a compactification of the space PX, the underlying set of X in the Baire topology generated by the Gδ-sets in X. 2. If 1 ≤ α < β ≤ Ω, where Ω is the first uncountable ordinal number, then Bα(X) is uncomplemented as a closed subspace of Bβ(X). These assertions were proved for X = [0, 1] by W. G. Bade [4], and in the case when X contains an uncountable compact metrizable space by F. K. Dashiell [9]. Our argument is a non-metrizable modification of both Bade's and Dashiell's methods.
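For reference, the Baire classes mentioned above are standardly defined by transfinite recursion; restricting to bounded functions with the supremum norm gives the Banach spaces Bα(X). A sketch of that standard definition (not the paper's own statement):

```latex
% Standard recursive definition, stated for reference: Baire class 0 consists of
% the continuous real-valued functions on X; for an ordinal \alpha > 0, Baire
% class \alpha consists of pointwise limits of sequences drawn from the lower
% classes. B_\alpha(X) is the space of bounded such functions under the sup norm.
\[
  \mathcal{B}_0(X) = C(X), \qquad
  \mathcal{B}_\alpha(X) = \Big\{ f : f = \lim_{n\to\infty} f_n \ \text{pointwise},\ \
  f_n \in \textstyle\bigcup_{\gamma < \alpha} \mathcal{B}_\gamma(X) \Big\}.
\]
```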
Abstract:
Partially supported by grant No. 433/94 NSF of the Ministry of Education and Science of the Republic of Bulgaria. 1991 Mathematics Subject Classification: 30C45.
Abstract:
A superadditive bisexual Galton-Watson branching process is considered, and the total numbers of mating units, females and males up to the n-th generation are studied. In particular, some results about stochastic monotonicity, probability generating functions and moments are obtained. Finally, the limit behaviour of these variables, suitably normed, is investigated.
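For readers unfamiliar with the model, a standard way to write the bisexual Galton-Watson process and the superadditivity condition on its mating function is given below; the notation is ours, not the paper's:

```latex
% Standard formulation (notation ours): Z_n is the number of mating units in
% generation n, (F_{n+1}, M_{n+1}) the females and males they produce, and L
% the mating function. Superadditivity of L is the stated inequality.
\[
  Z_{n+1} = L(F_{n+1}, M_{n+1}), \qquad
  L(x_1 + x_2,\; y_1 + y_2) \;\ge\; L(x_1, y_1) + L(x_2, y_2).
\]
```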