742 results for implementations
Conditioning of incremental variational data assimilation, with application to the Met Office system
Abstract:
Implementations of incremental variational data assimilation require the iterative minimization of a series of linear least-squares cost functions. The accuracy and speed with which these linear minimization problems can be solved is determined by the condition number of the Hessian of the problem. In this study, we examine how different components of the assimilation system influence this condition number. Theoretical bounds on the condition number for a single parameter system are presented and used to predict how the condition number is affected by the observation distribution and accuracy and by the specified lengthscales in the background error covariance matrix. The theoretical results are verified in the Met Office variational data assimilation system, using both pseudo-observations and real data.
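The dependence described above can be reproduced in a toy setting. Below is a minimal sketch, not the Met Office system, that forms the Hessian of a single linearised least-squares cost function, S = B^(-1) + H^T R^(-1) H, on a periodic 1-D grid with a SOAR-type background error correlation and reports its condition number; the grid size, lengthscale values, observation spacing and error variances are illustrative assumptions.

    import numpy as np

    def soar_B(n, L, sigma_b=1.0):
        # Background error covariance on a periodic 1-D grid with a
        # SOAR-type correlation of lengthscale L (in grid units).
        d = np.arange(n, dtype=float)
        d = np.minimum(d, n - d)                 # periodic separation
        row = sigma_b**2 * (1.0 + d / L) * np.exp(-d / L)
        return np.array([np.roll(row, i) for i in range(n)])

    def hessian_condition(n=100, L=5.0, obs_spacing=10, sigma_o=1.0):
        # Hessian of the linear least-squares cost function:
        #   S = B^(-1) + H^T R^(-1) H,  with R = sigma_o^2 I.
        obs_idx = np.arange(0, n, obs_spacing)   # point observations
        H = np.zeros((obs_idx.size, n))
        H[np.arange(obs_idx.size), obs_idx] = 1.0
        S = np.linalg.inv(soar_B(n, L)) + H.T @ H / sigma_o**2
        return np.linalg.cond(S)

    # Longer background lengthscales and more accurate observations
    # both tend to raise the condition number in this toy setting.
    for L in (2.0, 5.0, 10.0):
        print(L, hessian_condition(L=L))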
Abstract:
A unique cell histogram architecture is proposed that processes k data items in parallel to compute 2q histogram bins per time step. An array of m/2q cells computes an m-bin histogram with a speed-up factor of k; k ⩾ 2 makes it faster than current dual-ported memory implementations. Furthermore, simple mechanisms for conflict-free storage of the histogram bins in an external memory array are discussed.
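The source of the speed-up factor k is the standard parallel-histogram idea: k items are binned concurrently into per-lane partial counts that are merged without write conflicts. A minimal software sketch of that idea, not the cell hardware itself, with illustrative values of k and the bin count m:

    import numpy as np

    def parallel_histogram(data, m, k=4):
        # Split the stream across k lanes; each lane accumulates its own
        # partial m-bin histogram, so k items are binned per step.
        lanes = np.array_split(np.asarray(data), k)
        partial = [np.bincount(lane, minlength=m) for lane in lanes]
        # Conflict-free merge: per-lane counts are summed bin by bin.
        return np.sum(partial, axis=0)

    data = np.random.randint(0, 256, size=10_000)
    assert np.array_equal(parallel_histogram(data, m=256),
                          np.bincount(data, minlength=256))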
Abstract:
Pervasive computing is a continually and rapidly growing field, although it remains in relative infancy. The possible applications of the technology are numerous and stand to fundamentally change the way users interact with technology. However, alongside these are equally numerous potential undesirable effects and risks. The lack of empirical naturalistic data makes studying the true impacts of this technology in the real world difficult. This paper describes how two independent research projects shared such valuable empirical data on the relationship between pervasive technologies and users. Each project had different aims and adopted different methods, but both successfully used the same data and arrived at the same conclusions. This paper demonstrates the benefit of sharing research data in multidisciplinary pervasive computing research, where real-world implementations are not widely available.
Abstract:
A clinical pathway is an approach to standardising care processes in support of the implementation of clinical guidelines and protocols. It is designed to support the management of treatment processes, including clinical and non-clinical activities, resources and financial aspects. It provides detailed guidance for each stage in the management of a patient, with the aim of improving the continuity and coordination of care across different disciplines and sectors. In practical treatment processes, however, the lack of knowledge sharing and the poor information accuracy of paper-based clinical pathways burden health-care staff with a large amount of paperwork, often resulting in medical errors, inefficient treatment processes and thus poor-quality medical services. This paper first presents a theoretical underpinning and a co-design research methodology for integrated pathway management, drawing input from organisational semiotics. An approach to integrated clinical pathway management is then proposed, which aims to embed pathway knowledge into treatment processes and existing hospital information systems. The capability of this approach has been demonstrated through a case study in one of the largest hospitals in China. The outcome reveals that medical quality can be improved significantly through classified clinical pathway knowledge and seamless integration with hospital information systems.
Abstract:
Hybrid multiprocessor architectures, which combine re-configurable computing and multiprocessors on a chip, are being proposed to transcend the performance of standard multi-core parallel systems. Both fine-grained and coarse-grained parallel algorithm implementations are feasible in such hybrid frameworks. A compositional strategy for designing fine-grained multi-phase regular processor arrays to target hybrid architectures is presented in this paper. The method is based on deriving component designs using classical regular array techniques and composing the components into a unified global design. Run-time phase changes and data routing are characteristic of the resulting designs. In order to describe the data transfer between phases, the concept of a communication domain is introduced so that the producer–consumer relationship arising from multi-phase computation can be treated in a unified way as a data routing phase. This technique is applied to derive new designs of multi-phase regular arrays with different dataflow between phases of computation.
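The communication-domain concept can be illustrated in software: the hand-over between two computation phases is made an explicit routing phase rather than being hidden inside either phase. The phases below (a matrix product, a transpose route, a second product) are illustrative stand-ins, not designs from the paper:

    import numpy as np

    def phase1(A, B):
        # First computation phase: a regular-array style matrix product.
        return A @ B

    def route(X):
        # Communication domain: the producer-consumer hand-over between
        # phases is treated as an explicit data-routing phase (here a
        # transpose) instead of being folded into either computation.
        return X.T

    def phase2(X, C):
        # Second computation phase consumes the routed data.
        return X @ C

    A, B, C = (np.random.rand(4, 4) for _ in range(3))
    result = phase2(route(phase1(A, B)), C)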
Abstract:
Two recent works have adapted the Kalman–Bucy filter into an ensemble setting. In the first formulation, the ensemble of perturbations is updated by the solution of an ordinary differential equation (ODE) in pseudo-time, while the mean is updated as in the standard Kalman filter. In the second formulation, the full ensemble is updated in the analysis step as the solution of a single set of ODEs in pseudo-time. Neither requires matrix inversions, except for the frequently diagonal observation error covariance. We analyse the behaviour of the ODEs involved in these formulations. We demonstrate that they stiffen for large magnitudes of the ratio of background-error to observational-error variance, and that using the integration scheme proposed in both formulations can lead to failure. A numerical integration scheme that is both stable and computationally inexpensive is proposed. We develop transform-based alternatives for these Bucy-type approaches so that the integrations are computed in ensemble space, where the variables are weights (of dimension equal to the ensemble size) rather than model variables. Finally, the performance of our ensemble transform Kalman–Bucy implementations is evaluated using three models: the 3-variable Lorenz 1963 model, the 40-variable Lorenz 1996 model, and a medium-complexity atmospheric general circulation model known as SPEEDY. The results from all three models are encouraging and warrant further exploration of these assimilation techniques.
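For concreteness, here is a minimal sketch of the second (full-ensemble) formulation under the common form of the ensemble Kalman–Bucy analysis ODE, integrated with fixed explicit Euler steps in pseudo-time over s in [0, 1]; the dimensions, ensemble size, observation operator and error variances are illustrative, and the stable integration scheme proposed in the paper is not reproduced here:

    import numpy as np

    def enkbf_analysis(X, y, H, R_inv, n_steps=100):
        # Pseudo-time integration of the full-ensemble Kalman-Bucy
        # analysis ODE over s in [0, 1]:
        #   dx_i/ds = -1/2 P(s) H^T R^(-1) (H x_i + H x_bar - 2 y).
        # Fixed explicit Euler steps are used here; as the abstract
        # notes, the ODE stiffens when background variance is large
        # relative to observation variance, so a naive fixed-step
        # scheme like this one can fail and a stable integrator is
        # preferable.
        X = X.copy()
        ds = 1.0 / n_steps
        N = X.shape[1]                      # ensemble size
        for _ in range(n_steps):
            x_bar = X.mean(axis=1, keepdims=True)
            A = X - x_bar
            P = A @ A.T / (N - 1)           # ensemble covariance
            innov = H @ X + H @ x_bar - 2.0 * y[:, None]
            X -= 0.5 * ds * (P @ H.T @ R_inv @ innov)
        return X

    # Toy example: 3-variable state, one observation of variable 0.
    rng = np.random.default_rng(0)
    X0 = rng.normal(size=(3, 20))
    H = np.array([[1.0, 0.0, 0.0]])
    Xa = enkbf_analysis(X0, y=np.array([0.5]), H=H, R_inv=np.array([[4.0]]))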
Abstract:
The financial crisis of 2007–2009 and the resultant pressures exerted on policymakers to prevent future crises have precipitated coordinated regulatory responses globally. A key focus of the new wave of regulation is to ensure the removal of practices now deemed problematic, with new controls for conducting transactions and maintaining holdings. There is increasing pressure on organizations to retire manual processes and adopt core systems, such as Investment Management Systems (IMS). These systems facilitate trading and ensure transactions are compliant by transcribing regulatory requirements into automated rules and applying them to trades. The motivation of this study is to explore the extent to which such systems may enable the alteration of previously embedded practices. We researched implementations of an IMS at eight global financial organizations and found that, overall, the IMS encourages responsible trading through surveillance, monitoring and the automation of regulatory rules, and that such systems are likely to become further embedded within financial organizations. However, we also found evidence that some older practices persisted. Our study suggests that the institutionalization of technology-induced compliant behaviour is still uncertain.
Abstract:
The financial crisis of 2007–2009 and the subsequent reaction of the G20 have created a new global regulatory landscape. Within the EU, change of regulatory institutions is ongoing. The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations and to understand the role of agency within this process. Our motivation is to provide insight into these changes from an operational management perspective, as well as to test Thelen and Mahoney's (2010) modes of institutional change. Consequently, the study researched implementations of an Investment Management System with a rules-based compliance module within financial organizations. The research consulted compliance and risk managers, as well as systems experts. The study suggests that prescriptive regulations are likely to create isomorphic configurations of rules-based compliance systems, which consequently will enable the institutionalization of associated compliance practices. The study reveals the ability of some agents within financial organizations to control the impact of regulatory institutions, not directly, but through the systems and processes they adopt to meet requirements. Furthermore, the research highlights the boundaries and relationships between each mode of change as future avenues of research.
Abstract:
Education, especially higher education, is considered vital for maintaining national and individual competitiveness in the global knowledge economy. Following the introduction of its “Free Education Policy” as early as 1947, Sri Lanka is now the best performer in basic education in the South Asian region, with a remarkable record in terms of high literacy rates and the achievement of universal primary education. However, access to tertiary education is a bottleneck, due to an acute shortage of university places. In an attempt to address this problem, the government of Sri Lanka has invested heavily in information and communications technologies (ICTs) for distance education. Although this has resulted in some improvement, the authors of this article identify several barriers which are still impeding successful participation for the majority of Sri Lankans wanting to study at tertiary level. These impediments include the lack of infrastructure/resources, low English language proficiency, weak digital literacy, poor quality of materials and insufficient provision of student support. In the hope that future implementations of ICT-enabled education programmes can avoid repeating the mistakes identified by their research in this Sri Lankan case, the authors conclude their paper with a list of suggested policy options.
Abstract:
Paraconsistent logics are non-classical logics which allow non-trivial and consistent reasoning about inconsistent axioms. They have been proposed as a formal basis for handling inconsistent data, as commonly arise in human enterprises, and as methods for fuzzy reasoning, with applications in Artificial Intelligence and the control of complex systems. Formalisations of paraconsistent logics usually require heroic mathematical efforts to provide a consistent axiomatisation of an inconsistent system. Here we use transreal arithmetic, which is known to be consistent, to arithmetise a paraconsistent logic. This is theoretically simple and should lead to efficient computer implementations. We introduce the metalogical principle of monotonicity, which is a very simple way of making logics paraconsistent. Our logic has dialetheic truth values which are both False and True. It allows contradictory propositions and variable contradictions, but blocks literal contradictions. Thus literal reasoning, in this logic, forms an on-the-fly, syntactic partition of the propositions into internally consistent sets. We show how the set of all paraconsistent possible worlds can be represented in a transreal space. During the development of our logic we discuss how other paraconsistent logics could be arithmetised in transreal arithmetic.
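Transreal arithmetic totalises real arithmetic by giving division by zero defined values: a positive number divided by zero is +infinity, a negative one is -infinity, and 0/0 is nullity (Phi). A minimal sketch of transreal division follows; the identification of nullity with IEEE NaN is a coding convenience, not part of the theory, and the paper's encoding of dialetheic truth values is not reproduced:

    from math import inf, isnan

    # IEEE NaN stands in for nullity (Phi) here because both absorb
    # subsequent arithmetic; this identification is an implementation
    # convenience, not part of transreal theory.
    PHI = float("nan")

    def tr_div(a, b):
        if isnan(a) or isnan(b):
            return PHI                      # nullity absorbs everything
        if b == 0:
            if a > 0:
                return inf
            if a < 0:
                return -inf
            return PHI                      # 0 / 0 = nullity
        return a / b

    assert tr_div(1, 0) == inf
    assert tr_div(-1, 0) == -inf
    assert isnan(tr_div(0, 0))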
Abstract:
Purpose: The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations.
Design/methodology/approach: The study adopts an Investment Management System as its case and investigates different implementations of this system within eight financial organizations, predominantly focused on investment banking and asset management activities within capital markets. At the systems vendor site, senior systems consultants and client relationship managers were interviewed. Within the financial organizations, compliance, risk and systems experts were interviewed.
Findings: The study empirically tests modes of institutional change. Displacement and Layering were found to be the most prevalent modes. However, the study highlights how the outcomes of Displacement and Drift may be similar in effect, as both modes may cause compliance gaps. The research highlights how changes in regulations may create gaps in systems and processes which, in the short term, need to be plugged by manual processes.
Practical implications: Vendors' ability to manage institutional change caused by Drift, Displacement, Layering and Conversion, and to translate institutional variables into structured systems efficiently and quickly, has the power to ease the pain and cost of compliance, as well as to reduce the risk of breaches by reducing the need for interim manual systems.
Originality/value: The study makes a contribution by applying recent theoretical concepts of institutional change to the topic of regulatory change, and uses this analysis to provide insight into the effects of this new environment.
Abstract:
With its unique cultural background and fast economic development, China's adoption of corporate social responsibility (CSR) has become the center of discussion worldwide, and its successful implementation will have great significance for global sustainability. This paper aims to explore how CSR has given way to economic growth in China since the start of the economic transition, against its cultural, historical and political background, and how this has affected or been affected by the economic performance of firms. The recent calls for China to adopt CSR in its industries follow a period in which the country arguably had one of the strongest implementations of CSR approaches in the world. This transition is considered in the context of a case study of a Chinese state-owned enterprise (SOE) and a group of small private firms in the same industrial sector in Zhengzhou City, Henan Province, over a time span of eight years. While the CSR of the SOE has been steadily decreasing along with the change in ownership structure, its economic performance did not improve as expected. On the other hand, with a steady improvement in economic performance, the small private firms show a great reluctance to engage in CSR. The results indicate that the implementation of CSR in China requires both managers' ethical awareness and change to the institutional framework. The results also raise the question of whether CSR is a universal concept with a desired means of implementation across the developed and developing world.
Abstract:
The core processing step of the noise-reduction median filter technique is to find the median within a window of integers. A four-step procedure for computing the running median of the last N W-bit integers in a stream is proposed, showing area and time benefits. The method slices integers into groups of B bits using a pipeline of W/B blocks. From the method, an architecture is developed that gives a designer the flexibility to exchange area gains for a faster frequency of operation, or vice versa, by adjusting the N, W and B parameter values. Gains in area of around 40%, or in frequency of operation of around 20%, are clearly observed in FPGA circuit implementations compared to the latest methods in the literature.
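The bit-slice idea can be imitated in software with an MSB-first radix selection, which resolves the median of a window one bit per step, i.e. the B = 1 case; the grouping into B-bit slices across a pipeline of W/B hardware blocks is not reproduced here, and the window and word sizes below are illustrative:

    from collections import deque

    def window_median(window, W):
        # MSB-first radix select: resolve the median one bit per step
        # (the B = 1 case); the paper's hardware instead resolves B bits
        # per stage across a pipeline of W/B blocks.
        k = (len(window) - 1) // 2          # rank of the lower median
        candidates = list(window)
        for b in range(W - 1, -1, -1):
            zeros = [x for x in candidates if not (x >> b) & 1]
            if k < len(zeros):
                candidates = zeros          # median has bit b clear
            else:
                k -= len(zeros)             # discard values with bit b clear
                candidates = [x for x in candidates if (x >> b) & 1]
        return candidates[0]

    def running_median(stream, N, W):
        # Running median over the last N samples of a W-bit stream.
        buf = deque(maxlen=N)
        for x in stream:
            buf.append(x)
            if len(buf) == N:
                yield window_median(buf, W)

    assert list(running_median([5, 1, 9, 3, 7], N=3, W=4)) == [5, 3, 7]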
Abstract:
This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to process them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements of complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk creating gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context; the encoding of facts with context into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with associated context. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement and use multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as captured by the respective OLTP system.
Abstract:
Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium's Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we describe the design and implementation of a new “quality-enabled” profile of WMS, which we call “WMS-Q”. This profile describes how information about data quality can be conveyed to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. Finally, we describe new open-source implementations of these specifications, including both clients and servers.
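For orientation, a WMS client retrieves map images through GetMap requests such as the one sketched below. The base parameters are standard WMS 1.3.0; the server URL, layer name and uncertainty-visualisation style name are hypothetical placeholders, and how WMS-Q itself exposes quality information is described in the paper:

    from urllib.parse import urlencode

    # Standard WMS 1.3.0 GetMap parameters; the endpoint, layer and
    # style names below are hypothetical placeholders.
    BASE_URL = "https://example.org/wms"

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "sea_surface_temperature",   # hypothetical data layer
        "STYLES": "uncertainty_contours",      # hypothetical SE style
        "CRS": "CRS:84",
        "BBOX": "-180,-90,180,90",
        "WIDTH": "800",
        "HEIGHT": "400",
        "FORMAT": "image/png",
    }

    print(BASE_URL + "?" + urlencode(params))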