47 results for Data compression (Electronic computers)


Relevance:

30.00%

Publisher:

Abstract:

New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We then explore the choice of proposal density in a Particle Filter and show how the ‘curse of dimensionality’ might be beaten. In the standard Particle Filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In high-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further ensuring that the particles end up with almost equal weights, we avoid performing model runs that would be discarded in the end. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
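The steering-and-weighting idea can be sketched on a scalar toy model. This is a hedged illustration only: the toy model, the steering strength `tau`, and the noise levels are assumptions for the sketch, not the authors' equivalent-weights scheme or the Lorenz 1995 setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # toy nonlinear forecast model (illustrative stand-in for Lorenz 1995)
    return x + 0.1 * np.sin(x)

def steered_pf_step(particles, weights, y_obs,
                    sigma_obs=0.5, sigma_model=0.2, tau=0.5):
    """One assimilation step with particles steered towards the observation.

    tau is the (assumed) steering strength; tau = 0 recovers the standard
    bootstrap proposal.  The importance weights are corrected for the
    modified proposal, keeping the filter statistically consistent.
    """
    forecast = model(particles)
    nudge = tau * (y_obs - forecast)              # steer towards the data
    noise = sigma_model * rng.standard_normal(particles.shape)
    proposed = forecast + nudge + noise
    # weight update: likelihood * transition / proposal
    log_lik = -0.5 * ((y_obs - proposed) / sigma_obs) ** 2
    log_trans = -0.5 * ((proposed - forecast) / sigma_model) ** 2
    log_prop = -0.5 * ((proposed - forecast - nudge) / sigma_model) ** 2
    logw = np.log(weights) + log_lik + log_trans - log_prop
    logw -= logw.max()                            # avoid underflow
    w = np.exp(logw)
    return proposed, w / w.sum()

particles = rng.standard_normal(100)
weights = np.full(100, 0.01)
particles, weights = steered_pf_step(particles, weights, y_obs=1.0)
```

With `tau > 0` the proposed particles land near the observation, so far fewer particles receive negligible weight than under the pure Monte Carlo proposal.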

Relevance:

30.00%

Publisher:

Abstract:

A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. To counteract the use of high-level, top-down modeling efforts, and to increase result accuracy, the focus was placed on device details and data routes. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides both ESD and physical distribution options. The ESD method included the calculation of the power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included, to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and the networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was apportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used instead of physical distribution. The results also highlighted the importance of the server efficiency and utilization methods.
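The energy-accounting structure described above (server + per-hop network transfer + client) can be sketched as a toy calculation. Every coefficient below is a made-up placeholder for illustration, not a figure from the study.

```python
def esd_energy_kwh(file_gb, n_hops,
                   server_kwh_per_gb=0.01,
                   network_kwh_per_gb_per_hop=0.002,
                   client_kwh_per_gb=0.005):
    """Energy for one electronic delivery, split the way the model above is
    structured: data-center servers, per-hop network transfer, and the
    client device doing the browsing/downloading.

    All coefficients are illustrative placeholders, not measured values.
    """
    server = server_kwh_per_gb * file_gb
    network = network_kwh_per_gb_per_hop * file_gb * n_hops
    client = client_kwh_per_gb * file_gb
    return server + network + client

# e.g. a 1 GB download traversing 10 network hops
energy = esd_energy_kwh(1.0, 10)
```

With real measured coefficients, the same additive structure supports scenario comparisons such as the 83% saving reported above.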

Relevance:

30.00%

Publisher:

Abstract:

It is widely recognized that small businesses with fewer than 50 employees make significant contributions to the prosperity of local, regional, and national economies. They are a major source of job creation and a driving force of economic growth for developed countries such as the USA (Headd, 2005; SBA, 2005) and the UK (Dixon, Thompson, & McAllister, 2002; SBS, 2005), for Europe (European Commission, 2003), and for developing countries such as China (Bo, 2005). This economic potential is further strengthened when firms collaborate with each other, for example through the formation of a supply chain, strategic alliances, or the sharing of information and resources (Horvath, 2001; O’Donnell, Gilmore, Cummins, & Carson, 2001; MacGregor, 2004; Todeva & Knoke, 2005). Owing to heterogeneous aspects of small businesses, such as firm size and business sector, a single e-business solution is unlikely to be suitable for all firms (Dixon et al., 2002; Taylor & Murphy, 2004a); however, collaboration requires individual firms to adopt standardized, simplified solutions based on open architectures and data design (Horvath, 2001). The purpose of this article is to propose a conceptual e-business framework and a generic e-catalogue, which enable small businesses to collaborate through the creation of an e-marketplace. To assist with this task, an analysis of data from 6,000 small businesses situated within a locality of Greater Manchester, England, in the context of an e-business portal, is incorporated within this study.

Relevance:

30.00%

Publisher:

Abstract:

In numerical weather prediction (NWP), data assimilation (DA) methods are used to combine available observations with numerical model estimates. This is done by minimising measures of error on both observations and model estimates, with more weight given to data that can be more trusted. Any DA method requires an estimate of the initial forecast error covariance matrix. For convective-scale data assimilation, however, the properties of the error covariances are not well understood. An effective way to investigate covariance properties in the presence of convection is to use an ensemble-based method, for which an estimate of the error covariance is readily available at each time step. In this work, we investigate the performance of the ensemble square root filter (EnSRF) in the presence of cloud growth, applied to an idealised 1D convective-column model of the atmosphere. We show that the EnSRF performs well in capturing cloud growth, but the ensemble does not cope well with the discontinuities introduced into the system by parameterised rain. The state estimates lose accuracy and, more importantly, the ensemble is unable to capture the spread (variance) of the estimates correctly. We also find, counter-intuitively, that by reducing the spatial frequency of the observations and/or the accuracy of the observations, the ensemble is able to capture the states and their variability successfully across all regimes.
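For reference, the deterministic EnSRF update can be sketched for a single scalar observation. This is one common square-root form (mean updated with the Kalman gain, perturbations with a reduced gain so no perturbed observations are needed); the ensemble size, state dimension, and observation operator below are illustrative assumptions, not the paper's convective-column configuration.

```python
import numpy as np

def ensrf_update(ens, H, y, r):
    """Deterministic ensemble square root filter update for one scalar
    observation (no perturbed observations).

    ens : (n_state, n_members) ensemble matrix
    H   : (n_state,) linear observation operator row
    y   : observed value;  r : observation-error variance
    """
    n = ens.shape[1]
    xbar = ens.mean(axis=1, keepdims=True)
    X = ens - xbar                          # ensemble perturbations
    hx = H @ ens                            # ensemble mapped to obs space
    hX = hx - hx.mean()
    phht = (hX @ hX) / (n - 1)              # scalar H P H^T
    pht = (X @ hX) / (n - 1)                # P H^T
    K = pht / (phht + r)                    # Kalman gain (mean update)
    alpha = 1.0 / (1.0 + np.sqrt(r / (phht + r)))
    xbar_a = xbar[:, 0] + K * (y - hx.mean())
    Xa = X - np.outer(alpha * K, hX)        # reduced-gain perturbation update
    return xbar_a[:, None] + Xa

# demo: 50-member ensemble, 3 state variables, observe the first variable
rng = np.random.default_rng(1)
ens = rng.standard_normal((3, 50))
H = np.array([1.0, 0.0, 0.0])
analysis = ensrf_update(ens, H, y=0.5, r=0.25)
```

The factor `alpha` is chosen so that the analysis ensemble variance in observation space matches the Kalman filter posterior variance exactly.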

Relevance:

30.00%

Publisher:

Abstract:

Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
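The contrast between the two regularizations can be sketched on a toy least-squares problem. This is a generic Tikhonov solve and an ISTA solver for the L1 problem, under the assumption of a sparse "spiky" signal as a crude analogue of a sharp front; it is not the paper's data-assimilation setup.

```python
import numpy as np

def tikhonov(A, b, lam):
    """L2 (Tikhonov) estimate: minimise ||Ax - b||^2 + lam * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def l1_ista(A, b, lam, n_iter=500):
    """L1 estimate via ISTA: minimise 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2        # step size from spectral norm
    for _ in range(n_iter):
        g = x - t * (A.T @ (A @ x - b))        # gradient step on the L2 term
        x = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0.0)  # soft threshold
    return x

# a 'sharp' (sparse) signal: one isolated spike
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[2] = 3.0
b = A @ x_true
x_l2 = tikhonov(A, b, 1.0)
x_l1 = l1_ista(A, b, 1.0)
```

The L2 estimate smears energy over every component, while the L1 estimate keeps the inactive components at exactly zero, which is why L1 penalties preserve sharp features better.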

Relevance:

30.00%

Publisher:

Abstract:

This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. We therefore present a suitable integration scheme that handles the stiffening of the differential equations involved without adding further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. The advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement on the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding distortion of the mean value of the function. Using statistical significance tests at both the local and the field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time-stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
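The RAW filter itself is simple to state in code. Below is a sketch on a linear oscillation test problem, dx/dt = iωx (a standard testbed, not the SPEEDY model); the parameter values are illustrative assumptions.

```python
import numpy as np

def leapfrog_raw(omega=1.0, dt=0.1, n_steps=200, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = i*omega*x with the RAW filter.

    nu is the Robert-Asselin filter coefficient; alpha is the Williams
    parameter (alpha = 1 recovers the classical Robert-Asselin filter).
    The exact solution has |x| = 1 for all time.
    """
    x_prev = 1.0 + 0.0j                        # x at t = 0
    x_curr = np.exp(1j * omega * dt)           # exact value at t = dt
    for _ in range(n_steps):
        x_next = x_prev + 2.0 * dt * 1j * omega * x_curr   # leapfrog step
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)    # filter displacement
        x_prev = x_curr + alpha * d            # filtered middle time level
        x_curr = x_next + (alpha - 1.0) * d    # RAW correction to the new level
    return x_curr

amp_raw = abs(leapfrog_raw(alpha=0.53))   # RAW filter
amp_ra = abs(leapfrog_raw(alpha=1.0))     # classical Robert-Asselin filter
```

The extra `(alpha - 1) * d` correction to the newest time level is what removes most of the amplitude damping of the classical filter while still suppressing the computational mode.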

Relevance:

30.00%

Publisher:

Abstract:

Naphthalene and anthracene transition metalates are potent reagents, but their electronic structures have remained poorly explored. A study of four Cp*-substituted iron complexes (Cp* = pentamethylcyclopentadienyl) now gives rare insight into the bonding features of such species. The highly oxygen- and water-sensitive compounds [K(18-crown-6){Cp*Fe(η4-C10H8)}] (K1), [K(18-crown-6){Cp*Fe(η4-C14H10)}] (K2), [Cp*Fe(η4-C10H8)] (1), and [Cp*Fe(η4-C14H10)] (2) were synthesized and characterized by NMR, UV-vis, and 57Fe Mössbauer spectroscopy. The paramagnetic complexes 1 and 2 were additionally characterized by electron paramagnetic resonance (EPR) spectroscopy and magnetic susceptibility measurements. The molecular structures of complexes K1, K2, and 2 were determined by single-crystal X-ray crystallography. Cyclic voltammetry of 1 and 2 and spectroelectrochemical experiments revealed the redox properties of these complexes, which are reversibly reduced to the monoanions [Cp*Fe(η4-C10H8)]− (1−) and [Cp*Fe(η4-C14H10)]− (2−) and reversibly oxidized to the cations [Cp*Fe(η6-C10H8)]+ (1+) and [Cp*Fe(η6-C14H10)]+ (2+). Reduced orbital charges and spin densities of the naphthalene complexes 1−/0/+ and the anthracene derivatives 2−/0/+ were obtained by density functional theory (DFT) methods. Analysis of these data suggests that the electronic structures of the anions 1− and 2− are best represented by low-spin FeII ions coordinated by anionic Cp* and dianionic naphthalene and anthracene ligands. The electronic structures of the neutral complexes 1 and 2 may be described by a superposition of two resonance configurations which involve, on the one hand, a low-spin FeI ion coordinated by the neutral naphthalene or anthracene ligand L and, on the other hand, a low-spin FeII ion coordinated to a ligand radical L•−.
Our study thus reveals the redox-noninnocent character of the naphthalene and anthracene ligands, which effectively stabilize the iron atoms in a low formal, but significantly higher spectroscopic, oxidation state.

Relevance:

30.00%

Publisher:

Abstract:

Climate modelling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce these data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid the digital preservation of climate models, as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.

Relevance:

30.00%

Publisher:

Abstract:

Pocket Data Mining (PDM) describes the full process of analysing data streams in mobile, ad hoc, distributed environments. Advances in mobile devices like smart phones and tablet computers have made it possible for a wide range of applications to run in such an environment. In this paper, we propose the adoption of data stream classification techniques for PDM. A thorough experimental study shows that running heterogeneous (different) or homogeneous (similar) data stream classification techniques over vertically partitioned data (data partitioned according to the feature space) results in performance comparable to batch, centralised learning techniques.
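The vertical-partitioning idea can be sketched with a deliberately simple online learner standing in for the data-stream classifiers (the PDM work uses stream techniques such as Hoeffding trees). The perceptron, the majority vote, and the synthetic stream below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

class OnlinePerceptron:
    """Minimal online learner standing in for a data-stream classifier."""
    def __init__(self, n_features):
        self.w = np.zeros(n_features + 1)          # weights plus bias
    def predict(self, x):
        return 1 if self.w @ np.append(x, 1.0) > 0.0 else 0
    def learn(self, x, y):
        if self.predict(x) != y:
            self.w += (2 * y - 1) * np.append(x, 1.0)

def pdm_vertical(stream_X, stream_y, partitions):
    """Prequential (test-then-train) accuracy when each 'device' sees only
    its own vertical slice of the feature space; the devices' predictions
    are combined by simple majority vote."""
    learners = [OnlinePerceptron(len(p)) for p in partitions]
    correct = 0
    for x, y in zip(stream_X, stream_y):
        votes = [lrn.predict(x[p]) for lrn, p in zip(learners, partitions)]
        if int(sum(votes) > len(votes) / 2) == y:
            correct += 1
        for lrn, p in zip(learners, partitions):
            lrn.learn(x[p], y)
    return correct / len(stream_y)

# synthetic stream: the label depends on all six features, so every
# 'device' partition carries part of the signal
rng = np.random.default_rng(0)
X = rng.standard_normal((600, 6))
y = (X.sum(axis=1) > 0).astype(int)
acc = pdm_vertical(X, y, [[0, 1], [2, 3], [4, 5]])
```

No learner ever sees the full feature vector, yet the vote over the slices recovers a usable classifier, which is the essence of the vertically partitioned setting.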

Relevance:

30.00%

Publisher:

Abstract:

The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’, which lead to the spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, therefore, we can identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition, we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
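The fingerprint/cross-correlation idea can be sketched on synthetic echo time series. Using the squared correlation coefficient as the leaked-power fraction follows from a simple additive-leakage model and is an illustrative reading of the approach, not the article's exact algorithm; all signal levels below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaked_power_fraction(ts_a, ts_b):
    """Estimate the fraction of gate B's power leaked in from gate A.

    Under an additive model b = b_own + c * a with independent echoes,
    the fraction of B's power that came from A equals rho**2, where rho
    is the correlation coefficient of the two echo time series.
    """
    rho = np.corrcoef(ts_a, ts_b)[0, 1]
    return rho ** 2

# synthetic echoes: each gate's echo fluctuates at random and is unique to
# that gate; a range sidelobe leaks part of gate A's echo into gate B
n = 20000
gate_a = rng.standard_normal(n)          # strong echo near a sharp gradient
clean_b = 0.5 * rng.standard_normal(n)   # gate B's own (independent) echo
gate_b = clean_b + 0.5 * gate_a          # leaked amplitude from gate A

f = leaked_power_fraction(gate_a, gate_b)
corrected_power = np.var(gate_b) * (1.0 - f)   # remove the leaked power
```

In this construction half of gate B's power is leakage, and the estimator recovers that fraction from the correlation alone, allowing the reflectivity at gate B to be corrected.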

Relevance:

30.00%

Publisher:

Abstract:

The dissymmetrical naphthalene-bridged complexes [Cp′Fe(μ-C10H8)FeCp*] (3; Cp* = η5-C5Me5, Cp′ = η5-C5H2-1,2,4-tBu3) and [Cp′Fe(μ-C10H8)RuCp*] (4) were synthesized via a one-pot procedure from FeCl2(thf)1.5, Cp′K, KC10H8, and [Cp*FeCl(tmeda)] (tmeda = N,N,N′,N′-tetramethylethylenediamine) or [Cp*RuCl]4, respectively. The symmetrically substituted iron-ruthenium complex [Cp*Fe(μ-C10H8)RuCp*] (5), bearing two Cp* ligands, was prepared as a reference compound. Compounds 3−5 are diamagnetic and display similar molecular structures, in which the metal atoms are coordinated to opposite sides of the bridging naphthalene molecule. Cyclic voltammetry and UV/vis spectroelectrochemistry studies revealed that neutral 3−5 can be oxidized to the monocations 3+−5+ and the dications 32+−52+. The chemical oxidation of 3 and 4 with [Cp2Fe]PF6 afforded the paramagnetic hexafluorophosphate salts [Cp′Fe(μ-C10H8)FeCp*]PF6 ([3]PF6) and [Cp′Fe(μ-C10H8)RuCp*]PF6 ([4]PF6), which were characterized by various spectroscopic techniques, including EPR and 57Fe Mössbauer spectroscopy. The molecular structure of [4]PF6 was determined by X-ray crystallography. DFT calculations support the structural and spectroscopic data and determine the compositions of the frontier molecular orbitals in the investigated complexes. The effects of substituting Cp* with Cp′, and Fe with Ru, on the electronic structures and the structural and spectroscopic properties are analyzed.

Relevance:

30.00%

Publisher:

Abstract:

A potential problem with the Ensemble Kalman Filter is the implicit Gaussian assumption at analysis times. Here we explore the performance of a recently proposed fully nonlinear particle filter, in which no Gaussian assumption is made, on a high-dimensional but simplified ocean model. The model simulates the evolution of the vorticity field in time, described by the barotropic vorticity equation, in a highly nonlinear flow regime. While common knowledge is that particle filters are inefficient and need large numbers of model runs to avoid degeneracy, the newly developed particle filter needs only of the order of 10-100 particles on large-scale problems. The crucial new ingredient is that the proposal density can be used not only to ensure that all particles end up in high-probability regions of state space as defined by the observations, but also to ensure that most of the particles have similar weights. Using identical-twin experiments, we find that the ensemble mean follows the truth reliably and that the difference from the truth is captured by the ensemble spread. A rank histogram is used to show that the truth run is indistinguishable from any of the particles, demonstrating the statistical consistency of the method.
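The rank-histogram consistency check mentioned at the end can be sketched as follows, with synthetic scalar data standing in for the vorticity field: a statistically consistent ensemble yields a flat histogram of the truth's rank among the members.

```python
import numpy as np

def rank_histogram(truth, ensembles):
    """Counts of the truth's rank within each ensemble forecast.

    truth     : (n_times,) verifying values
    ensembles : (n_times, n_members) ensemble values
    A statistically consistent ensemble gives a flat histogram over the
    n_members + 1 possible ranks; a U-shape indicates underdispersion.
    """
    ranks = (ensembles < truth[:, None]).sum(axis=1)
    n_members = ensembles.shape[1]
    return np.bincount(ranks, minlength=n_members + 1)

# consistent case: truth and members drawn from the same distribution
rng = np.random.default_rng(0)
truth = rng.standard_normal(5000)
ens = rng.standard_normal((5000, 9))
counts = rank_histogram(truth, ens)
```

Here each of the ten rank bins receives roughly 500 of the 5000 cases, i.e. the truth is indistinguishable from any member, which is the property the experiments above verify.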

Relevance:

30.00%

Publisher:

Abstract:

Owing to continuous advances in the computational power of handheld devices like smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, on board these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on one mobile device running data mining processes. However, it was not until 2010 that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting the seamless communication among handheld devices to perform data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment of this emerging area of research. Details of the techniques used and thorough experimental studies are given. More importantly, and exclusive to this book, the authors provide a detailed practical guide to the deployment of PDM in the mobile environment. An important extension to the basic implementation of PDM, dealing with concept drift, is also reported. In the era of Big Data, potential applications of paramount importance offered by PDM in a variety of domains, including security, business, and telemedicine, are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Smart healthcare is a complex domain for systems integration, owing to the human and technical factors and heterogeneous data sources involved. As part of a smart city, it is an area in which clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many challenges regarding integration and interoperability, such as information collision, heterogeneous data sources, policy obstacles, and procedural mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an ongoing project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among the Radiology Information System (RIS), the Electronic Patient Record (EPR), and the Picture Archiving and Communication System (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation, including publications and internal working papers, one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists, and secretaries. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.