920 results for Distributed Control Problems
Abstract:
In this paper we explore classification techniques for ill-posed problems. Two classes are linearly separable in some Hilbert space X if they can be separated by a hyperplane. We investigate stable separability, i.e. the case where we have a positive distance between two separating hyperplanes. When the data in the space Y are generated by a compact operator A applied to the system states x ∈ X, we show that in general we do not obtain stable separability in Y, even if the problem in X is stably separable. In particular, we show this for the case where a nonlinear classification is generated from a non-convergent family of linear classes in X. We apply our results to the problem of quality control of fuel cells, where we classify fuel cells according to their efficiency. We can potentially classify a fuel cell using either some externally measured magnetic field or some internal current. However, we cannot measure the current directly, since we cannot access the fuel cell in operation. The first possibility is to apply discrimination techniques directly to the measured magnetic fields. The second approach first reconstructs the currents and then carries out the classification on the current distributions. We show that both approaches need regularization and that the regularized classifications are not equivalent in general. Finally, we investigate a widely used linear classification algorithm, Fisher's linear discriminant, with respect to its ill-posedness when applied to data generated via a compact integral operator. We show that the method does not remain stable when the number of measurement points becomes large.
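To make the instability concrete, here is a minimal numerical sketch (an illustrative assumption, not the paper's fuel-cell model): two stably separable classes of states are pushed through a discretised Gaussian integral operator, and the within-class scatter matrix that Fisher's discriminant must invert becomes increasingly ill-conditioned as the number of measurement points n grows, so a Tikhonov-style regularization parameter alpha is needed.

```python
import numpy as np

def integral_operator(n, width=0.1):
    """Discretised compact operator (Af)(s) = int k(s,t) f(t) dt, Gaussian kernel."""
    t = np.linspace(0, 1, n)
    K = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * width ** 2))
    return K / n                                   # quadrature weight 1/n

def fisher_direction(Y0, Y1, alpha=0.0):
    """Fisher LDA direction w = (S_w + alpha I)^{-1} (m1 - m0)."""
    Sw = np.cov(Y0, rowvar=False) + np.cov(Y1, rowvar=False)
    return np.linalg.solve(Sw + alpha * np.eye(Sw.shape[0]),
                           Y1.mean(axis=0) - Y0.mean(axis=0))

rng = np.random.default_rng(0)
for n in (20, 80, 320):                            # number of measurement points
    t = np.linspace(0, 1, n)
    A = integral_operator(n)
    f0 = np.sin(np.pi * t) + 0.1 * rng.standard_normal((4 * n, n))
    f1 = f0 + 0.5                                  # shifted copy: stably separable in X
    Y0, Y1 = f0 @ A.T, f1 @ A.T                    # measured data in Y = A(X)
    Sw = np.cov(Y0, rowvar=False) + np.cov(Y1, rowvar=False)
    w_reg = fisher_direction(Y0, Y1, alpha=1e-8)   # regularised direction stays usable
    print(f"n={n:4d}  cond(Sw)={np.linalg.cond(Sw):.2e}  |w_reg|={np.linalg.norm(w_reg):.2e}")
```

The printed condition number of the scatter matrix grows rapidly with n, which is the discrete footprint of the compactness of A; without the regularization term the discriminant direction blows up.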
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data, which is used to populate the second component, a data warehouse containing important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: firstly, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories. The second aspect is important and challenging for P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.
Abstract:
We study a two-way relay network (TWRN), where distributed space-time codes are constructed across multiple relay terminals in an amplify-and-forward mode. Each relay transmits a scaled linear combination of its received symbols and their conjugates, with the scaling factor chosen based on automatic gain control. We consider equal power allocation (EPA) across the relays, as well as the optimal power allocation (OPA) strategy given access to instantaneous channel state information (CSI). For EPA, we derive an upper bound on the pairwise error probability (PEP), from which we prove that full diversity is achieved in TWRNs. This result is in contrast to one-way relay networks, where a maximum diversity order of only unity can be obtained. When instantaneous CSI is available at the relays, we show that the OPA which minimizes the conditional PEP of the worse link can be cast as a generalized linear fractional program, which can be solved efficiently using a Dinkelbach-type procedure. We also prove that, if the sum-power of the relay terminals is constrained, then the OPA will activate at most two relays.
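The Dinkelbach-type idea can be sketched as follows: to maximise the minimum of several linear ratios over a polyhedron, repeatedly solve the linear subproblem max_x min_i [N_i(x) - λ D_i(x)] and update λ with the best ratio found, stopping when the subproblem value reaches zero. The sketch below is a generic illustration on made-up data with a sum-power-style constraint, not the relay-network PEP objective from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def dinkelbach(a, alpha, b, beta, P, tol=1e-8, max_iter=50):
    """Maximise min_i (a_i@x + alpha_i)/(b_i@x + beta_i) over {x >= 0, sum(x) <= P}."""
    m, n = a.shape
    x = np.full(n, P / n)                            # feasible starting point
    lam = np.min((a @ x + alpha) / (b @ x + beta))   # initial ratio estimate
    for _ in range(max_iter):
        # LP subproblem: max t  s.t.  t <= (a_i - lam*b_i)@x + alpha_i - lam*beta_i
        c = np.zeros(n + 1); c[-1] = -1.0            # variables z = [x, t]; minimise -t
        A_ratio = np.hstack([-(a - lam * b), np.ones((m, 1))])
        A_sum = np.hstack([np.ones((1, n)), np.zeros((1, 1))])
        res = linprog(c, A_ub=np.vstack([A_ratio, A_sum]),
                      b_ub=np.append(alpha - lam * beta, P),
                      bounds=[(0, None)] * n + [(None, None)])
        x, F = res.x[:n], -res.fun                   # F = F(lam), zero at the optimum
        lam_new = np.min((a @ x + alpha) / (b @ x + beta))
        if abs(F) < tol:
            return x, lam_new
        lam = lam_new
    return x, lam

rng = np.random.default_rng(1)
a, b = rng.uniform(0.5, 1.5, (3, 4)), rng.uniform(0.5, 1.5, (3, 4))
x_opt, ratio = dinkelbach(a, np.ones(3), b, np.ones(3), P=1.0)
print(x_opt, ratio)
```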
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating observed meteorological data from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or, mostly, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been in intensive use for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the 12 hours that is most often applied, we may treat all observations as synoptic without any difficulty. No observation would then be more than 90 minutes off time, and even observations during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
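The 90-minute figure follows directly from the cycle length: with analyses every 3 hours, no observation is further than half a cycle, i.e. 1.5 hours, from the nearest analysis time. A small sketch (with an invented timestamp) of assigning an asynoptic observation to the nearest 3-hourly analysis time:

```python
from datetime import datetime, timedelta

CYCLE_H = 3  # analysis-forecasting cycle length in hours

def nearest_analysis_time(obs_time: datetime) -> datetime:
    """Round an observation time to the nearest 3-hourly analysis time."""
    seconds = obs_time.hour * 3600 + obs_time.minute * 60 + obs_time.second
    cycle_s = CYCLE_H * 3600
    rounded = round(seconds / cycle_s) * cycle_s
    midnight = obs_time.replace(hour=0, minute=0, second=0, microsecond=0)
    return midnight + timedelta(seconds=rounded)

obs = datetime(2024, 1, 1, 13, 25)        # a 13:25 asynoptic observation
t_analysis = nearest_analysis_time(obs)   # -> 12:00 analysis time
offset = abs(obs - t_analysis)
assert offset <= timedelta(minutes=90)    # never more than 90 minutes off time
print(t_analysis, offset)
```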
Abstract:
We derive energy-norm a posteriori error bounds, using gradient recovery (ZZ) estimators to control the spatial error, for fully discrete schemes for the linear heat equation. This appears to be the first completely rigorous derivation of ZZ estimators for fully discrete schemes for evolution problems, without any restrictive assumption on the timestep size. An essential tool for the analysis is the elliptic reconstruction technique. Our theoretical results are backed with extensive numerical experimentation aimed at (a) testing the practical sharpness and asymptotic behaviour of the error estimator against the error, and (b) deriving an adaptive method based on our estimators. An extra novelty is an implementation of a coarsening error "preindicator", with a complete implementation guide in ALBERTA in the appendix.
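For readers unfamiliar with gradient recovery, the following is a minimal 1-D sketch of a ZZ-type estimator using simple nodal averaging on an invented mesh; the paper's fully discrete parabolic setting with elliptic reconstruction is substantially more involved.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 11)          # mesh nodes
u = np.sin(np.pi * x)                  # nodal values of a piecewise-linear FE function

h = np.diff(x)                         # element sizes
grad = np.diff(u) / h                  # piecewise-constant gradient per element

# Recovered gradient: average the two neighbouring element gradients at each
# interior node (a simple nodal-averaging variant of ZZ recovery).
g = np.empty_like(x)
g[1:-1] = 0.5 * (grad[:-1] + grad[1:])
g[0], g[-1] = grad[0], grad[-1]

# Elementwise estimator: L2 norm of (recovered - discrete) gradient, with the
# recovered gradient varying linearly on each element; the exact integral of
# the square of a linear function e is h/3 * (e0^2 + e0*e1 + e1^2).
eta_sq = np.array([
    h[k] / 3.0 * ((g[k] - grad[k]) ** 2
                  + (g[k] - grad[k]) * (g[k + 1] - grad[k])
                  + (g[k + 1] - grad[k]) ** 2)
    for k in range(len(h))
])
print("total ZZ estimate:", np.sqrt(eta_sq.sum()))
```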
Abstract:
Climate data are used in a number of applications, including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. The available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, a short period of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-day time scale. This involves quality control of the rain gauge data, generating a locally calibrated version of the TAMSAT rainfall estimates, and combining these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as, or even better than, other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms. There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
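As a rough illustration of gauge-satellite merging (an illustrative stand-in, not the actual TAMSAT calibration and merging scheme), one simple approach corrects the satellite field by interpolating gauge-minus-satellite differences with inverse-distance weights:

```python
import numpy as np

def merge(sat_grid, grid_xy, gauge_xy, gauge_val, sat_at_gauges, power=2.0):
    """Return the satellite field corrected toward the gauges.

    sat_grid      : (n,) satellite estimate at each grid cell
    grid_xy       : (n, 2) grid cell coordinates
    gauge_xy      : (m, 2) gauge coordinates
    gauge_val     : (m,) gauge rainfall
    sat_at_gauges : (m,) satellite estimate at the gauge locations
    """
    diff = gauge_val - sat_at_gauges                      # gauge-satellite difference
    d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power                # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)                     # row-normalise
    return np.maximum(sat_grid + w @ diff, 0.0)           # rainfall stays >= 0

# Toy usage: a flat 10 mm satellite field corrected by one gauge reading 14 mm
grid = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0)), -1).reshape(-1, 2)
merged = merge(np.full(len(grid), 10.0), grid,
               np.array([[2.0, 2.0]]), np.array([14.0]), np.array([9.0]))
```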
Abstract:
Many studies have shown that farmers in developing countries often overuse pesticides and do not adopt safety practices. Policies and interventions to promote safer use of pesticides are often based on a limited understanding of the farmers’ own perspective on pesticide use. This often results in ineffective policies and the persistence of significant pesticide-related health and environmental problems, especially in developing countries. This chapter explores the potential and limitations of different approaches to studying pesticide use in agriculture from the farmers’ perspective. In contrast to the reductionist and mono-disciplinary approaches often adopted, this chapter calls for integrative methodological approaches to provide a realistic and thorough understanding of the farmers’ perspective on pesticide use, and illustrates the added value of such an approach with three case studies of pesticide use in Iran, India, and Colombia.
Abstract:
The investigation of bilingualism and cognition has been enriched by recent developments in functional magnetic resonance imaging (fMRI). Extending the investigation of how bilingual experience shapes cognition, this review examines recent fMRI studies that adopt executive control tasks with minimal or no linguistic demands. Across a range of studies with divergent ages and language pairs spoken by bilinguals, brain regions supporting executive control significantly overlap with brain regions recruited for language control (Abutalebi & Green, this issue). Furthermore, the limited but emerging studies of resting-state networks are addressed, which suggest more coherent spatially distributed functional connectivity in bilinguals. Given the dynamic nature of bilingual experience, it is essential to consider both task-related functional networks (externally driven engagement) and resting-state networks, such as the default mode network (internal control). Both types of networks are important elements of bilingual language control, which relies on domain-general executive control.
Abstract:
A Petri net is usually applied as a tool for modelling RFID systems. This paper, in contrast, presents another approach to Petri nets for RFID systems. This approach, called elementary Petri net inside an RFID distributed database, or PNRD, is a first step towards improving the integration of RFID and control systems. It is based on a formal data structure that identifies and updates the product state during real-time process execution, allowing automatic discovery of unexpected events during tag data capture. There are two main features in this approach: RFID tags are used as the database of the expected object process and of the last identified product state; and Petri net analysis is applied to automatically update the last product state record during reader data capture. In Petri net terms, RFID reader data capture can be viewed as a direct analysis of the locality of a specific transition firing within a specific workflow. Following this direction, each RFID reader is expected to store the list of Petri net control vectors related to each tag ID. This paper presents the cornerstones of PNRD and an example PNRD implementation in a software system called DEMIS (Distributed Environment in Manufacturing Information Systems).
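The core state update behind such an approach is the standard Petri net firing rule: the marking (the product state carried on the tag) advances by a column of the incidence matrix whenever an enabled transition fires, and a disabled transition signals an unexpected event. A toy sketch follows; the two-step workflow is invented, not the DEMIS model.

```python
import numpy as np

pre = np.array([[1, 0],          # input arcs:  places x transitions
                [0, 1],
                [0, 0]])
post = np.array([[0, 0],         # output arcs: places x transitions
                 [1, 0],
                 [0, 1]])
C = post - pre                   # incidence matrix

def fire(marking, t):
    """Fire transition t if enabled; otherwise flag an unexpected event."""
    if np.any(marking < pre[:, t]):
        raise ValueError(f"unexpected event: transition {t} not enabled")
    return marking + C[:, t]

m = np.array([1, 0, 0])          # product state stored on the RFID tag
m = fire(m, 0)                   # reader at station 0 captures the tag
m = fire(m, 1)                   # reader at station 1 captures the tag
print(m)                         # -> [0 0 1], workflow complete
```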
Abstract:
The aim of this thesis is to describe and analyze the geographical distribution of everyday criminality in the town of Borlänge during the year 2002 and to analyze which measures should be taken in physical social planning to decrease this everyday criminality there. The term everyday criminality is here understood as those categories of crime that appear most frequently in the records of reports to the police every year. Two kinds of crime have been in focus: thefts from cars and office burglary.

In fulfilling this aim, two main questions have been answered. The first is how everyday criminality was distributed geographically in the town of Borlänge during the year 2002. The second is which measures should be taken in physical social planning to decrease this everyday criminality in the town of Borlänge. To answer the first question, a spatial autocorrelation analysis, Local Moran LISA, has been used. This method is based on the measure Moran's I and shows the spatial autocorrelation for every single location. To answer the second question, three different theories of crime prevention through environmental design have been studied and applied in the analysis: Jane Jacobs' ideas about "the living city", Oscar Newman's ideas about "defensible space", and Ronald V. Clarke's theories about crime prevention.

The major conclusions that can be drawn from this thesis are that the risk of being exposed to thefts from cars, during the analyzed time period, was highest in Centrum and Hagalund and their surroundings, while the lowest risk of being exposed to this type of crime was found in Domnarvet and Islingby. The highest risk of being a victim of office burglary was found in Hagalund and its surroundings and in the single area of Kvarnsveden; the corresponding lowest risk was found in Lergärdet and its surroundings and in Norra Backa and Kupolen.

The measures that should be taken to decrease these types of criminality can be divided into overall changes and place-specific changes. For thefts from cars, the suggested overall changes are a more attractive central business district, a better view of parking lots from nearby buildings, division of larger parking-lot zones into smaller ones, relocation of hidden parking lots, and stronger access control at parking lots where problems with this kind of crime have occurred. The corresponding place-specific changes are to remove vegetation that blocks the view, improve lighting, and put up signs with information about the increased risk of exposure to crime at the parking lots with the most problems. To decrease the number of office burglaries, overall changes such as creating a better view of the area from the nearby surroundings, moving bigger office compartments or dividing them into smaller units, rebuilding characteristic buildings, and strengthening access control at offices with these kinds of problems could be useful. Finally, office burglary can be decreased by place-specific measures such as surveillance cameras combined with signs informing about them, high fences, and better lighting around the buildings where a higher risk of exposure to this kind of criminality is present.
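As background for the method, here is a minimal sketch of the Local Moran's I statistic underlying the LISA analysis; the locations, crime counts, and k-nearest-neighbour weights are invented for illustration.

```python
import numpy as np

def local_morans_i(values, W):
    """Local Moran's I_i = z_i * sum_j w_ij z_j, with z standardised."""
    z = (values - values.mean()) / values.std()
    return z * (W @ z)

rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, (30, 2))           # 30 locations in a 10 x 10 town
crime = rng.poisson(5, 30).astype(float)   # reported crime counts per location

# Row-standardised k-nearest-neighbour spatial weights (k = 4)
d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
np.fill_diagonal(d, np.inf)                # exclude self-neighbourhood
W = np.zeros_like(d)
for i, nbrs in enumerate(np.argsort(d, axis=1)[:, :4]):
    W[i, nbrs] = 1.0 / 4.0

I = local_morans_i(crime, W)
print(I[:5])   # positive -> local cluster (hot/cold spot), negative -> spatial outlier
```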
Abstract:
The woodworking industry still suffers from wood dust problems. Young workers are especially vulnerable to safety risks. To reduce risks, it is important to change attitudes towards, and increase knowledge about, safety. Safety training has been shown to establish positive attitudes towards safety among employees. The aim of the current study is to analyze the effect of QR codes that link to Picture Mix EXposure (PIMEX) videos, by analyzing students' responses regarding their attitudes to this safety training method and to safety. Safety training videos were used in upper secondary school handicraft programs to demonstrate wood dust risks and methods to decrease exposure to wood dust. A preliminary study was conducted in two schools to investigate improvements to safety training, in preparation for the main study, which investigated a safety training method in three schools. In the preliminary study, the PIMEX method was first used: students were filmed while their wood dust exposure was measured and displayed on a computer screen in real time. Before and after the filming, teachers, students, and researchers together analyzed wood dust risks and effective measures to reduce exposure to them. For the main study, QR codes linking to PIMEX videos were attached to wood processing machines. Subsequent interviews showed that this safety training method enables students, at an early stage of their lives, to learn about risks and about safety measures to control wood dust exposure. The new combination of methods can create awareness among students, change their attitudes, and motivate them to work more actively to reduce wood dust.
Abstract:
Messaging middleware provides asynchronous communication between services in distributed environments. However, security, reliability, and performance issues compel such middleware to be distributed, and distribution brings its own problems, such as identifying messaging channels that can then be subscribed to. In particular, interested parties need to identify channels defined at remote locations without knowing the details of how they are defined. A common vocabulary using semantic descriptions offers a solution to this problem. In this paper, we describe the design and implementation of federated messaging middleware using semantic descriptions of channels.
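One way to picture semantic channel discovery (a hypothetical sketch; the vocabulary terms, broker names, and data structure are assumptions, not the paper's implementation) is to publish each remote channel with a description drawn from a shared vocabulary and let subscribers match on meaning rather than on local channel names:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class ChannelDescription:
    broker: str           # where the channel is defined
    name: str             # local channel name (opaque to remote subscribers)
    topic: str            # term from the shared vocabulary
    payload: str          # term from the shared vocabulary

# Descriptions gathered from the federated brokers (invented examples)
federation = [
    ChannelDescription("broker-a.example.org", "ch42", "sensor/temperature", "json"),
    ChannelDescription("broker-b.example.org", "tmp",  "sensor/temperature", "xml"),
    ChannelDescription("broker-b.example.org", "log1", "audit/access",       "json"),
]

def discover(topic: str, payload: Optional[str] = None):
    """Find channels by semantic description, not by local name."""
    return [c for c in federation
            if c.topic == topic and (payload is None or c.payload == payload)]

for ch in discover("sensor/temperature", payload="json"):
    print(f"subscribe to {ch.name} on {ch.broker}")
```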
Abstract:
Applying optimization algorithms to PDE models of groundwater remediation can greatly reduce remediation costs. However, groundwater remediation analysis requires computationally expensive simulations, so effective parallel optimization could greatly reduce the computational expense. The optimization algorithm used in this research is Parallel Stochastic Radial Basis Function (RBF), which is designed for the global optimization of computationally expensive functions with multiple local optima and does not require derivatives. In each iteration of the algorithm, an RBF surrogate is updated based on all the evaluated points in order to approximate the expensive function. The new RBF surface is then used to generate the next set of points, which are distributed to multiple processors for evaluation. The criteria for selecting the next function evaluation points are the estimated function value and the distance from all known points. Algorithms created for serial computing are not necessarily efficient in parallel, so Parallel Stochastic RBF is a different algorithm from its serial ancestor. The algorithm is applied to two Groundwater Superfund Remediation sites: the Umatilla Chemical Depot and the Former Blaine Naval Ammunition Depot. The formulation adopted in this study treats pumping rates as decision variables in order to remove the plume of contaminated groundwater. Groundwater flow and contaminant transport are simulated with MODFLOW-MT3DMS. For both problems, computation takes a large amount of CPU time, especially for the Blaine problem, which requires nearly fifty minutes to simulate a single set of decision variables. Thus, an efficient algorithm and powerful computing resources are essential in both cases. The results are discussed in terms of parallel computing metrics, i.e. speedup and efficiency. We find that, with up to 24 parallel processors, the results of the parallel Stochastic RBF algorithm are excellent, with speedup efficiencies close to or exceeding 100%.
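A serial, single-iteration sketch of the surrogate step described above (with a toy quadratic standing in for the MODFLOW-MT3DMS simulation, and made-up weighting between the two selection criteria): fit an RBF to the evaluated points, score random candidates by estimated function value and by distance from known points, and select a batch to send to the parallel workers.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_objective(x):               # placeholder for the remediation simulator
    return np.sum((x - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(0)
dim, n_init, n_cand, batch = 4, 10, 500, 8

X = rng.uniform(0, 1, (n_init, dim))      # already-evaluated pumping-rate vectors
y = expensive_objective(X)

surrogate = RBFInterpolator(X, y)         # RBF fit to all evaluated points

cand = rng.uniform(0, 1, (n_cand, dim))   # random candidate points
pred = surrogate(cand)                    # estimated function value at candidates
dist = np.min(np.linalg.norm(cand[:, None] - X[None, :], axis=2), axis=1)

# Normalise both criteria to [0, 1]; a low predicted value and a large
# distance from known points are both desirable.
s_pred = (pred - pred.min()) / (np.ptp(pred) + 1e-12)
s_dist = 1.0 - (dist - dist.min()) / (np.ptp(dist) + 1e-12)
score = 0.5 * s_pred + 0.5 * s_dist

batch_idx = np.argsort(score)[:batch]     # next batch for parallel evaluation
print(cand[batch_idx])
```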