871 results for Large-scale Distribution


Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduce additional difficulty for assembly accuracy and error estimation. Planar surfaces as key product characteristics are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface in small components. To reduce the uncertainty of the plane measurement, an evaluation index of measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced for validating the evaluation index.
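As an aside, the kind of first-order propagation described here can be sketched numerically: with all errors normal and the transmission functions linearised, the output covariance is obtained by pushing the input covariance through the Jacobian. The Jacobian and error magnitudes below are purely illustrative assumptions, not values from the paper.

import numpy as np

# Hypothetical linearised error-transmission model: pose error = J @ (input errors).
J = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, -0.1],
              [0.0, 0.0, 1.0]])          # assumed Jacobian of the transmission function

sigma_meas = 0.05                         # assumed planar-surface measurement error (mm, 1 sigma)
sigma_fix = 0.02                          # assumed fixture positioning error (mm, 1 sigma)

# Independent, zero-mean normal errors give a diagonal input covariance.
cov_in = np.diag([sigma_meas**2, sigma_meas**2, sigma_fix**2])

# Linear propagation of covariance: cov_out = J cov_in J^T.
cov_out = J @ cov_in @ J.T
print("propagated covariance:\n", cov_out)
print("per-coordinate standard deviation:", np.sqrt(np.diag(cov_out)))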

Abstract:

The research presented in this paper is part of the SINBAD project, funded by STW (grant 12058) and EPSRC (grants EP/J00507X/1 and EP/J005541/1).

Abstract:

Many modern applications fall into the category of "large-scale" statistical problems, in which both the number of observations n and the number of features or parameters p may be large. Many existing methods focus on point estimation, despite the continued relevance of uncertainty quantification in the sciences, where the number of parameters to estimate often exceeds the sample size even as the value of n grows enormously in many fields. The tendency in some areas of industry to dispense with traditional statistical analysis on the grounds that "n=all" is therefore of little relevance outside of certain narrow applications. The main result of the Big Data revolution in most fields has instead been to make computation much harder without reducing the importance of uncertainty quantification. Bayesian methods excel at uncertainty quantification, but often scale poorly relative to alternatives. This conflict between the statistical advantages of Bayesian procedures and their substantial computational disadvantages is perhaps the greatest challenge facing modern Bayesian statistics, and is the primary motivation for the work presented here.

Two general strategies for scaling Bayesian inference are considered. The first is the development of methods that lend themselves to faster computation, and the second is design and characterization of computational algorithms that scale better in n or p. In the first instance, the focus is on joint inference outside of the standard problem of multivariate continuous data that has been a major focus of previous theoretical work in this area. In the second area, we pursue strategies for improving the speed of Markov chain Monte Carlo algorithms, and characterizing their performance in large-scale settings. Throughout, the focus is on rigorous theoretical evaluation combined with empirical demonstrations of performance and concordance with the theory.

One topic we consider is modeling the joint distribution of multivariate categorical data, often summarized in a contingency table. Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. In Chapter 2, we derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions.
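For reference, the latent class (PARAFAC) factorization referred to above expresses the joint probability mass function of p categorical variables as a finite mixture of product kernels; in generic notation (not necessarily that of the thesis):

\Pr(y_1 = c_1, \dots, y_p = c_p) = \sum_{h=1}^{k} \nu_h \prod_{j=1}^{p} \lambda^{(j)}_{h c_j}, \qquad \nu_h \ge 0, \quad \sum_{h=1}^{k} \nu_h = 1,

where each \lambda^{(j)}_{h\cdot} is a probability vector over the levels of y_j. The smallest k admitting such a representation is the nonnegative rank of the probability tensor, which is the quantity that Chapter 2 relates to the support of a log-linear model.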

Latent class models for the joint distribution of multivariate categorical data, such as the PARAFAC decomposition, play an important role in the analysis of population structure. In this context, the number of latent classes is interpreted as the number of genetically distinct subpopulations of an organism, an important factor in the analysis of evolutionary processes and conservation status. Existing methods focus on point estimates of the number of subpopulations, and lack robust uncertainty quantification. Moreover, whether the number of latent classes in these models is even an identified parameter is an open question. In Chapter 3, we show that when the model is properly specified, the correct number of subpopulations can be recovered almost surely. We then propose an alternative method for estimating the number of latent subpopulations that provides good quantification of uncertainty, and provide a simple procedure for verifying that the proposed method is consistent for the number of subpopulations. The performance of the model in estimating the number of subpopulations and other common population structure inference problems is assessed in simulations and a real data application.

In contingency table analysis, sparse data is frequently encountered for even modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization, but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis–Ylvisaker priors for the parameters of log-linear models do not give rise to closed form credible regions, complicating posterior inference. In Chapter 4 we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis–Ylvisaker priors, and provide convergence rate and finite-sample bounds for the Kullback–Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically in simulations and a real data application that the approximation is highly accurate, even in relatively small samples. The proposed approximation provides a computationally scalable and principled approach to regularized estimation and approximate Bayesian inference for log-linear models.

Another challenging and somewhat non-standard joint modeling problem is inference on tail dependence in stochastic processes. In applications where extreme dependence is of interest, data are almost always time-indexed. Existing methods for inference and modeling in this setting often cluster extreme events or choose window sizes with the goal of preserving temporal information. In Chapter 5, we propose an alternative paradigm for inference on tail dependence in stochastic processes with arbitrary temporal dependence structure in the extremes, based on the idea that the information on strength of tail dependence and the temporal structure in this dependence are both encoded in waiting times between exceedances of high thresholds. We construct a class of time-indexed stochastic processes with tail dependence obtained by endowing the support points in de Haan's spectral representation of max-stable processes with velocities and lifetimes. We extend Smith's model to these max-stable velocity processes and obtain the distribution of waiting times between extreme events at multiple locations. Motivated by this result, a new definition of tail dependence is proposed that is a function of the distribution of waiting times between threshold exceedances, and an inferential framework is constructed for estimating the strength of extremal dependence and quantifying uncertainty in this paradigm. The method is applied to climatological, financial, and electrophysiology data.

The remainder of this thesis focuses on posterior computation by Markov chain Monte Carlo (MCMC), the dominant paradigm for posterior computation in Bayesian analysis. It has long been common to control computation time by making approximations to the Markov transition kernel, but comparatively little attention has been paid to convergence and estimation error in the resulting approximating Markov chains. In Chapter 6, we propose a framework for assessing when to use approximations in MCMC algorithms, and how much error in the transition kernel should be tolerated to obtain optimal estimation performance with respect to a specified loss function and computational budget. The results require only ergodicity of the exact kernel and control of the kernel approximation accuracy. The theoretical framework is applied to approximations based on random subsets of data, low-rank approximations of Gaussian processes, and a novel approximating Markov chain for discrete mixture models.
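As a deliberately simplified illustration of one such approximation, the idea of replacing the exact transition kernel with one based on a random subset of data can be sketched as a Metropolis-Hastings step whose log-likelihood is estimated from a minibatch and rescaled to the full sample size. This is only a generic sketch under a toy Gaussian model with a flat prior; it is not the framework developed in Chapter 6.

import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=100_000)          # synthetic data, illustrative only

def subsampled_loglik(theta, y, m, rng):
    # Rescaled log-likelihood of a N(theta, 1) model from m randomly chosen observations.
    idx = rng.integers(0, len(y), size=m)
    return len(y) / m * np.sum(-0.5 * (y[idx] - theta) ** 2)

def approx_mh(n_iter=2_000, m=500, step=0.05):
    theta, chain = 0.0, np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.normal()
        # Both log-likelihoods are noisy estimates, so this chain targets the posterior
        # only approximately -- exactly the kind of kernel error studied in this framework.
        if np.log(rng.uniform()) < (subsampled_loglik(prop, y, m, rng)
                                    - subsampled_loglik(theta, y, m, rng)):
            theta = prop
        chain[t] = theta
    return chain

print(approx_mh()[-500:].mean())                # should sit near the sample mean of y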

Data augmentation Gibbs samplers are arguably the most popular class of algorithm for approximately sampling from the posterior distribution for the parameters of generalized linear models. The truncated Normal and Polya-Gamma data augmentation samplers are standard examples for probit and logit links, respectively. Motivated by an important problem in quantitative advertising, in Chapter 7 we consider the application of these algorithms to modeling rare events. We show that when the sample size is large but the observed number of successes is small, these data augmentation samplers mix very slowly, with a spectral gap that converges to zero at a rate at least proportional to the reciprocal of the square root of the sample size up to a log factor. In simulation studies, moderate sample sizes result in high autocorrelations and small effective sample sizes. Similar empirical results are observed for related data augmentation samplers for multinomial logit and probit models. When applied to a real quantitative advertising dataset, the data augmentation samplers mix very poorly. Conversely, Hamiltonian Monte Carlo and a type of independence chain Metropolis algorithm show good mixing on the same dataset.
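For concreteness, a minimal sketch of the truncated-normal (Albert-Chib) data augmentation Gibbs sampler for probit regression is given below; the prior, synthetic data, and iteration counts are illustrative assumptions, not the settings used in Chapter 7.

import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)
n, p = 5_000, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([-3.0, 0.5, -0.5])          # strongly negative intercept: rare successes
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

B0_inv = np.eye(p) / 100.0                       # N(0, 100 I) prior on beta (assumed)
V = np.linalg.inv(X.T @ X + B0_inv)              # covariance of beta | z
L = np.linalg.cholesky(V)

beta, draws = np.zeros(p), []
for it in range(2_000):
    # 1. Latent utilities z_i ~ N(x_i beta, 1), truncated to (0, inf) if y_i = 1, else (-inf, 0).
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)          # truncation bounds in standardised units
    hi = np.where(y == 1, np.inf, -mu)
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # 2. beta | z is Gaussian with mean V X'z and covariance V.
    beta = V @ (X.T @ z) + L @ rng.normal(size=p)
    draws.append(beta.copy())

print(np.mean(draws[500:], axis=0))              # posterior mean estimate of beta

With the rare-event configuration above, the intercept draws should show the high autocorrelation and small effective sample size described in this chapter.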

Abstract:

Graph analytics is an important and computationally demanding class of data analytics. Large-scale graph analytics must balance scalability, ease of use and high performance, which requires hiding the complexity of parallelism, data distribution and memory locality behind an abstract interface. The aim of this work is to build a NUMA-aware, scalable graph analytics framework that does not demand significant parallel programming experience.
The realization of such a system faces two key problems: (i) how to develop a scale-free parallel programming framework that scales efficiently across NUMA domains; (ii) how to efficiently apply graph partitioning in order to create separate and largely independent work items that can be distributed among threads.
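One common way to attack the second problem, sketched here purely for illustration (the partitioning heuristic and CSR layout are generic assumptions, not the framework described above), is to cut the vertex set into contiguous ranges whose edge counts are roughly equal, so that each range becomes an independent work item for a thread or NUMA domain.

import numpy as np

def edge_balanced_ranges(indptr, n_parts):
    # indptr is the CSR row-pointer array: indptr[v] = number of edges before vertex v.
    # Balancing on edges rather than vertices keeps per-thread work comparable even
    # for skewed degree distributions.
    n_edges = indptr[-1]
    targets = [(k + 1) * n_edges / n_parts for k in range(n_parts)]
    cuts = np.searchsorted(indptr, targets)
    ranges, start = [], 0
    for cut in cuts:
        ranges.append((start, int(cut)))
        start = int(cut)
    return ranges

# Example: 8 vertices, 120 edges, heavily skewed degrees.
indptr = np.array([0, 50, 52, 53, 60, 100, 101, 102, 120])
print(edge_balanced_ranges(indptr, n_parts=4))   # contiguous (start, end) vertex ranges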

Abstract:

Strong convective events can produce extreme precipitation, hail, lightning or gusts, potentially inducing severe socio-economic impacts. These events have a relatively small spatial extension and, in most cases, a short lifetime. In this study, a model is developed for estimating convective extreme events based on large-scale conditions. It is shown that strong convective events can be characterized by a Weibull distribution of radar-based rainfall with a low shape and a high scale parameter value. A radius of 90 km around a station reporting a convective situation turned out to be suitable. A methodology is developed to estimate the Weibull parameters, and thus the occurrence probability of convective events, from large-scale atmospheric instability and enhanced near-surface humidity, which are usually found on a larger scale than the convective event itself. Here, the probability for the occurrence of extreme convective events is estimated from the KO index, indicating stability, and relative humidity at 1000 hPa. Both variables are computed from the ERA-Interim reanalysis. In a first version of the methodology, these two variables are applied to estimate the spatial rainfall distribution and, from it, the occurrence of a convective event. The developed method shows significant skill in estimating the occurrence of convective events as observed at synoptic stations, in lightning measurements, and in severe weather reports. In order to take frontal influences into account, a scheme for the detection of atmospheric fronts is implemented. While generally higher instability is found in the vicinity of fronts, the skill of this approach is largely unchanged. Additional improvements were achieved by a bias correction and the use of ERA-Interim precipitation. The resulting estimation method is applied to the ERA-Interim period (1979-2014) to establish a ranking of estimated convective extreme events. Two strong estimated events that reveal a frontal influence are analysed in detail. As a second application, the method is applied to GCM-based decadal predictions in the period 1979-2014, which were initialized every year. It is shown that decadal predictive skill for convective event frequencies over Germany is found for the first 3-4 years after initialization.
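In outline, the Weibull characterization above can be reproduced as follows; the rain-rate values are synthetic stand-ins for the radar-based rainfall within the 90 km radius, and the link to the KO index and 1000 hPa humidity is not shown.

import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(2)
# Synthetic rain rates (mm/h), playing the role of radar-based rainfall around a station.
rain = weibull_min.rvs(c=0.7, scale=8.0, size=2_000, random_state=rng)

# Fit the shape and scale parameters with the location fixed at zero.
shape, loc, scale = weibull_min.fit(rain, floc=0)

# A low shape with a high scale implies a heavy upper tail: probability of a high rain rate.
p_exceed = weibull_min.sf(30.0, shape, loc=0, scale=scale)
print(f"shape={shape:.2f}  scale={scale:.1f}  P(rain > 30 mm/h)={p_exceed:.3f}")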

Abstract:

Large-scale enzymatic resolution of racemic sulcatol 2 was achieved by stereoselective biocatalysis. The reaction was fast and selective, using vinyl acetate as the acyl donor and lipase from Candida antarctica (CALB) as the catalyst. The large-scale reaction (5.0 g, 39 mmol) afforded S-(+)-sulcatol 2 and R-(+)-sulcatyl acetate 3 in high optical purity (ee > 99%) and good yield (45%) within a short time (40 min). Thermodynamic parameters for the chemical esterification of sulcatol 2 with vinyl acetate were also evaluated. The enthalpy and Gibbs free energy of this reaction were negative, indicating that the process is exothermic and spontaneous, in agreement with the results obtained enzymatically.
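The thermodynamic argument rests on the standard relation between the reported quantities (only the signs are restated here; no numerical values are implied):

\Delta G = \Delta H - T\,\Delta S, \qquad \Delta H < 0 \ \text{(exothermic)}, \qquad \Delta G < 0 \ \text{(spontaneous)}.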

Abstract:

Over the past 150 years, Brazil has played a pioneering role in developing environmental policies and pursuing forest conservation and ecological restoration of degraded ecosystems. In particular, the Brazilian Forest Act, first drafted in 1934, has been fundamental in reducing deforestation and engaging private land owners in forest restoration initiatives. At the time of writing (December 2010), however, a proposal for major revision of the Brazilian Forest Act is under intense debate in the National Assembly, and we are deeply concerned about the outcome. On the basis of the analysis of detailed vegetation and hydrographic maps, we estimate that the proposed changes may reduce the total amount of potential areas for restoration in the Atlantic Forest by approximately 6 million hectares. As a radically different policy model, we present the Atlantic Forest Restoration Pact (AFRP), which is a group of more than 160 members that represents one of the most important and ambitious ecological restoration programs in the world. The AFRP aims to restore 15 million hectares of degraded lands in the Brazilian Atlantic Forest biome by 2050 and increase the current forest cover of the biome from 17% to at least 30%. We argue that not only should Brazilian lawmakers refrain from revising the existing Forest Act, but that they should also greatly step up investments in the science, business, and practice of ecological restoration throughout the country, including the Atlantic Forest. The AFRP provides a template that could be adapted to other forest biomes in Brazil and to other megadiversity countries around the world.

Abstract:

The complex interactions among endangered ecosystems, landowners' interests, and different models of land tenure and use constitute an important series of challenges for those seeking to maintain and restore biodiversity and augment the flow of ecosystem services. Over the past 10 years, we have developed a data-based approach to address these challenges and to achieve medium- and large-scale ecological restoration of riparian areas on private lands in the state of Sao Paulo, southeastern Brazil. Given varying motivations for ecological restoration, the location of riparian areas within landholdings, environmental zoning of different riparian areas, and best-practice restoration methods were developed for each situation. A total of 32 ongoing projects, covering 527,982 ha, were evaluated in large sugarcane farms and small mixed farms, and six different restoration techniques have been developed to help upscale the effort. Small mixed farms had higher portions of land requiring protection as riparian areas (13.3%), and lower forest cover of riparian areas (18.3%), than large sugarcane farms (10.0% and 36.9%, respectively, for riparian areas and forest cover values). In both types of farms, forest fragments required some degree of restoration. Historical anthropogenic degradation has compromised forest ecosystem structure and functioning, despite their high diversity of native tree and shrub species. Notably, land use patterns in riparian areas differed markedly. Large sugarcane farms had higher portions of riparian areas occupied by highly mechanized agriculture, abandoned fields, and anthropogenic wet fields created by siltation in water courses. In contrast, in small mixed crop farms, low- or non-mechanized agriculture and pasturelands were predominant. Despite these differences, plantations of native tree species covering the entire area were by far the main restoration method needed both by large sugarcane farms (76.0%) and small mixed farms (92.4%), in view of the low resilience of target sites, reduced forest cover, and high fragmentation, all of which limit the potential for autogenic restoration. We propose that plantations should be carried out with a high diversity of native species in order to create biologically viable restored forests, and to assist long-term biodiversity persistence at the landscape scale. Finally, we propose strategies to integrate the political, socio-economic and methodological aspects needed to upscale restoration efforts in tropical forest regions throughout Latin America and elsewhere. © 2010 Elsevier B.V. All rights reserved.

Abstract:

The scaled-up preparation of 1H-pyrazole, 1-phenylpyrazole and isoxazole via sonocatalysis is reported. The products were isolated in good yields with short reaction times. These compounds were assayed for antioxidant activity by the ORAC and DPPH methodologies. The results showed that only 1-phenylpyrazole presented good antioxidant activity compared with Trolox®.

Abstract:

The present study was designed to test the utility of a stress-coping model of employee adjustment to organisational change. Specifically, it was proposed that employee adjustment to this type of work stress would be influenced by the characteristics of the change situation, employees' appraisals of the situation, their coping strategies, and the extent of their personal resources. Data were collected from 140 middle managers and supervisors involved in a large-scale public sector integration. The results of the research provided some support for the proposed model: high levels of psychological distress were related to a reliance on informal sources of information, high appraised stress, low appraised certainty, and the use of avoidant rather than problem-focused strategies, whereas poor social functioning was associated with low self-esteem, high levels of disruption across the period of change, a reliance on informal sources of information, and the use of avoidant coping strategies. There was no evidence that coping strategies mediated the effects of the event characteristics, situational appraisals, and personal resources on adjustment; however, there was some evidence linking these variables to coping strategies, in particular, problem-focused coping. There was also some evidence to indicate that the experience of organisational change was different for managers and supervisors: levels of threat were higher for the managers than the supervisors, but there was no difference between the groups of employees in terms of adjustment.

Abstract:

With the advent of functional neuroimaging techniques, in particular functional magnetic resonance imaging (fMRI), we have gained greater insight into the neural correlates of visuospatial function. However, it may not always be easy to identify the cerebral regions most specifically associated with performance on a given task. One approach is to examine the quantitative relationships between regional activation and behavioral performance measures. In the present study, we investigated the functional neuroanatomy of two different visuospatial processing tasks, judgement of line orientation and mental rotation. Twenty-four normal participants were scanned with fMRI using blocked periodic designs for experimental task presentation. Accuracy and reaction time (RT) to each trial of both activation and baseline conditions in each experiment were recorded. Both experiments activated dorsal and ventral visual cortical areas as well as dorsolateral prefrontal cortex. More regionally specific associations with task performance were identified by estimating the association between (sinusoidal) power of functional response and mean RT to the activation condition; a permutation test based on spatial statistics was used for inference. There was significant behavioral-physiological association in right ventral extrastriate cortex for the line orientation task and in bilateral (predominantly right) superior parietal lobule for the mental rotation task. Comparable associations were not found between power of response and RT to the baseline conditions of the tasks. These data suggest that one region in a neurocognitive network may be most strongly associated with behavioral performance and this may be regarded as the computationally least efficient or rate-limiting node of the network.
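A stripped-down version of the behavioral-physiological association analysis, correlating per-subject response power with mean RT and assessing significance by permutation, might look like the sketch below; the spatial statistics used for inference in the study are omitted, and all data here are synthetic.

import numpy as np

rng = np.random.default_rng(3)
n_subjects = 24
power = rng.normal(size=n_subjects)                          # regional sinusoidal power per subject
rt = 0.5 * power + rng.normal(scale=0.8, size=n_subjects)    # mean reaction time (arbitrary units)

def perm_corr_test(x, y, n_perm=10_000):
    # Two-sided permutation test for the Pearson correlation between x and y.
    obs = np.corrcoef(x, y)[0, 1]
    null = np.array([np.corrcoef(x, rng.permutation(y))[0, 1] for _ in range(n_perm)])
    p = (np.sum(np.abs(null) >= np.abs(obs)) + 1) / (n_perm + 1)
    return obs, p

r, p = perm_corr_test(power, rt)
print(f"r = {r:.2f}, permutation p = {p:.4f}")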