882 results for formation of large scale structure


Relevance: 100.00%

Abstract:

Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of the error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of the dataset and require computationally expensive Monte Carlo based inference. Recently, within the machine learning and spatial statistics communities, many papers have explored the potential of reduced rank representations of the covariance matrix, often referred to as projected or fixed rank approaches. In such methods the covariance function of the posterior process is represented by a reduced rank approximation chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time, which avoids the need for the high dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced; it implements projected, sequential estimation and adds several novel features. In particular, the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed. © 2010 Elsevier Ltd.
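
The projected, sequential estimation described above can be illustrated with a small numerical sketch: the posterior process is represented through a reduced set of active (inducing) points, and each observation is absorbed one at a time with a Kalman-style update, so no high-dimensional integral is ever formed. This is a minimal numpy illustration of the general idea, not the gptk API; the kernel, active-point grid, and noise level are assumptions made for the example.

    import numpy as np

    def rbf(X1, X2, lengthscale=1.0, variance=1.0):
        """Squared-exponential kernel matrix between two sets of 1-D inputs."""
        d2 = (X1[:, None] - X2[None, :]) ** 2
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    # Illustrative data: noisy observations of a smooth function.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 10.0, size=200)
    y = np.sin(X) + 0.1 * rng.standard_normal(200)

    # "Active" (inducing) set defining the reduced-rank projection.
    Z = np.linspace(0.0, 10.0, 15)
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
    noise_var = 0.1 ** 2

    # Prior over the weights a in f(x) ~= k(x, Z) @ a is N(0, Kmm^{-1}),
    # which reproduces the projected-process covariance k(x,Z) Kmm^{-1} k(Z,x').
    mean = np.zeros(len(Z))
    cov = np.linalg.inv(Kmm)

    # Sequential (one-at-a-time) Bayesian updates: each observation is a
    # rank-one, Kalman-style measurement update on the weight posterior.
    for xi, yi in zip(X, y):
        phi = rbf(np.array([xi]), Z).ravel()          # feature vector k(x_i, Z)
        gain = cov @ phi / (noise_var + phi @ cov @ phi)
        mean = mean + gain * (yi - phi @ mean)
        cov = cov - np.outer(gain, phi @ cov)

    # Predictive mean/variance on a test grid under the projected approximation.
    Xs = np.linspace(0.0, 10.0, 5)
    Phi_s = rbf(Xs, Z)
    f_mean = Phi_s @ mean
    f_var = np.einsum("ij,jk,ik->i", Phi_s, cov, Phi_s)
    print(np.round(f_mean, 2), np.round(f_var, 4))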

Relevance: 100.00%

Abstract:

Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduce additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-scale volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are expressed as the sum of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on the small components. To reduce the uncertainty of the plane measurement, an evaluation index for the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia-moment matrix. Finally, a practical application is introduced for validating the evaluation index.
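
As a rough illustration of how an inertia-moment matrix can summarise a sampling strategy on a planar surface, the sketch below computes the second-moment matrix of the centred sampling points and condenses its in-plane eigenvalues into a single scalar. The particular scalar chosen here is an assumption made for illustration; the paper's exact index may combine the moments differently.

    import numpy as np

    def sampling_inertia_index(points):
        """Illustrative evaluation index for a planar sampling strategy.

        points : (n, 3) array of measured coordinates on a nominally planar surface.
        The second-moment ("inertia") matrix of the centred points describes how
        the samples are spread over the plane; here the two largest eigenvalues
        (the in-plane spread) are combined into a single scalar. The exact index
        used in the paper may differ -- this is only an assumed stand-in.
        """
        P = np.asarray(points, dtype=float)
        centered = P - P.mean(axis=0)
        M = centered.T @ centered / len(P)      # 3x3 second-moment (inertia) matrix
        eigvals = np.sort(np.linalg.eigvalsh(M))
        # eigvals[0] ~ out-of-plane scatter, eigvals[1:] ~ in-plane spread.
        return float(np.sqrt(eigvals[1] * eigvals[2]))

    # Two candidate strategies on the same plane: clustered vs. spread-out samples.
    rng = np.random.default_rng(1)
    clustered = np.column_stack([rng.uniform(0, 10, 50), rng.uniform(0, 10, 50),
                                 rng.normal(0, 0.01, 50)])
    spread = np.column_stack([rng.uniform(0, 100, 50), rng.uniform(0, 100, 50),
                              rng.normal(0, 0.01, 50)])
    print(sampling_inertia_index(clustered), sampling_inertia_index(spread))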

Relevance: 100.00%

Abstract:

The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis (MVS) methodology. The somewhat small size and variability of these data sets, however, limit their scope and the conclusions that can be derived from them. To facilitate further development within MVS, we here present a new and varied data set consisting of 80 scenes, seen from 49 or 64 accurate camera positions. This is accompanied by accurate structured-light scans for reference and evaluation. In addition, all images are taken under seven different lighting conditions. As a benchmark, and to validate the use of our data set for obtaining reasonable and statistically significant findings about MVS, we have applied the three state-of-the-art MVS algorithms by Campbell et al., Furukawa et al., and Tola et al. to the data set. To do this we have extended the evaluation protocol from the Middlebury evaluation, necessitated by the more complex geometry of some of our scenes. The data set and accompanying evaluation framework are made freely available online. Based on this evaluation, we are able to observe several characteristics of state-of-the-art MVS, e.g. that there is a tradeoff between the quality of the reconstructed 3D points (accuracy) and how much of an object’s surface is captured (completeness). Also, several issues that we hypothesized would challenge MVS, such as specularities and changing lighting conditions, did not pose serious problems. Our study finds that the two most pressing issues for MVS are lack of texture and meshing (forming 3D points into closed triangulated surfaces).
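
The accuracy/completeness trade-off mentioned above is typically quantified with nearest-neighbour distances between the reconstruction and the reference scan. The sketch below shows that style of evaluation using scipy; the distance threshold and the summary statistics are illustrative choices made here, not the exact protocol used in the benchmark.

    import numpy as np
    from scipy.spatial import cKDTree

    def accuracy_completeness(recon_pts, ref_pts, dist_threshold=10.0):
        """Illustrative accuracy/completeness evaluation for an MVS reconstruction.

        accuracy:     distances from reconstructed points to the reference scan
        completeness: distances from reference points to the reconstruction
        The threshold and the use of median/fraction are assumptions made for
        this sketch, not the benchmark's exact protocol.
        """
        ref_tree = cKDTree(ref_pts)
        rec_tree = cKDTree(recon_pts)

        d_acc, _ = ref_tree.query(recon_pts)     # reconstruction -> reference
        d_comp, _ = rec_tree.query(ref_pts)      # reference -> reconstruction

        accuracy = np.median(d_acc)
        completeness = np.mean(d_comp < dist_threshold)   # fraction of surface covered
        return accuracy, completeness

    # Toy example: a noisy, partial reconstruction of a planar reference patch.
    rng = np.random.default_rng(0)
    ref = np.column_stack([rng.uniform(0, 100, 5000),
                           rng.uniform(0, 100, 5000),
                           np.zeros(5000)])
    recon = ref[:3000] + rng.normal(0, 0.5, (3000, 3))    # partial and noisy
    print(accuracy_completeness(recon, ref))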

Relevance: 100.00%

Abstract:

For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, “wearable,” sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that “learn” from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses the implications of this technology and provides a practical road map for realizing its full potential in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
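
To make the kind of pipeline discussed above concrete, the sketch below extracts two simple features from synthetic accelerometer-like segments (a 4-6 Hz component standing in for tremor) and cross-validates a standard classifier. Everything here, the signal, features, labels, and model, is purely illustrative and not a validated clinical tool.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def make_segment(tremor: bool, fs=100, seconds=5):
        """Synthetic accelerometer segment; a 4-6 Hz component stands in for tremor."""
        t = np.arange(fs * seconds) / fs
        x = 0.2 * rng.standard_normal(t.size)
        if tremor:
            x += 0.5 * np.sin(2 * np.pi * rng.uniform(4, 6) * t)
        return x

    def features(x, fs=100):
        """Simple hand-crafted features: RMS, and relative spectral power in 4-6 Hz."""
        spec = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(x.size, d=1 / fs)
        band = spec[(freqs >= 4) & (freqs <= 6)].sum() / spec.sum()
        return [np.sqrt(np.mean(x ** 2)), band]

    X = np.array([features(make_segment(tremor=bool(i % 2))) for i in range(200)])
    y = np.array([i % 2 for i in range(200)])

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())   # out-of-sample accuracy estimate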

Relevance: 100.00%

Abstract:

The short-term effects of fiscal consolidation have attracted increasing attention from both academia and policy makers in recent years. Authors in the literature on non-Keynesian effects usually put the emphasis on the need for devaluation of the national currency, an accommodating reaction of the monetary authority, and favourable international economic conditions as the necessary accompanying tools of fiscal consolidation in order to realise short-term expansionary effects. Some also add the necessity of large-scale adjustment, while others support the view that a high and increasing debt ratio or increasing government spending, by triggering an unavoidable adjustment, is the key to experiencing short-term expansionary effects. The composition of adjustment has also become a crucial explanation for non-Keynesian effects. However, as the following critical assessment of the literature on expansionary fiscal consolidations will reveal, institutional conditions, such as the depth of financial intermediation and the structure of the labour market, can prove to be crucial in the occurrence of the desired expansionary short-term effects.

Relevance: 100.00%

Abstract:

This dissertation examines the sociological process of conflict resolution and consensus building in South Florida Everglades Ecosystem Restoration through what I define as a Network Management Coordinative Interstitial Group (NetMIG). The process of conflict resolution can be summarized as the participation of interested and affected parties (stakeholders) in a forum of negotiation. I study the case of the Governor's Commission for a Sustainable South Florida (GCSSF), which was established to reduce social conflict. Such conflict originated from environmental disputes about the Everglades and was manifested in the form of gridlock among regulatory (government) agencies, Indian tribes, and agricultural, environmental conservationist, and urban development interests. The purpose of the participatory forum is to reduce conflicts of interest and to achieve consensus, with the ultimate goal of restoring the original Everglades ecosystem while cultivating the economic and cultural bases of the communities in the area. Further, the forum aims to formulate consensus by envisioning a common sustainable community and by providing means to achieve a balance between human and natural systems.

Data were gathered using participant observation and document analysis techniques to conduct a theoretically based analysis of the role of the NetMIG. I draw on conflict resolution theory, environmental conflict theory, stakeholder analysis, systems theory, differentiation and social change theory, and strategic management and planning theory.

The purpose of this study is to substantiate the role of the GCSSF as a consortium of organizations in an effort to resolve conflict, rather than to provide an ethnographic study of that organization. Environmental restoration of the Everglades is a vehicle for recognizing the significance of a NetMIG, namely the GCSSF, as a structural mechanism for stakeholder participation in the process of social conflict resolution through the creation of new cultural paradigms for a sustainable community.

Relevance: 100.00%

Abstract:

Gasoline oxygenates (MTBE, methyl tert-butyl ether; DIPE, di-isopropyl ether; ETBE, ethyl tert-butyl ether; TAME, tert-amyl methyl ether) are added to gasoline to boost octane and enhance combustion. The combination of large-scale use, high water solubility, and only minor biodegradability has resulted in significant gasoline oxygenate contamination of surface, ground, and drinking water systems. The combination of hydroxyl radical formation and the pyrolytic environment generated by ultrasonic irradiation (665 kHz) leads to the rapid degradation of MTBE and other gasoline oxygenates in aqueous media.

The presence of oxygen promotes the degradation processes by rapid reaction with carbon-centered radicals, indicating that radical processes involving O2 are significant pathways. A number of the oxidation products were identified. The formation of products (alcohols, ketones, aldehydes, esters, peroxides, etc.) can be rationalized by mechanisms involving hydrogen abstraction by the OH radical and/or pyrolysis to form carbon-centered radicals, which react with oxygen and follow standard oxidation chain processes.

The reactions of N-substituted R-triazolinediones (RTAD; R = CH3 or phenyl) have attracted considerable interest because they exhibit a number of unusual mechanistic characteristics that are analogous to the reactions of singlet oxygen (1O2) and offer an easy route to C-N bond formation. The reactions of triazolinediones with olefins have been widely studied, and aziridinium imides are generally accepted to be the reactive intermediates.

We observed the rapid formation of an unusual intermediate upon mixing tetracyclopropylethylene with 4-methyl-1,2,4-triazoline-3,5-dione in CDCl3. Detailed characterization by NMR (1H, 13C, and 2-D NMR) indicates the intermediate is 5,5,6,6-tetracyclopropyl-3-methyl-5,6-dihydro-oxazolo[3,2-b][1,2,4]-triazolium-2-olate. Such products are extremely rare and have not been studied. Upon warming, the intermediate is converted to the [2+2] diazetidine (major) and the ene product (minor).

To further explore the kinetics and dynamics of the reaction, activation energies were obtained from Arrhenius plots. The activation energies for the formation of the intermediate from the reactants and of the [2+2] adduct from the intermediate were determined to be 7.48 kcal mol−1 and 19.8 kcal mol−1, with pre-exponential factors of 2.24 × 10^5 dm3 mol−1 s−1 and 2.75 × 10^8 s−1, respectively; the low pre-exponential values, caused by steric hindrance, make both reactions slow overall.
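
The reported activation energies and pre-exponential factors can be turned into rate constants through the Arrhenius equation k = A exp(-Ea/RT). The short sketch below evaluates both steps at an assumed room temperature of 298 K, using the values quoted above; the choice of temperature is an illustration only.

    import math

    R = 1.987e-3          # gas constant in kcal mol^-1 K^-1
    T = 298.15            # room temperature in K (assumed for illustration)

    # Values quoted in the abstract (units as reported there).
    steps = {
        # name: (Ea in kcal/mol, pre-exponential factor, units of k)
        "reactants -> intermediate": (7.48, 2.24e5, "dm^3 mol^-1 s^-1"),
        "intermediate -> [2+2] adduct": (19.8, 2.75e8, "s^-1"),
    }

    for name, (Ea, A, units) in steps.items():
        k = A * math.exp(-Ea / (R * T))    # Arrhenius equation
        print(f"{name}: k({T:.0f} K) = {k:.3e} {units}")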

Relevance: 100.00%

Abstract:

The purpose of this investigation was to develop new techniques to generate segmental assessments of body composition based on Segmental Bioelectrical Impedance Analysis (SBIA). An equally important consideration was the design, simulation, development, and software and hardware integration of the SBIA system. This integration was carried out with a Very Large Scale Integration (VLSI) Field Programmable Gate Array (FPGA) microcontroller that analyzed the measurements obtained from segments of the body and provided full-body and segmental Fat Free Mass (FFM) and Fat Mass (FM) percentages. The issues related to estimating body composition in persons with spinal cord injury (SCI) were also addressed and investigated. This investigation demonstrated that the SBIA methodology provides accurate segmental body composition measurements. Disabled individuals are expected to benefit from these SBIA evaluations, as they are non-invasive and suitable for paralyzed individuals. The SBIA VLSI system may replace bulky, inflexible electronic modules attached to the human body.
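
Bioelectrical impedance estimates of fat-free mass are generally built around the impedance index height^2/resistance plus a weight term. The sketch below uses that generic structure with placeholder coefficients and treats the whole-body resistance as the sum of segmental resistances measured in series; the actual segmental equations and coefficients developed in this work are not reproduced here.

    def fat_free_mass_kg(height_cm, resistance_ohm, weight_kg,
                         a=0.5, b=0.2, c=5.0):
        """Generic BIA-style prediction: FFM = a*(height^2/R) + b*weight + c.

        The impedance-index form (height^2 / resistance) is the standard structure
        of BIA equations; the coefficients a, b, c here are placeholders, not the
        segmental equations developed in the dissertation.
        """
        return a * (height_cm ** 2) / resistance_ohm + b * weight_kg + c

    def body_composition(height_cm, weight_kg, segment_resistances_ohm):
        """Combine segmental resistances (e.g. arm, trunk, leg measured in series)
        into a whole-body estimate and report FFM / FM percentages."""
        total_r = sum(segment_resistances_ohm)
        ffm = fat_free_mass_kg(height_cm, total_r, weight_kg)
        fm = weight_kg - ffm
        return {"FFM_%": 100 * ffm / weight_kg, "FM_%": 100 * fm / weight_kg}

    print(body_composition(175, 80, [250, 40, 210]))   # illustrative values only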

Relevance: 100.00%

Abstract:

Internet Protocol Television (IPTV) is a system in which a digital television service is delivered using Internet Protocol over a network infrastructure. There is considerable confusion and concern about IPTV, since two different technologies have to be melded together to provide end customers with something better than conventional television. In this research, the functional architecture of the IPTV system was investigated. A Very Large Scale Integration (VLSI) based streaming server controller was designed, and different ways of hosting a web server that can send control signals to the streaming server controller were studied. The web server accepts inputs from the keyboard and the FPGA board switches and, depending on the preset configuration, opens a selected web page and sends the control signals to the streaming server controller. It was observed that the applications run faster on the PowerPC since it is embedded in the FPGA. The commercial market and global deployment of IPTV are also discussed.

Relevance: 100.00%

Abstract:

As massive data sets become increasingly available, people are facing the problem of how to effectively process and understand these data. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, due both to the large size of the data sets and to their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system; aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques that can be used to handle data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions in the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
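
The core pattern behind scaling these factorizations is expressing a large matrix product as map and reduce steps: mappers emit keyed partial products from the nonzero entries of the large matrix, and reducers sum them per output cell. The pure-Python sketch below simulates that pattern for the product W^T A used in a standard NMF multiplicative update; it illustrates the idea only and is not the dissertation's Hadoop implementation.

    from collections import defaultdict
    import numpy as np

    def mapper(a_entry, W):
        """Map over one nonzero entry (i, j, value) of the large sparse matrix A
        and emit partial products of W^T A keyed by the output cell (k, j)."""
        i, j, a_ij = a_entry
        for k in range(W.shape[1]):
            yield (k, j), W[i, k] * a_ij

    def reducer(keyed_values):
        """Sum the partial products for each output cell -- the 'reduce' step."""
        out = defaultdict(float)
        for key, value in keyed_values:
            out[key] += value
        return out

    # Toy data: A is m x n and sparse, W is the current m x r NMF factor.
    rng = np.random.default_rng(0)
    m, n, r = 6, 5, 2
    A = rng.random((m, n)) * (rng.random((m, n)) < 0.4)     # sparse-ish
    W = rng.random((m, r))

    entries = [(i, j, A[i, j]) for i in range(m) for j in range(n) if A[i, j] != 0]

    # Simulate the shuffle: concatenate all mapper outputs, then reduce by key.
    shuffled = (kv for e in entries for kv in mapper(e, W))
    WtA = reducer(shuffled)

    # Check the distributed result against a direct in-memory product.
    dense = np.zeros((r, n))
    for (k, j), v in WtA.items():
        dense[k, j] = v
    print(np.allclose(dense, W.T @ A))   # True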

Relevance: 100.00%

Abstract:

Understanding the relationships between hydrology and salinity and plant community structure and production is critical to allow predictions of wetland responses to altered water management, changing precipitation patterns, and rising sea level. We addressed how salinity, water depth, hydroperiod, canal inflows, and local precipitation control marsh macrophyte aboveground net primary production (ANPP) and structure in the coastal ecotone of the southern Everglades. We contrasted responses in two watersheds, Taylor Slough (TS) and C-111, systems that have experienced and will continue to experience changes in water management. Based on long-term trajectories in plant responses, we found continued evidence of increasing water levels and length of inundation in the C-111 watershed south of the C-111 canal. We also found strong differentiation among sites in upper TS that was dependent on hydrology. Finally, salinity, local precipitation, and freshwater discharge from upstream explained over 80% of the variance in Cladium ANPP at a brackish-water site in TS. Moreover, our study showed that, while highly managed, the TS and C-111 watersheds maintain legacies in spatial pattern that would facilitate hydrologic restoration. Based on the trajectories in Cladium and Eleocharis, shifts in plant community structure could occur within 5–10 years of sustained water management change.

Relevance: 100.00%

Abstract:

Social media classification problems have drawn increasing attention in the past few years. With the rapid development of the Internet and the popularity of computers, there is an astronomical amount of information on social media platforms. The datasets are generally large scale and are often corrupted by noise. The presence of noise in the training set has a strong impact on the performance of supervised learning (classification) techniques. A budget-driven one-class SVM approach is presented in this thesis that is suitable for large-scale social media data classification. Our approach is based on an existing online one-class SVM learning algorithm, referred to as the STOCS (Self-Tuning One-Class SVM) algorithm. To justify our choice, we first analyze the noise resilience of STOCS using synthetic data. The experiments suggest that STOCS is more robust against label noise than several other existing approaches. Next, to handle the big-data classification problem for social media, we introduce several budget-driven features, which allow the algorithm to be trained within limited time and under limited memory requirements. Moreover, the resulting algorithm can be easily adapted to changes in dynamic data with minimal computational cost. Compared with two state-of-the-art approaches, LIBLINEAR and kNN, our approach is shown to be competitive, with lower memory and time requirements.
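
To illustrate the online, memory-bounded training style that such an approach relies on, the sketch below implements a minimal stochastic-gradient version of the linear one-class SVM objective, processing one sample at a time. It is a generic stand-in written for illustration, not the STOCS algorithm or the thesis's budgeted variant.

    import numpy as np

    class OnlineOneClassSVM:
        """Minimal stochastic-gradient linear one-class SVM:
            min_{w,rho}  0.5*||w||^2 - rho + (1/(nu*n)) * sum_i max(0, rho - <w, x_i>)
        Samples are processed one at a time, so memory use is bounded by the
        feature dimension -- a stand-in for the online/budgeted behaviour
        discussed above, not the STOCS algorithm itself."""

        def __init__(self, n_features, nu=0.1, lr=0.01):
            self.w = np.zeros(n_features)
            self.rho = 0.0
            self.nu = nu
            self.lr = lr

        def partial_fit(self, x):
            violated = (self.rho - self.w @ x) > 0
            grad_w = self.w - (x / self.nu if violated else 0.0)
            grad_rho = -1.0 + (1.0 / self.nu if violated else 0.0)
            self.w -= self.lr * grad_w
            self.rho -= self.lr * grad_rho

        def predict(self, X):
            # +1 for inliers, -1 for outliers/noise.
            return np.where(X @ self.w - self.rho >= 0, 1, -1)

    # Toy stream: mostly "normal" samples (a dense cluster) plus scattered outliers.
    rng = np.random.default_rng(0)
    inliers = rng.normal(loc=2.0, scale=0.5, size=(2000, 2))
    outliers = rng.uniform(low=-6, high=6, size=(100, 2))
    stream = rng.permutation(np.vstack([inliers, outliers]))

    model = OnlineOneClassSVM(n_features=2, nu=0.1, lr=0.01)
    for x in stream:                      # one pass, constant memory
        model.partial_fit(x)

    print((model.predict(inliers) == 1).mean(), (model.predict(outliers) == -1).mean())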

Relevance: 100.00%

Abstract:

Although the Standard Cosmological Model is generally accepted by the scientific community, a number of issues remain unresolved. From the observable characteristics of the structures in the Universe, it should be possible to impose constraints on the cosmological parameters. Cosmic voids (CV) are a major component of the large-scale structure (LSS) and have been shown to possess great potential for constraining dark energy (DE) and testing theories of gravity, but a gap between CV observations and theory still persists. A theoretical model for the statistical distribution of voids as a function of size exists (the SvdW model); however, it has been unsuccessful in reproducing the results obtained from cosmological simulations. This undermines the possibility of using voids as cosmological probes. The goal of this thesis work is to close the gap between theoretical predictions and measured distributions of cosmic voids. We develop an algorithm to identify voids in simulations consistently with theory, inspecting the possibilities offered by a recently proposed refinement of the SvdW model (the Vdn model, Jennings et al., 2013). Comparing void catalogues to theory, we validate the Vdn model, finding that it is reliable over a large range of radii, at all the redshifts considered and for all the cosmological models inspected. We then search for a size-function model for voids identified in a distribution of biased tracers. We find that naively applying the same procedure used for the unbiased tracers to a halo mock distribution does not provide successful results, suggesting that the Vdn model has to be reconsidered when dealing with biased samples. Thus, we test two alternative extensions of the model and find that two scaling relations exist: both the dark-matter void radii and the underlying dark-matter density contrast scale with the halo-defined void radii. We use these findings to develop a semi-analytical model which gives promising results.
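
A common way to identify voids consistently with excursion-set predictions is to grow spheres around the emptiest regions until the mean enclosed density reaches a fixed fraction of the mean tracer density (often 0.2). The sketch below implements that idea on a toy tracer catalogue; the grid seeding, thresholds, and simple overlap handling are illustrative assumptions, not the algorithm developed in this thesis.

    import numpy as np
    from scipy.spatial import cKDTree

    def find_spherical_voids(positions, box_size, threshold=0.2, n_seeds=200,
                             r_max=40.0, dr=0.5):
        """Grow spheres around the emptiest locations until the mean enclosed density
        reaches `threshold` times the mean density (the definition commonly used when
        comparing to SvdW/Vdn-style size functions). Grid seeding and the non-overlap
        rule below are illustrative choices, not the thesis algorithm."""
        tree = cKDTree(positions, boxsize=box_size)
        mean_density = len(positions) / box_size ** 3

        # Seed candidate centres at the grid cells containing the fewest tracers.
        n_grid = 16
        cell = box_size / n_grid
        counts, edges = np.histogramdd(positions, bins=n_grid,
                                       range=[(0, box_size)] * 3)
        order = np.argsort(counts.ravel())[:n_seeds]
        idx = np.array(np.unravel_index(order, counts.shape)).T
        centres = (idx + 0.5) * cell

        voids = []
        for c in centres:
            radius = None
            for r in np.arange(dr, r_max, dr):
                n_in = len(tree.query_ball_point(c, r))
                if n_in / (4 / 3 * np.pi * r ** 3) > threshold * mean_density:
                    radius = r
                    break
            if radius is None:
                continue
            # Keep only voids whose centres are not inside an already-accepted void.
            if all(np.linalg.norm(c - vc) > vr for vc, vr in voids):
                voids.append((c, radius))
        return voids

    # Toy tracer catalogue: a uniform background with a few carved-out empty spheres.
    rng = np.random.default_rng(0)
    box = 100.0
    pts = rng.uniform(0, box, size=(20000, 3))
    holes = [(np.array([25, 25, 25]), 12.0), (np.array([70, 60, 40]), 9.0)]
    for hc, hr in holes:
        pts = pts[np.linalg.norm(pts - hc, axis=1) > hr]

    found = find_spherical_voids(pts, box)
    print([(np.round(c, 1), r) for c, r in found[:5]])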

Relevance: 100.00%

Abstract:

Based on a well-established stratigraphic framework and 47 AMS-14C dated sediment cores, the distribution of facies types on the NW Iberian margin is analysed in response to the last deglacial sea-level rise, providing a case study on the sedimentary evolution of a high-energy, low-accumulation shelf system. Altogether, four main types of sedimentary facies are defined. (1) A gravel-dominated facies occurs mostly as time-transgressive ravinement beds, which initially developed as shoreface and storm deposits in shallow waters on the outer shelf during the last sea-level lowstand. (2) A widespread, time-transgressive mixed siliceous/biogenic-carbonaceous sand facies indicates areas of moderate hydrodynamic regimes, a high contribution of reworked shelf material, and fluvial supply to the shelf. (3) A glaucony-containing sand facies in a stationary position on the outer shelf formed mostly during the last-glacial sea-level rise by reworking of older deposits as well as authigenic mineral formation. (4) A mud facies is mostly restricted to confined Holocene fine-grained depocentres, which are located in a mid-shelf position. The observed spatial and temporal distribution of these facies types on the high-energy, low-accumulation NW Iberian shelf was essentially controlled by the local interplay of sediment supply, shelf morphology, and the strength of the hydrodynamic system. These patterns contrast with high-accumulation systems, where extensive sediment supply is the dominant control on facies distribution. This study emphasises the importance of large-scale erosion and material recycling for the sedimentary buildup during the deglacial drowning of the shelf. The presence of a homogeneous and up to 15-m-thick transgressive cover above a lag horizon contradicts the common assumption of sparse and laterally confined sediment accumulation on high-energy shelf systems during deglacial sea-level rise. In contrast to this extensive sand cover, laterally very confined and at most 4-m-thick mud depocentres developed during the Holocene sea-level highstand. This restricted formation of fine-grained depocentres was related to the combination of: (1) frequently occurring high-energy hydrodynamic conditions; (2) low overall terrigenous input by the adjacent rivers; and (3) the large distance of the Galicia Mud Belt from its main sediment supplier.

Relevance: 100.00%

Abstract:

ACKNOWLEDGEMENTS
This research is based upon work supported in part by the U.S. ARL and U.K. Ministry of Defence under Agreement Number W911NF-06-3-0001, and by the NSF under award CNS-1213140. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views or represent the official policies of the NSF, the U.S. ARL, the U.S. Government, the U.K. Ministry of Defence or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.