889 results for LARGE-SCALE STRUCTURE OF UNIVERSE
Abstract:
The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis methodology. Although seminal, these benchmark datasets are limited in scope, with few reference scenes. Here, we take these works a step further by proposing a new multi-view stereo dataset that is an order of magnitude larger in the number of scenes and significantly more diverse. Specifically, we propose a dataset containing 80 scenes of large variability. Each scene consists of 49 or 64 accurate camera positions and reference structured light scans, all acquired by a 6-axis industrial robot. To apply this dataset, we propose an extension of the Middlebury evaluation protocol, reflecting the more complex geometry of some of our scenes. The proposed dataset is used to evaluate the state-of-the-art multi-view stereo algorithms of Tola et al., Campbell et al., and Furukawa et al. We thereby demonstrate the usability of the dataset and gain insight into the workings and challenges of multi-view stereopsis. Through these experiments we empirically validate some of the central hypotheses of multi-view stereopsis, as well as identify and reaffirm some of its central challenges.
Abstract:
With ever more demanding requirements for the accurate manufacture of large components, dimensional measuring techniques are becoming progressively more sophisticated. This review describes some of the more recently developed techniques and the state of the art in the better-known large-scale dimensional metrology methods. In some cases the techniques are described in detail; where relevant specialist review papers exist, these are cited as further reading. The traceability of the measurement data collected is discussed with reference to emerging international standards. In some cases, hybrid measurement techniques are finding specialized applications, and these are referred to where appropriate. © IMechE 2009.
Abstract:
Society depends on complex IT systems created by integrating and orchestrating independently managed systems. The enormous increase in their scale and complexity over the past decade means that new software-engineering techniques are needed to help us cope with their inherent complexity. The key characteristic of these systems is that they are assembled from other systems that are independently controlled and managed. While there is increasing awareness of related issues in the software-engineering community, the most relevant background work comes from systems engineering. The interacting algorithms that led to the Flash Crash represent an example of a coalition of systems, serving the purposes of their owners and cooperating only because they have to. The owners of the individual systems were competing finance companies that were often mutually hostile. Each system jealously guarded its own information and could change without consulting any other system.
Abstract:
Large-scale mechanical products, such as aircraft and rockets, consist of large numbers of small components, which introduces additional difficulty for assembly accuracy and error estimation. Planar surfaces, as key product characteristics, are usually utilised for positioning small components in the assembly process. This paper focuses on assembly accuracy analysis of small components with planar surfaces in large-volume products. To evaluate the accuracy of the assembly system, an error propagation model for measurement error and fixture error is proposed, based on the assumption that all errors are normally distributed. In this model, the general coordinate vector is adopted to represent the position of the components. The error transmission functions are simplified into a linear model, and the coordinates of the reference points are composed of a theoretical value and a random error. The installation of a Head-Up Display is taken as an example to analyse the assembly error of small components based on the propagation model. The result shows that the final coordination accuracy is mainly determined by the measurement error of the planar surface on the small components. To reduce the uncertainty of the plane measurement, an evaluation index for the measurement strategy is presented. This index reflects the distribution of the sampling point set and can be calculated from an inertia moment matrix. Finally, a practical application is introduced to validate the evaluation index.
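As a rough illustration of the linearized propagation step described above, the sketch below pushes normally distributed reference-point errors through an assumed Jacobian of the error transmission function; the matrix entries and error magnitudes are invented for illustration and are not taken from the paper.

```python
import numpy as np

def propagate_covariance(J, cov_x):
    """Covariance of the pose error under a linearized model y = J x."""
    return J @ cov_x @ J.T

# Hypothetical numbers: three planar reference points, each measured with an
# independent normal error of sigma = 0.05 mm along the surface normal.
J = np.array([[0.5, 0.3, 0.2],    # assumed sensitivity of component position
              [0.1, -0.4, 0.3]])  # and orientation to the point errors
cov_x = (0.05 ** 2) * np.eye(3)   # measurement-error covariance
cov_y = propagate_covariance(J, cov_x)
print(np.sqrt(np.diag(cov_y)))    # 1-sigma uncertainties of the pose terms
```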
Abstract:
The seminal multiple-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis (MVS) methodology. The somewhat small size and variability of these data sets, however, limit their scope and the conclusions that can be derived from them. To facilitate further development within MVS, we here present a new and varied data set consisting of 80 scenes, seen from 49 or 64 accurate camera positions. This is accompanied by accurate structured light scans for reference and evaluation. In addition, all images are taken under seven different lighting conditions. As a benchmark, and to validate the use of our data set for obtaining reasonable and statistically significant findings about MVS, we have applied the three state-of-the-art MVS algorithms by Campbell et al., Furukawa et al., and Tola et al. to the data set. To do this we have extended the evaluation protocol from the Middlebury evaluation, necessitated by the more complex geometry of some of our scenes. The data set and accompanying evaluation framework are made freely available online. Based on this evaluation, we are able to observe several characteristics of state-of-the-art MVS, e.g., that there is a trade-off between the quality of the reconstructed 3D points (accuracy) and how much of an object's surface is captured (completeness). Also, several issues that we hypothesized would challenge MVS, such as specularities and changing lighting conditions, did not pose serious problems. Our study finds that the two most pressing issues for MVS are lack of texture and meshing (forming 3D points into closed triangulated surfaces).
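The accuracy/completeness trade-off noted above follows from how Middlebury-style protocols measure the two quantities; here is a minimal sketch of that idea (not the authors' released evaluation framework), assuming plain nearest-neighbor distances between reconstructed and reference point clouds:

```python
import numpy as np
from scipy.spatial import cKDTree

def accuracy_completeness(recon_pts, ref_pts):
    """Accuracy: distances from reconstruction to reference scan.
    Completeness: distances from reference scan to reconstruction."""
    d_acc = cKDTree(ref_pts).query(recon_pts)[0]    # recon -> reference
    d_comp = cKDTree(recon_pts).query(ref_pts)[0]   # reference -> recon
    return np.median(d_acc), np.median(d_comp)      # robust summaries

# Placeholder clouds; in practice these would come from the MVS output and
# the structured-light reference scan.
recon = np.random.rand(1000, 3)
ref = np.random.rand(1200, 3)
acc, comp = accuracy_completeness(recon, ref)
```

Filtering uncertain points improves the accuracy figure but leaves more of the reference surface uncovered (worse completeness); keeping them does the reverse, which is exactly the trade-off reported above.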
Abstract:
For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, describing their limitations and how these can be overcome. It discusses the implications of this technology and offers a practical road map for realizing its full potential in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
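For readers who want a concrete picture of the kind of "learning from data" the review covers, here is a minimal supervised-learning sketch, with random placeholder features standing in for wearable-sensor measurements (nothing here is taken from the article itself):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative only: a classifier mapping sensor-derived feature vectors
# (e.g., gait or tremor summaries) to clinical labels, with cross-validation
# as the evaluation the review recommends. Data are random placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))      # placeholder feature vectors
y = rng.integers(0, 2, size=200)    # placeholder labels (e.g., PD/control)
scores = cross_val_score(RandomForestClassifier(n_estimators=100), X, y, cv=5)
print(scores.mean())                # near chance level on random data
```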
Abstract:
The purpose of this investigation was to develop new techniques to generate segmental assessments of body composition based on Segmental Bioelectrical Impedance Analysis (SBIA). An equally important consideration was the design, simulation, development, and software and hardware integration of the SBIA system. This integration was carried out with a Very Large Scale Integration (VLSI) Field Programmable Gate Array (FPGA) microcontroller that analyzed the measurements obtained from segments of the body and provided full-body and segmental Fat Free Mass (FFM) and Fat Mass (FM) percentages. The issues related to estimating body composition in persons with spinal cord injury (SCI) were also addressed and investigated. This investigation demonstrated that the SBIA methodology provided accurate segmental body composition measurements. Disabled individuals are expected to benefit from these SBIA evaluations, as they are non-invasive methods suitable for paralyzed individuals. The SBIA VLSI system may replace bulky, inflexible electronic modules attached to human bodies.
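Segmental estimates of this kind rest on the standard single-frequency BIA relation, in which fat-free mass scales with the impedance index (segment length squared over resistance). The sketch below uses that textbook relation with invented coefficients and resistances; it is not the dissertation's calibrated model or firmware:

```python
def segment_ffm_kg(length_cm, resistance_ohm, a=0.35, b=0.1):
    """Illustrative linear BIA model: FFM ~ a * L^2 / R + b (coefficients
    invented; real models are calibrated against reference methods)."""
    return a * length_cm ** 2 / resistance_ohm + b

# Made-up segment lengths (cm) and resistances (ohm) for one subject.
segments = {"arm_l": (60, 290.0), "arm_r": (60, 285.0),
            "leg_l": (80, 240.0), "leg_r": (80, 238.0), "trunk": (50, 35.0)}
ffm_total = sum(segment_ffm_kg(l, r) for l, r in segments.values())
print(round(ffm_total, 1))  # full-body FFM as the sum over segments
```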
Abstract:
Internet Protocol Television (IPTV) is a system in which a digital television service is delivered using Internet Protocol over a network infrastructure. There is considerable confusion and concern about IPTV, since two different technologies have to be melded together to provide end customers with something better than conventional television. In this research, the functional architecture of the IPTV system was investigated. A Very Large Scale Integration based system for the streaming server controller was designed, and different ways of hosting a web server that can be used to send control signals to the streaming server controller were studied. The web server accepts inputs from the keyboard and FPGA board switches and, depending on the preset configuration, opens a selected web page and sends control signals to the streaming server controller. It was observed that applications run faster on the PowerPC, since it is embedded in the FPGA. The commercial market and global deployment of IPTV are also discussed.
Abstract:
As massive data sets become increasingly available, people face the problem of how to process and understand them effectively. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, owing both to the large size of the data sets and to their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system; aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques that can be used to handle data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
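To make the MapReduce pattern behind these factorizations concrete, here is a toy, framework-free sketch of one common building block, computing AᵀA as a sum of per-row outer products (a step used in scaled-up NMF); a real deployment would run the same map and reduce functions on a Hadoop-style cluster rather than in-process:

```python
from collections import defaultdict

def map_outer(row):
    """Map: a row a_i of A emits the entries of the outer product a_i a_i^T,
    keyed by output cell (j, k), so that A^T A = sum_i a_i a_i^T."""
    for j, aj in enumerate(row):
        for k, ak in enumerate(row):
            yield (j, k), aj * ak

def reduce_sum(key, values):
    """Reduce: sum the partial products for one output cell."""
    return key, sum(values)

A = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
shuffled = defaultdict(list)
for row in A:                              # map phase
    for key, val in map_outer(row):
        shuffled[key].append(val)          # shuffle: group values by key
ata = dict(reduce_sum(k, v) for k, v in shuffled.items())  # reduce phase
assert ata[(0, 0)] == 35.0 and ata[(0, 1)] == 44.0
```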
Abstract:
The research presented in this dissertation investigated selected processes involving baryons and nuclei in hard scattering reactions. These processes are characterized by the production of particles with large energies and transverse momenta. Through these processes, this work explored both the constituent (quark) structure of baryons (specifically nucleons and Δ-isobars) and the mechanisms through which the interactions between these constituents ultimately control the selected reactions. The first such reaction is hard nucleon-nucleon elastic scattering, which was studied here considering the quark exchange between the nucleons to be the dominant mechanism of interaction in the constituent picture. In particular, it was found that an angular asymmetry exhibited by proton-neutron elastic scattering data is explained within this framework if a quark-diquark picture dominates the nucleon's structure instead of the more traditional SU(6) three-quark picture. The latter yields an asymmetry around 90° center-of-mass scattering angle with a sign opposite to what is experimentally observed. The second process is the hard breakup by a photon of a nucleon-nucleon system in light nuclei. Proton-proton (pp) and proton-neutron (pn) breakup in 3He, and ΔΔ-isobar production in deuteron breakup, were analyzed in the hard rescattering model (HRM), which in conjunction with the quark interchange mechanism provides a Quantum Chromodynamics (QCD) description of the reaction. Through the HRM, cross sections for both channels in 3He photodisintegration were computed without the need for a fitting parameter. The results presented here for pp breakup show excellent agreement with recent experimental data. In ΔΔ-isobar production in deuteron breakup, HRM angular distributions for the two ΔΔ channels were compared to the pn channel and to each other. An important prediction from this study is that the Δ++Δ− channel consistently dominates the Δ+Δ0 channel, in contrast with models that, unlike the HRM, consider a ΔΔ system in the initial state of the interaction; for such models both channels should have the same strength. These results are important in developing a QCD description of the atomic nucleus.
Abstract:
Genetic diversity can be used to describe patterns of gene flow within and between local and regional populations. The Florida Everglades experiences seasonal fluctuations in water level that can influence local population extinction and recolonization dynamics. In addition, this expansive wetland has been divided into water management regions by canals and levees. These combined factors can affect genetic diversity and population structure of aquatic organisms in the Everglades. We analyzed allelic variation at six DNA microsatellite loci to examine the population structure of spotted sunfish (Lepomis punctatus) from the Everglades. We tested the hypothesis that recurrent local extinction and recent regional divisions have had an effect on patterns of genetic diversity. No marked differences were observed in comparisons of the heterozygosity values of sites within and among water management units. No evidence of isolation by distance was detected in a gene flow and distance correlation between subpopulations. Confidence intervals for the estimated F-statistic values crossed zero, indicating that there was no significant genetic difference between subpopulations within a region or between regions. Notably, the genetic variation among subpopulations in a water conservation area was greater than variation among regions (F_SP > F_PT). These data indicate that the spatial scale of recolonization following local extinction appears to be most important within water management units.
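As a back-of-the-envelope illustration of the F-statistic logic used above (this is the textbook Wright formulation, not the study's actual analysis pipeline), differentiation can be computed from allele frequencies as F_ST = (H_T - H_S) / H_T:

```python
import numpy as np

def fst(allele_freqs):
    """allele_freqs: shape (n_subpops, n_alleles), each row summing to 1.
    H_S: mean expected heterozygosity within subpopulations.
    H_T: expected heterozygosity of the pooled population."""
    p = np.asarray(allele_freqs)
    h_s = np.mean(1.0 - np.sum(p ** 2, axis=1))
    p_bar = p.mean(axis=0)                  # pooled allele frequencies
    h_t = 1.0 - np.sum(p_bar ** 2)
    return (h_t - h_s) / h_t

# Two subpopulations, one biallelic locus: near-zero differentiation,
# consistent with the kind of result the abstract reports.
print(fst([[0.6, 0.4], [0.5, 0.5]]))  # ~0.01
```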
Abstract:
With exponentially increasing demand for and use of GIS data visualization systems, in applications such as urban planning, environment and climate change monitoring, weather simulation, and hydrographic gauging, research on and applications and technology for geospatial vector and raster data visualization have become prevalent. However, we observe that current web GIS techniques are only suitable for static vector and raster data with no dynamically overlaid layers. While it is desirable to enable visual exploration of large-scale dynamic vector and raster geospatial data in a web environment, improving the performance between backend datasets and the vector and raster applications remains a challenging technical issue. This dissertation addresses these challenging and unimplemented areas: how to provide a large-scale dynamic vector and raster data visualization service with dynamically overlaid layers, accessible from various client devices through a standard web browser, and how to make the large-scale dynamic vector and raster data visualization service as rapid as the static one. To accomplish this, a large-scale dynamic vector and raster data visualization geographic information system based on parallel map tiling, together with a comprehensive performance improvement solution, is proposed, designed, and implemented. The components include: quadtree-based indexing and parallel map tiling, the Legend String, vector data visualization with dynamic layer overlaying, vector data time series visualization, an algorithm for vector data rendering, an algorithm for raster data re-projection, an algorithm for eliminating superfluous levels of detail, an algorithm for vector data gridding and re-grouping, and cluster-server-side vector and raster data caching.
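For a concrete picture of quadtree-based tile indexing of the kind the dissertation builds on, here is a minimal sketch using the standard Web Mercator tiling scheme (the dissertation's own indexing and parallel tiling pipeline are not reproduced here):

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Map a WGS84 coordinate to (x, y) tile indices at a zoom level."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi)
            / 2.0 * n)
    return x, y

def tile_to_quadkey(x, y, zoom):
    """Encode the tile's quadtree path, one digit per zoom level."""
    digits = []
    for z in range(zoom, 0, -1):
        mask = 1 << (z - 1)
        digits.append(str((1 if x & mask else 0) + (2 if y & mask else 0)))
    return "".join(digits)

x, y = latlon_to_tile(25.76, -80.19, 10)   # e.g., a point in Miami
print(x, y, tile_to_quadkey(x, y, 10))     # tile indices and quadkey
```

The quadkey makes parent/child relationships a simple prefix test, which is what makes quadtree indexing attractive for parallel tiling and caching.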
Abstract:
Calcareous floating periphyton mats in the southern Everglades provide habitat for a diverse macroinvertebrate community that has not been well characterized. Our study described this community in an oligotrophic marsh, compared it with the macroinvertebrate community associated with adjacent epiphytic algae attached to macrophytes in the water column, and examined spatial patterns in density and community structure. The floating periphyton mat (floating mat) and epiphytic algae in the water column (submerged epiphyton) were sampled at 4 sites (1 km apart) in northern Shark River Slough, Everglades National Park (ENP), in the early (July) and late (November) wet season. Two perpendicular 90-m transects were established at each site and 100 samples were taken in a nested design. Sites were located in wet-prairie spikerush-dominated sloughs with similar water depths and emergent macrophyte communities. Floating mats were sampled by taking cores (6-cm diameter) that were sorted under magnification to enumerate infauna retained on a 250-μm-mesh sieve and with a maximum dimension >1 mm. Our results showed that floating mats provide habitat for a macroinvertebrate community with higher densities (no. animals/g ash-free dry mass) of Hyalella azteca, Dasyhelea spp., and Cladocera, and lower densities of Chironomidae and Planorbella spp. than communities associated with submerged epiphyton. Densities of the most common taxa increased 3× to 15× from early to late wet season, and community differences between the 2 habitat types became more pronounced. Floating-mat coverage and estimated floating-mat biomass increased 20 to 30% and 30 to 110%, respectively, at most sites in the late wet season. Some intersite variation was observed in individual taxa, but no consistent spatial pattern in any taxon was detected at any scale (from 0.2 m to 3 km). Floating mats and their resident macroinvertebrate communities are important components in the Everglades food web. This community should be included in environmental monitoring programs because degradation and eventual loss of the calcareous periphyton mat are associated with P enrichment in this ecosystem.
Abstract:
The frequency of extreme environmental events is predicted to increase in the future. Understanding the short- and long-term impacts of these extreme events on large-bodied predators will provide insight into the spatial and temporal scales at which acute environmental disturbances in top-down processes may persist within and across ecosystems. Here, we use long-term studies of movements and age structure of an estuarine top predator—juvenile bull sharks Carcharhinus leucas—to identify the effects of an extreme ‘cold snap’ from 2 to 13 January 2010 over short (weeks) to intermediate (months) time scales. Juvenile bull sharks are typically year-round residents of the Shark River Estuary until they reach 3 to 5 yr of age. However, acoustic telemetry revealed that almost all sharks either permanently left the system or died during the cold snap. For 116 d after the cold snap, no sharks were detected in the system with telemetry or captured during longline sampling. Once sharks returned, both the size structure and abundance of the individuals present in the nursery had changed considerably. During 2010, individual longlines were 70% less likely to capture any sharks, and catch rates on successful longlines were 40% lower than during 2006−2009. Also, all sharks caught after the cold snap were young-of-the-year or neonates, suggesting that the majority of sharks in the estuary were new recruits and several cohorts had been largely lost from the nursery. The longer-term impacts of this change in bull shark abundance to the trophic dynamics of the estuary and the importance of episodic disturbances to bull shark population dynamics will require continued monitoring, but are of considerable interest because of the ecological roles of bull sharks within coastal estuaries and oceans.
Abstract:
Field emission measurements for multistage structured nanotubes (i.e., thin multiwall and single-wall carbon nanotubes grown on multiwall carbon nanotubes) were carried out, and a low turn-on field of ~0.45 V/μm, a high emission current of 450 μA at a field of 1 V/μm, and a large field enhancement factor of ~26200 were obtained. The thin multiwall carbon nanotubes (thin-MWNTs) and single-wall carbon nanotubes (SWNTs) were grown on regular arrays of vertically aligned multiwall carbon nanotubes (MWNTs) on a porous silicon substrate by the chemical vapor deposition (CVD) method. The thin-MWNTs and SWNTs grown on MWNTs in this way form a multistage structure, which gives higher enhancement of the electric field and hence of the electron field emission.
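For context, the field enhancement factor β quoted above is conventionally extracted from a Fowler-Nordheim plot; the standard textbook relation (assumed here, not restated in the abstract) is:

```latex
% Fowler-Nordheim current density for applied macroscopic field E, local
% field \beta E, and emitter work function \phi (A and B are the usual
% Fowler-Nordheim constants):
\[
  J = \frac{A\,(\beta E)^{2}}{\phi}
      \exp\!\left(-\frac{B\,\phi^{3/2}}{\beta E}\right)
\]
% A plot of ln(J/E^2) versus 1/E is then linear with slope
% -B\phi^{3/2}/\beta, from which \beta follows for an assumed \phi.
```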