Abstract:
The seminal multi-view stereo benchmark evaluations from Middlebury and by Strecha et al. have played a major role in propelling the development of multi-view stereopsis methodology. Although seminal, these benchmark datasets are limited in scope, with few reference scenes. Here, we take these works a step further by proposing a new multi-view stereo dataset that is an order of magnitude larger in number of scenes, with a significant increase in diversity. Specifically, we propose a dataset containing 80 scenes of large variability. Each scene consists of 49 or 64 accurate camera positions and reference structured-light scans, all acquired by a 6-axis industrial robot. For this dataset we propose an extension of the Middlebury evaluation protocol, reflecting the more complex geometry of some of our scenes. The proposed dataset is used to evaluate the state-of-the-art multi-view stereo algorithms of Tola et al., Campbell et al., and Furukawa et al. We thereby demonstrate the usability of the dataset and gain insight into the workings and challenges of multi-view stereopsis. Through these experiments we empirically validate some of the central hypotheses of multi-view stereopsis, and determine and reaffirm some of its central challenges.
Abstract:
In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision can be realized: (1) controlling industrial robots for accurate machining; (2) compensation of measurements for thermal expansion; (3) compensation of measurements for refractive-index changes; (4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and (5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation, together with control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large-volume measurement process models within an integrated dimensional variation management (IDVM) system.
Abstract:
A wide range of metrology processes are involved in the manufacture of large products. In addition to the traditional tool-setting and product-verification operations, increasingly flexible metrology-enabled automation is also being used. Faced with many possible measurement problems and a very large number of metrology instruments employing diverse technologies, selecting the appropriate instrument for a given task can be highly complex. Moreover, as metrology has become a key manufacturing process, it should be considered in the early stages of design, and there is currently very little research to support this. This paper provides an overview of the important selection criteria for typical measurement processes and presents some novel selection strategies. Metrics that can be used to assess measurability are also discussed. A prototype instrument selection and measurability analysis application is presented, with discussion of how it can be used as the basis for a more sophisticated measurement planning tool.
Abstract:
A partition of a positive integer n is a way of writing it as a sum of positive integers without regard to order; the summands are called parts. The number of partitions of n, usually denoted by p(n), is determined asymptotically by the famous partition formula of Hardy and Ramanujan [5]. We shall introduce the uniform probability measure P on the set of all partitions of n, assigning the probability 1/p(n) to each n-partition. The symbols E and Var will further be used to denote the expectation and variance with respect to the measure P. Thus, each conceivable numerical characteristic of the parts in a partition can be regarded as a random variable.
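For illustration, the exact count p(n) can be computed with Euler's pentagonal-number recurrence and compared against the leading-order Hardy-Ramanujan asymptotic p(n) ~ exp(pi*sqrt(2n/3)) / (4n*sqrt(3)). This is a sketch; the function names are ours, not from the paper:

```python
import math

def partitions(n):
    """Exact p(n) via Euler's pentagonal-number recurrence:
    p(n) = sum_{k>=1} (-1)^(k+1) * [p(n - k(3k-1)/2) + p(n - k(3k+1)/2)]."""
    p = [1] + [0] * n
    for i in range(1, n + 1):
        k, total = 1, 0
        while True:
            g1 = k * (3 * k - 1) // 2   # generalized pentagonal numbers
            g2 = k * (3 * k + 1) // 2
            if g1 > i and g2 > i:
                break
            sign = -1 if k % 2 == 0 else 1
            if g1 <= i:
                total += sign * p[i - g1]
            if g2 <= i:
                total += sign * p[i - g2]
            k += 1
        p[i] = total
    return p[n]

def hardy_ramanujan(n):
    """Leading-order Hardy-Ramanujan asymptotic for p(n)."""
    return math.exp(math.pi * math.sqrt(2 * n / 3)) / (4 * n * math.sqrt(3))
```

For example, `partitions(5)` returns 7 (the seven partitions of 5), and the asymptotic already agrees with the exact count to within a few percent by n = 100.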
Abstract:
2000 Mathematics Subject Classification: 78A50
Abstract:
The subject of dropout prevention/reduction is deservedly receiving attention as a problem that, if not resolved, could threaten our national future. This study investigates a small segment of the overall dropout problem, one with apparently unique features of program design and population selection. The evidence presented here should add to the knowledge bank on this complicated problem. Project Trio was one of a number of dropout prevention programs and activities conducted in Dade County during the 1984-85 and 1985-86 school years, and it is investigated here longitudinally through the end of the 1987-88 school year. It involved 17 junior and senior high schools and 27 programs, 10 in the first year and 17 in the second, with over 1,000 students in total. Students were selected by the schools from a district-provided list of "at risk" students and divided approximately evenly, in the classical research design, into an experimental group and a control group; following standard procedure, the control group took the regular school curriculum. No school had more than 25 students in either group. Each school modified the basic design of the project to accommodate its individual characteristics and the perceived needs of its students; however, all school projects were to include some form of academic enhancement, counseling, and career awareness study. The conclusion of this study was that the control group had a significantly lower dropout rate than the experimental group.
Though it is impossible to determine with certainty the reasons for this unexpected result, the evidence presented suggests that one cause may have been inadequate administration at the local level. This study was also a longitudinal investigation of the "at risk" population as a whole over the three- and four-year period, to determine whether academic factors present in student records may be used to identify dropout proneness. A significant correlation was found between dropping out and various measures, including scores on the Quality of School Life instrument, attendance, grade point averages, mathematics grades, and being overage in grade, which are important identifiers in selection for dropout prevention programs.
Abstract:
The single-spin asymmetry, A_LT′, and the polarized structure function, σ_LT′, for the p(e⃗, e′K⁺)Λ reaction in the resonance region have been measured and extracted using the CEBAF Large Acceptance Spectrometer (CLAS) at Jefferson Lab. Data were taken at an electron beam energy of 2.567 GeV. The large acceptance of CLAS allows full azimuthal-angle coverage over a large range of center-of-mass scattering angles. Results were obtained spanning Q² from 0.5 to 1.3 GeV² and W from threshold up to 2.1 GeV, and were compared to existing theoretical calculations. The polarized structure function is sensitive to the interferences between various resonant amplitudes, as well as between resonant and non-resonant amplitudes. This measurement is essential for understanding the structure of nucleons and for searching for previously undetected nucleon excited states (resonances) predicted by quark models. The W dependence of σ_LT′ in the kinematic regions dominated by s- and u-channel exchange (cos θ_K^c.m. = −0.50, −0.167, 0.167) indicated possible resonance structures not predicted by theoretical calculations. The behavior of σ_LT′ around W = 1.875 GeV could be the signature of a resonance predicted by the quark models and possibly seen in photoproduction. At the very forward angles where the reaction is dominated by the t-channel, the average σ_LT′ was zero; there was no indication of interference between resonances or between resonant and non-resonant amplitudes, which might indicate the dominance of a single t-channel exchange. A study of the sensitivity of the fifth-structure-function data to the resonance around 1900 MeV showed that these data are highly sensitive to the models' various assumptions for the quantum numbers of this resonance. This project was part of a larger CLAS program to measure cross sections and polarization observables for kaon electroproduction in the nucleon resonance region.
Abstract:
Anthropogenic habitat alterations and water-management practices have imposed an artificial spatial scale onto the once-contiguous freshwater marshes of the Florida Everglades. To gain insight into how these changes may affect biotic communities, we examined whether the abundance and community structure of large fishes (SL > 8 cm) in Everglades marshes varied more at regional or intraregional scales, and whether this variation was related to hydroperiod, water depth, floating mat volume, and vegetation density. From October 1997 to October 2002, we used an airboat electrofisher to sample large fishes at sites within three regions of the Everglades. Each of these regions is subject to a unique water-management schedule. Dry-down events (water depth < 10 cm) occurred at several sites during spring in 1999, 2000, 2001, and 2002; the 2001 event was the most severe and widespread. Abundance of several fishes decreased significantly through time, and the number of days post-dry-down covaried significantly with abundance for several species. Processes operating at the regional scale appear to play important roles in regulating large fishes. The most pronounced patterns in abundance and community structure occurred at the regional scale, and the effect size for region was greater than that for sites nested within region for the abundance of all species combined, all predators combined, and each of the seven most abundant species. Non-metric multidimensional scaling revealed distinct groupings of sites corresponding to the three regions. We also found significant variation in community structure through time that correlated with the number of days post-dry-down. Our results suggest that hydroperiod and water management at the regional scale influence large-fish communities of Everglades marshes.
Abstract:
Modern geographical databases, which are at the core of geographic information systems (GIS), store a rich set of aspatial attributes in addition to geographic data. Typically, aspatial information comes in textual and numeric format. Retrieving information constrained on both spatial and aspatial data gives GIS users the ability to perform more interesting spatial analyses, and lets applications support composite location-aware searches; for example, in a real estate database: “Find the nearest homes for sale to my current location that have a backyard and whose prices are between $50,000 and $80,000”. Efficient processing of such queries requires combined indexing strategies over multiple types of data. Existing spatial query engines commonly apply a two-filter approach (a spatial filter followed by a non-spatial filter, or vice versa), which can incur large performance overheads. More recently, the amount of geolocation data in databases has grown rapidly, due in part to advances in geolocation technologies (e.g., GPS-enabled smartphones) that allow users to associate location data with objects or events; this poses data-ingestion challenges for practical GIS databases handling large data volumes. In this dissertation, we first show how indexing spatial data with R-trees (a typical data pre-processing task) can be scaled in MapReduce, a widely adopted parallel programming model for data-intensive problems. The evaluation of our algorithms on a Hadoop cluster showed close to linear scalability in building R-tree indexes. Subsequently, we develop efficient algorithms for processing spatial queries with aspatial conditions; to that end, novel techniques for simultaneously indexing spatial, textual, and numeric data are developed.
Experimental evaluations with real-world, large spatial datasets measured query response times within the sub-second range in most cases, and up to a few seconds in a small number of cases, which is reasonable for interactive applications. Overall, these results show that the MapReduce parallel model is suitable for indexing tasks in spatial databases, and that an adequate combination of spatial and aspatial attribute indexes can attain acceptable response times for interactive spatial queries with constraints on aspatial data.
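The real-estate example can be made concrete with a minimal sketch of the baseline two-filter evaluation that the dissertation's combined indexes improve upon. The class and function names below are ours, not from the dissertation, and the linear scans stand in for real spatial and attribute indexes:

```python
from dataclasses import dataclass

@dataclass
class Listing:
    x: float           # longitude
    y: float           # latitude
    price: float       # asking price in dollars
    has_backyard: bool

def two_filter_query(listings, bbox, price_range, backyard=True):
    """Two-filter evaluation: a spatial filter (bounding box) first,
    then an aspatial filter (price range, backyard flag) over the hits.
    bbox = (xmin, ymin, xmax, ymax); price_range = (low, high)."""
    xmin, ymin, xmax, ymax = bbox
    spatial_hits = [h for h in listings
                    if xmin <= h.x <= xmax and ymin <= h.y <= ymax]
    low, high = price_range
    return [h for h in spatial_hits
            if h.has_backyard == backyard and low <= h.price <= high]
```

The performance overhead the abstract mentions arises because the first filter may produce a large intermediate result that the second filter must scan in full; a combined spatial-plus-attribute index avoids materializing that intermediate set.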
Abstract:
A number of factors influence the information processing needs of organizations, particularly with respect to the coordination and control mechanisms within a hotel. The authors use a theoretical framework to illustrate alternative mechanisms that can be used to coordinate and control hotel operations.
Abstract:
Low-rise buildings are often subjected to high wind loads during hurricanes that lead to severe damage and cause water intrusion. It is therefore important to estimate accurate wind pressures for design purposes in order to reduce losses. Wind loads measured on low-rise buildings can differ significantly depending upon the laboratory in which they were measured. The differences are due in large part to inadequate laboratory simulation of the low-frequency content of atmospheric velocity fluctuations and to the small scale of the models used for the measurements. A new partial turbulence simulation methodology was developed for simulating the effect of low-frequency flow fluctuations on low-rise buildings more effectively, from the point of view of testing accuracy and repeatability, than is currently the case. The methodology was validated by comparing aerodynamic pressure data for building models obtained in the open-jet 12-Fan Wall of Wind (WOW) facility against their counterparts in a boundary-layer wind tunnel. Field measurements of pressures on the Texas Tech University building and the Silsoe building were also used for validation. Tests in partial simulation are free of integral-length-scale constraints, meaning that model length scales in such testing are limited only by blockage considerations. Thus the partial simulation methodology can be used to produce aerodynamic data for low-rise buildings using large-scale models in wind tunnels and WOW-like facilities. This is a major advantage, because large-scale models allow for accurate modeling of architectural details, testing at higher Reynolds numbers, greater spatial resolution of the pressure taps in high-pressure zones, and assessment of the performance of aerodynamic devices intended to reduce wind effects.
The technique eliminates a major cause of discrepancies among measurements conducted in different laboratories and can help to standardize flow simulations for testing residential homes, as well as significantly improve testing accuracy and repeatability. Partial turbulence simulation was used in the WOW to determine the performance of discontinuous perforated parapets in mitigating roof pressures. Comparisons of pressures with and without parapets showed significant reductions in pressure coefficients in the zones with high suctions, demonstrating the potential of such aerodynamic add-on devices to reduce uplift forces.
Abstract:
During recent human history, human activities such as overhunting and habitat destruction have severely impacted many large top predator populations around the world. Studies from a variety of ecosystems show that loss or diminishment of top predator populations can have serious consequences for population and community dynamics and ecosystem stability. However, there are relatively few studies of the roles of large top predators in coastal ecosystems, so that we do not yet completely understand what could happen to coastal areas if large top predators are extirpated or significantly reduced in number. This lack of knowledge is surprising given that coastal areas around the globe are highly valued and densely populated by humans, and thus coastal large top predator populations frequently come into conflict with coastal human populations. This paper reviews what is known about the ecological roles of large top predators in coastal systems and presents a synthesis of recent work from three coastal eastern US Long Term Ecological Research (LTER) sites where long-term studies reveal what appear to be common themes relating to the roles of large top predators in coastal systems. We discuss three specific themes: (1) large top predators acting as mobile links between disparate habitats, (2) large top predators potentially affecting nutrient and biogeochemical dynamics through localized behaviors, and (3) individual specialization of large top predator behaviors. We also discuss how research within the LTER network has led to enhanced understanding of the ecological roles of coastal large top predators. Highlighting this work is intended to encourage further investigation of the roles of large top predators across diverse coastal aquatic habitats and to better inform researchers and ecosystem managers about the importance of large top predators for coastal ecosystem health and stability.
Abstract:
Long-span bridges are flexible and therefore sensitive to wind-induced effects. One way to improve the stability of long-span bridges against flutter is to use cross-sections that involve twin side-by-side decks. However, this can amplify responses due to vortex-induced oscillations. Wind tunnel testing is a well-established practice for evaluating the stability of bridges against wind loads. In order to study the response of the prototype in the laboratory, dynamic similarity requirements should be satisfied. One parameter that is normally violated in wind tunnel testing is the Reynolds number. In this dissertation, the effects of Reynolds number on the aerodynamics of a double-deck bridge were evaluated by measuring fluctuating forces on a motionless sectional model of a bridge at different wind speeds representing different Reynolds regimes. The efficacy of vortex mitigation devices was also evaluated in different Reynolds-number regimes. Another parameter frequently ignored in wind tunnel studies is the correct simulation of turbulence characteristics. Due to the difficulty of simulating flow with a large turbulence length scale on a sectional model, wind tunnel tests are often performed in smooth flow as a conservative approach. The validity of the simplifying assumptions in the calculation of buffeting loads, as the direct impact of turbulence, needs to be verified for twin-deck bridges. The effects of turbulence characteristics were investigated by testing sectional models of a twin-deck bridge under two different turbulent flow conditions. Not only do the flow properties play an important role in the aerodynamic response of the bridge, but the geometry of the cross-section shape is also expected to have significant effects.
In this dissertation, the effects of deck details, such as the width of the gap between the twin decks and traffic barriers, on the aerodynamic characteristics of a twin-deck bridge were investigated, particularly the vortex shedding forces, with the aim of clarifying how these shape details can alter the wind-induced responses. Finally, a summary of the issues involved in designing a dynamic test rig for high-Reynolds-number tests is given, using the studied cross-section as an example.
Abstract:
Large-extent vegetation datasets that co-occur with long-term hydrology data provide new ways to develop biologically meaningful hydrologic variables and to determine plant community responses to hydrology. We analyzed the suitability of different hydrological variables to predict vegetation in two water conservation areas (WCAs) in the Florida Everglades, USA, and developed metrics to define realized hydrologic optima and tolerances. Using vegetation data spatially co-located with long-term hydrological records, we evaluated seven variables describing water depth, hydroperiod length, and number of wet/dry events; each variable was tested for 2-, 4- and 10-year intervals for Julian annual averages and environmentally-defined hydrologic intervals. Maximum length and maximum water depth during the wet period, calculated for environmentally-defined hydrologic intervals over a 4-year period, were the best predictors of vegetation type. Proportional abundance of vegetation types along hydrological gradients indicated that communities had different realized optima and tolerances across WCAs. Although in both WCAs the trees/shrubs class was on the drier/shallower end of hydrological gradients while slough communities occupied the wetter/deeper end, the distribution of Cladium, Typha, wet prairie, and Salix communities, which were intermediate for most hydrological variables, varied in proportional abundance along hydrologic gradients between WCAs, indicating that realized optima and tolerances are context-dependent.
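As an illustration of how such variables might be derived from a daily water-depth record, here is a hedged sketch computing simple versions of three of the metrics named above: hydroperiod length (longest run of wet days), maximum water depth during the wet period, and the number of wet/dry transitions. The threshold and function names are ours; the study's environmentally-defined hydrologic intervals are more involved than this:

```python
def hydro_metrics(depths, wet_threshold=0.0):
    """Given a daily water-depth series (cm), return:
    (longest wet run in days, max depth over wet days, wet/dry transitions).
    A day is 'wet' when depth > wet_threshold."""
    longest_wet, current_wet = 0, 0
    max_wet_depth = float("-inf")
    transitions = 0
    prev_wet = None
    for d in depths:
        wet = d > wet_threshold
        if wet:
            current_wet += 1
            longest_wet = max(longest_wet, current_wet)
            max_wet_depth = max(max_wet_depth, d)
        else:
            current_wet = 0
        if prev_wet is not None and wet != prev_wet:
            transitions += 1   # one wet/dry event boundary crossed
        prev_wet = wet
    return longest_wet, (max_wet_depth if longest_wet else None), transitions
```

For a series like `[0, 5, 12, 3, 0, 0, 8, 20, 0]` this yields a longest wet run of 3 days, a maximum wet-period depth of 20 cm, and 4 transitions.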
Abstract:
Large amounts of the greenhouse gas methane are released from the seabed to the water column, where it may be consumed by aerobic methanotrophic bacteria. This microbial filter is consequently the last marine sink for methane before its liberation to the atmosphere. The size and activity of methanotrophic communities, which determine the capacity of the water-column methane filter, are thought to be mainly controlled by nutrient and redox dynamics, but little is known about the effects of ocean currents. Here, we report measurements of methanotrophic activity and biomass (CARD-FISH) at methane seeps west of Svalbard and relate them to physical water-mass properties (CTD) and modelled current dynamics. We show that cold bottom water containing a large number of aerobic methanotrophs was rapidly displaced by warmer water with a considerably smaller methanotrophic community. This water-mass exchange, caused by short-term variations of the West Spitsbergen Current, constitutes a rapid oceanographic switch that severely reduces methanotrophic activity in the water column. Strong and fluctuating currents are widespread oceanographic features common at many methane seep systems and are thus likely to affect methane oxidation in the ocean water column globally.