4 results for location-dependent data query

at Bucknell University Digital Commons - Pennsylvania - USA


Relevance:

100.00%

Publisher:

Abstract:

Meta-cognition, or "thinking about thinking," has been studied extensively in humans, but very little is known about the process in animals. Although great apes and rhesus macaques (Macaca mulatta) have demonstrated multiple apparently meta-cognitive abilities, other species have either been largely ignored or have failed to convincingly display meta-cognitive traits. Recent work by Marsh, however, raised the possibility that some species may possess rudimentary or partial forms of meta-cognition. This thesis sought to investigate this possibility further by running multiple comparative experiments. The goal of the first study was to examine whether lion-tailed macaques, a species that may have a rudimentary form of meta-cognition, are able to use an uncertainty response adaptively, and if so, whether they could use the response flexibly when the stimuli about which the subjects should be uncertain changed. The macaques' acquisition of the initial discrimination task is ongoing, and as such there are not yet data to support a conclusion either way. In the second study, tufted capuchins were required to locate a food reward hidden beneath inverted cups that sat on a Plexiglas tray. In some conditions the capuchins were shown where the food was hidden, in others they could infer its location, and in yet others they were not given information about the location of the food. On all trials, however, capuchins could optionally seek additional information by looking up through the Plexiglas into the cups. In general, capuchins did this less often when they were shown the food reward, but not when they could infer the reward's location. These data suggest that capuchins meta-cognitively control their information seeking only in some conditions, and thus add support to the potential for a rudimentary form of meta-cognition. In convergence with other studies, these results may represent early models for rudimentary meta-cognition, although viable alternative explanations remain.

Relevance:

30.00%

Publisher:

Abstract:

This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the required data for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multicylinder diesel engine have been examined from a model training perspective. A single-cylinder engine with external air-handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a high difference between exhaust and intake manifold pressures (engine ΔP) during transients, it has been recommended that transient emission models should be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations have been made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed. The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh-air flow rates, while the second mode is driven by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, while uneven EGR distribution across cylinders has been shown to be present but unaccounted for by the ECM. The two modes and associated phenomena are essential to understanding why transient emission models are calibration-dependent and, furthermore, how to choose training data that will result in good model generalization.
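The abstract does not spell out how transport delays and sensor lags were handled; as a rough illustration only, the Python sketch below shows one common way to time-align a slow emissions-analyzer signal with fast ECM channels by undoing a known transport delay and approximately inverting a first-order sensor lag. All signal names, the 1.5 s delay, and the 0.8 s time constant are hypothetical and are not taken from the study.

```python
import numpy as np

def correct_transport_delay(t, y, delay_s):
    # Shift the measured signal earlier in time to undo a known transport delay:
    # y_aligned(t) = y_measured(t + delay)
    return np.interp(t, t - delay_s, y)

def compensate_first_order_lag(t, y, tau_s):
    # A first-order sensor obeys tau * dy/dt + y = y_true,
    # so the true signal can be approximated as y + tau * dy/dt.
    # In practice y is usually low-pass filtered first, since the
    # derivative term amplifies measurement noise.
    return y + tau_s * np.gradient(y, t)

# Illustrative example: an idealized opacity step at t = 2 s, measured through
# a sensor with a 0.8 s first-order lag and a 1.5 s transport delay.
t = np.arange(0.0, 10.0, 0.01)                              # 100 Hz time base
lagged = 1.0 - np.exp(-np.clip(t - 2.0, 0.0, None) / 0.8)   # lag only
measured = np.interp(t - 1.5, t, lagged, left=0.0)          # lag plus delay

aligned = correct_transport_delay(t, measured, delay_s=1.5)
recovered = compensate_first_order_lag(t, aligned, tau_s=0.8)
```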

Relevance:

30.00%

Publisher:

Abstract:

Laurentide glaciation during the early Pleistocene (~970 ka) dammed the southeast-flowing West Branch of the Susquehanna River (WBSR), scouring bedrock and creating 100-km-long glacial Lake Lesley near the Great Bend at Muncy, Pennsylvania (Ramage et al., 1998). Local drill logs and well data indicate that subsequent paleo-outwash floods and modern fluvial processes have deposited as much as 30 meters of alluvium in this area, but little is known about the valley fill architecture and the bedrock-alluvium interface. By gaining a greater understanding of the bedrock-alluvium interface, the project will not only supplement existing depth-to-bedrock information but also provide information pertinent to the evolution of the Muncy Valley landscape. This project determined whether variations in the thickness of the valley fill were detectable using micro-gravity techniques to map the bedrock-alluvium interface. The gravity method was deemed appropriate due to the scale of the study area (~30 km²), ease of operation by a single person, and the available geophysical equipment. A LaCoste and Romberg Gravitron unit was used to collect gravitational field readings at 49 locations over 5 transects across the Muncy Creek and Susquehanna River valleys, with at least two gravity base stations per transect. Precise latitude, longitude, and ground surface elevation at each location were measured using an OPUS-corrected Trimble RTK-GPS unit. Base stations were chosen for ease of access because of the necessity of repeat measurements, and gravity measurement locations were selected and marked to allow easy access and repeat measurements. The gravimeter was returned to a base station at least once every two hours, and a looping procedure was used to determine drift and maximize confidence in the gravity measurements. A two-minute calibration reading at each station was used to minimize any tares in the data. The Gravitron digitally recorded finite-impulse-response-filtered gravity measurements every 20 seconds at each station. A measurement period of 15 minutes was used for each base station occupation and a minimum of 5 minutes at all other locations. Longer or multiple measurements were used at some sites if drift or other external disturbances (e.g., train or truck traffic) were affecting readings. The average, median, standard deviation, and 95% confidence interval were calculated for each station. Tidal, drift, latitude, free-air, Bouguer, and terrain corrections were then applied. The results show that the gravitational field decreases as alluvium thickness increases across the axes of the Susquehanna River and Muncy Creek valleys. However, the location of the gravity low does not correspond with the present-day location of the WBSR, suggesting that the WBSR may have been constrained along Bald Eagle Mountain by a glacial lobe originating from the Muncy Creek Valley to the northeast. Using a 3-D inversion model, the topography of the bedrock-alluvium interface was determined over the extent of the study area using a density contrast of -0.8 g/cm³. Our results are consistent with the bedrock geometry of the area and provide a low-cost, non-invasive, and efficient method for exploring the subsurface and supplementing existing well data.
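The abstract names the standard gravity reductions but not their formulas; as a minimal sketch, the Python below applies the textbook latitude, free-air, and simple Bouguer corrections to a reading that is assumed to be already tide- and drift-corrected. The station values are hypothetical, and the study's actual terrain correction and 3-D inversion are not reproduced here.

```python
import numpy as np

FREE_AIR_GRADIENT = 0.3086   # mGal per meter of elevation
BOUGUER_COEFF = 0.04193      # 2*pi*G, in mGal per (g/cm^3 * m)

def theoretical_gravity_mgal(lat_deg):
    # 1980 International Gravity Formula (series approximation), latitude in degrees.
    phi = np.radians(lat_deg)
    return 978032.7 * (1.0 + 0.0053024 * np.sin(phi) ** 2
                           - 0.0000058 * np.sin(2.0 * phi) ** 2)

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, elev_m, density_gcc=2.67):
    # g_obs_mgal is assumed to be tide- and drift-corrected already.
    latitude_corr = theoretical_gravity_mgal(lat_deg)
    free_air_corr = FREE_AIR_GRADIENT * elev_m            # restore elevation effect
    bouguer_corr = BOUGUER_COEFF * density_gcc * elev_m   # remove slab attraction
    return g_obs_mgal - latitude_corr + free_air_corr - bouguer_corr

# Hypothetical station near Muncy, PA (illustrative numbers only).
print(simple_bouguer_anomaly(g_obs_mgal=980123.456, lat_deg=41.2, elev_m=160.0))
```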

Relevance:

30.00%

Publisher:

Abstract:

The new knowledge environments of the digital age are often described as places where we are all closely read, with our buying habits, location, and identities available to advertisers, online merchants, the government, and others through our use of the Internet. This is represented as a loss of privacy in which these entities learn about our activities and desires, using means that were unavailable in the pre-digital era. This article argues that the reciprocal nature of digital networks means 1) that the privacy issues that we face online are not radically different from those of the pre-Internet era, and 2) that we need to reconceive of close reading as an activity of which both humans and computer algorithms are capable.