865 results for multi-environments experiments


Relevance: 30.00%

Abstract:

While robots are gradually becoming a part of our daily lives, they already play vital roles in many critical operations. Some of these critical tasks include surgeries, battlefield operations, and tasks that take place in hazardous environments or distant locations such as space missions. In most of these tasks, remotely controlled robots are used instead of autonomous robots. This special area of robotics is called teleoperation. Teleoperation systems must be reliable when used in critical tasks; hence, all of the subsystems must be dependable even under a subsystem or communication line failure. These systems are categorized as unilateral or bilateral teleoperation. A special type of bilateral teleoperation is force-reflecting teleoperation, which is further investigated as limited- and unlimited-workspace teleoperation. The teleoperation systems configured in this study are tested both in numerical simulations and in experiments. A new method, Virtual Rapid Robot Prototyping, is introduced to create system models rapidly and accurately. This method is then extended to configure experimental setups in which actual master systems work with system models of the slave robots, accompanied by virtual reality screens, as well as with the actual slaves. Fault-tolerant design and modeling of the master and slave systems are also addressed at different levels to prevent subsystem failure. Teleoperation controllers are designed to compensate for instabilities due to communication time delays. Modifications to the existing controllers are proposed to configure a controller that remains reliable under communication line failures. Position/force controllers are also introduced for master and/or slave robots. Controller architecture changes are then discussed in order to make these controllers dependable even in systems experiencing communication problems. The customary and proposed controllers for teleoperation systems are tested in numerical simulations on single- and multi-DOF teleoperation systems. Experimental studies are then conducted on seven different systems, covering limited- and unlimited-workspace teleoperation, to verify and improve the simulation studies. In the experiments, the proposed controllers performed successfully relative to the customary controllers. Overall, by employing the fault-tolerance features and the proposed controllers, it is possible to design and configure a more reliable teleoperation system, allowing these systems to be used in a wider range of critical missions.
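
This abstract does not spell out the controller structure; as a hedged illustration of one classic remedy for delay-induced instability in bilateral teleoperation (not necessarily the controllers proposed here), the sketch below implements the wave-variable transformation, which keeps the communication channel passive under any constant delay. The wave impedance `b` and all names are illustrative assumptions.

```python
import numpy as np

def to_wave(force, velocity, b):
    """Encode power variables (force, velocity) into wave variables.

    b is the wave impedance, a positive tuning constant. Transmitting
    u and v instead of force and velocity keeps the delayed channel
    passive for any constant delay (Niemeyer-Slotine wave variables).
    """
    u = (force + b * velocity) / np.sqrt(2.0 * b)  # wave sent forward
    v = (force - b * velocity) / np.sqrt(2.0 * b)  # wave sent back
    return u, v

def from_wave(u, v, b):
    """Decode received wave variables back into force and velocity."""
    force = np.sqrt(b / 2.0) * (u + v)
    velocity = (u - v) / np.sqrt(2.0 * b)
    return force, velocity

# Round-trip check with arbitrary values
u, v = to_wave(2.0, 0.5, b=10.0)
print(from_wave(u, v, b=10.0))  # -> (2.0, 0.5) up to float error
```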

Relevance: 30.00%

Abstract:

The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording, and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has focused mostly on the Internet and fixed networks, but the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is essential to documenting network incidents; however, in network topology spaces, location cannot be measured due to the absence of a 'distance metric'. Therefore, a novel solution was proposed to label the locations of nodes within network topology spaces, and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursive process. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques were a starting point for further research and development that would result in equipping future ad hoc networks with forensic components to complement existing security mechanisms.
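
The dissertation's reporting mechanism is not detailed in this abstract; as a minimal sketch of the underlying DHT idea (the class, names, and node count below are assumptions), a logged incident is stored on the node that owns the hash of its key, so any participant can later retrieve it without flooding the network:

```python
import hashlib

class ToyDHT:
    """Minimal illustrative DHT: maps hashed keys onto a fixed set of nodes.

    A real implementation (e.g., Chord or Kademlia) routes lookups in
    O(log n) hops; here a dict per node stands in for networked storage.
    """

    def __init__(self, num_nodes: int = 16):
        self.nodes = [{} for _ in range(num_nodes)]

    def _node_for(self, key: str) -> int:
        # Hash the key and map it onto one of the participating nodes
        digest = hashlib.sha1(key.encode()).hexdigest()
        return int(digest, 16) % len(self.nodes)

    def put(self, key: str, record: dict) -> None:
        self.nodes[self._node_for(key)][key] = record

    def get(self, key: str):
        return self.nodes[self._node_for(key)].get(key)

dht = ToyDHT()
dht.put("incident:0007", {"node": "ad-hoc-12", "event": "spoofed beacon"})
print(dht.get("incident:0007"))
```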

Relevance: 30.00%

Abstract:

Freshwater ecosystems have been recognized as important components of the global carbon cycle, and the flux of organic matter (OM) from freshwater to marine environments can significantly affect estuarine and coastal productivity. The focus of this study was the assessment of carbon dynamics in two aquatic environments, namely the Florida Everglades and small prairie streams in Kansas, with the aim of characterizing the biogeochemistry of OM. In the Everglades, particulate OM (POM) is mostly found as a layer of flocculent material (floc). While floc is believed to be the main energy source driving trophic dynamics in this oligotrophic wetland, not much is known about its biogeochemistry. The objective of this study was to determine the origin and sources of OM in floc using biomarkers and pigment-based chemotaxonomy to assess specific biomass contributions to this material on spatial (freshwater marshes vs. mangrove fringe) and seasonal (wet vs. dry) scales. It was found that floc OM is derived from the local vegetation (mainly algal components and macrophyte litter) and that its composition is controlled by seasonal drivers of hydrology and local biomass productivity. Photo-reactivity experiments showed that light exposure of floc resulted in photo-dissolution of POC, generating significant amounts of both dissolved OM (DOM) and nutrients (N and P) and potentially influencing nutrient dynamics in this ecosystem. Bio-reactivity, determined as the amount and rate of CO2 evolution during incubation, was found to vary on seasonal and spatial scales and was highly influenced by phosphorus limitation. Not much is known about OM dynamics in small headwater streams. The objective of this part of the study was to determine carbon dynamics in sediments from intermittent prairie streams, characterized by different vegetation cover in their watershed (C4 grasses) vs. riparian zone (C3 plants). Sedimentary OM was characterized using a biomarker and compound-specific carbon stable isotope approach. It was found that the biomarker composition of these sediments is dominated by higher plant inputs from the riparian zone, although inputs from adjacent prairie grasses were also apparent. Conflicting to some extent with the River Continuum Concept, sediments of the upper reaches contained more degraded OM, while the lower reaches were enriched in fresh material derived from higher plants and plankton sources as a result of hydrological regimes and particle sorting.

Relevance: 30.00%

Abstract:

Hydrologic modifications have negatively impacted the Florida Everglades in numerous significant ways. The compartmentalization of the once continuously flowing system into the Water Conservation Areas (WCAs) disrupted the slow natural flow of water south from Lake Okeechobee through the Everglades to Florida Bay. The ponding of water in the WCAs, the linking of water flow to controlled water levels, and the management of water levels for anthropogenic rather than ecological well-being have caused a reduction in the spatial heterogeneity of the Everglades, leading to greater uniformity in topography and vegetation. These effects are noticeable as the structural degradation of the Everglades Ridge and Slough environment and its associated Tree Islands. In aquatic systems, water flow is of fundamental importance in shaping the structure and function of the ecosystem. The organized patterns of parallel orientation of ridges, sloughs, and tear-drop-shaped tree islands along historic flow paths attest to the importance of water movement in structuring this system. Our main objective was to operate and manage the LILA facility as a research platform for an integrated group of multidisciplinary, multi-agency scientists collaborating on multifunctional studies aimed primarily at determining the effects of CERP water management scenarios on the ecology of tree islands and ridge and slough habitats. We support Everglades water management, CERP, and the Long-Term Plan by defining hydrologic regimes that sustain healthy tree island and ridge and slough ecosystems. Information gained through this project will help to reduce the uncertainty in predicting the response of tree island and ridge and slough ecosystems to changes in hydrologic conditions. Additionally, we have developed the LILA site as a visual example of Everglades restoration programs in action.

Relevance: 30.00%

Abstract:

With the introduction of new input devices, such as multi-touch surface displays, the Nintendo WiiMote, the Microsoft Kinect, and the Leap Motion sensor, among others, the field of Human-Computer Interaction (HCI) finds itself at an important crossroads that requires solving new challenges. Given the amount of three-dimensional (3D) data available today, 3D navigation plays an important role in 3D User Interfaces (3DUI). This dissertation deals with multi-touch 3D navigation: how users can explore 3D virtual worlds using a multi-touch, non-stereo, desktop display. The contributions of this dissertation include a feature-extraction algorithm for multi-touch displays (FETOUCH), a multi-touch and gyroscope interaction technique (GyroTouch), a theoretical model for multi-touch interaction using high-level Petri Nets (PeNTa), an algorithm to resolve ambiguities in the multi-touch gesture classification process (Yield), a proposed technique for navigational experiments (FaNS), a proposed gesture (Hold-and-Roll), and an experiment prototype for 3D navigation (3DNav). The verification experiment for 3DNav was conducted with 30 human subjects of both genders. The experiment used the 3DNav prototype to present a pseudo-universe in which each user was required to find five objects using the multi-touch display and five objects using a game controller (GamePad). For the multi-touch display, 3DNav used a commercial library called GestureWorks in conjunction with Yield to resolve the ambiguity posed by the multiplicity of gestures reported by the initial classification. The experiment compared both devices. The task completion time with multi-touch was slightly shorter, but the difference was not statistically significant. The design of the experiment also included an equation that determined each subject's level of video game console expertise, which was used to divide users into two groups: casual users and experienced users. The study found that experienced gamers performed significantly faster with the GamePad than casual users. When looking at the groups separately, casual gamers performed significantly better using the multi-touch display than the GamePad. Additional results are found in this dissertation.
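
The raw data and the expertise equation are not reproduced in this abstract, so the sketch below only illustrates the kind of within-subjects comparison described: each of the 30 subjects used both devices, so completion times can be compared with a paired t-test. The numbers are synthetic placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Synthetic completion times in seconds (placeholders, not study data);
# each subject completed the task with both devices (within-subjects),
# so a paired test is the appropriate comparison.
rng = np.random.default_rng(1)
multi_touch = rng.normal(95.0, 15.0, size=30)
gamepad = rng.normal(100.0, 15.0, size=30)

t_stat, p_value = stats.ttest_rel(multi_touch, gamepad)
# A p-value >= 0.05 would correspond to the reported "not statistically
# significant" difference in completion time.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```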

Relevance: 30.00%

Abstract:

A pilot-scale multi-media filtration system was used to evaluate the effectiveness of filtration in removing petroleum hydrocarbons from a source water contaminated with diesel fuel. Source water was artificially prepared by mixing bentonite clay and tap water to produce a turbidity range of 10-15 NTU. Diesel fuel concentrations of 150 ppm or 750 ppm were used to contaminate the source water. The coagulants used included Cat Floc K-10 and Cat Floc T-2. The experimental phase was conducted under direct filtration conditions at constant head and constant-rate filtration at 8.0 gpm. Filtration experiments were run until the filter reached its clogging point, as indicated by a measured peak pressure loss of 10 psi. The experimental variables included the type of coagulant, oil concentration, and source water. Filtration results were evaluated based on turbidity removal and petroleum hydrocarbon (PHC) removal efficiency as measured by gas chromatography. Experiments indicated that clogging was controlled by the clay loading on the filter and that inadequate destabilization of the contaminated water by the coagulant limited PHC removal.

Relevance: 30.00%

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4,000,000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from Southern Hemisphere and polar-ocean studies remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.

Relevance: 30.00%

Abstract:

Educational Data Mining is an application domain of artificial intelligence that has been extensively explored in recent years. Technological advances, and in particular the increasing use of virtual learning environments, have allowed the generation of considerable amounts of data to be investigated. Among the activities to be treated in this context is the prediction of students' school performance, which can be accomplished through the use of machine learning techniques. Such techniques may be used to classify students into predefined labels. One strategy for applying these techniques consists of combining them to design multi-classifier systems, whose effectiveness has been demonstrated by results achieved in studies in several areas, such as medicine, commerce, and biometrics. The data used in the experiments were obtained from the interactions between students in one of the most widely used virtual learning environments, Moodle. In this context, this paper presents the results of several experiments that include the use of specific multi-classifier systems, called ensembles, aiming to reach better results in school performance prediction, that is, the highest accuracy percentage in classifying students. This paper therefore presents a significant exploration of educational data and analyzes the relevant results of these experiments.
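
As a hedged sketch of the ensemble idea described (the feature set and estimator choices are assumptions, not the paper's exact configuration), scikit-learn's VotingClassifier combines several base learners into one multi-classifier system:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Stand-in for Moodle interaction features (e.g., forum posts, quiz
# attempts, resource views) labeled with pass/fail outcomes.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=42)),
        ("nb", GaussianNB()),
    ],
    voting="soft",  # average predicted probabilities across members
)
scores = cross_val_score(ensemble, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f}")
```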

Relevance: 30.00%

Abstract:

The Lena River Delta, situated in Northern Siberia (72.0-73.8° N, 122.0-129.5° E), is the largest Arctic delta, covering 29,000 km**2. Since natural deltas are characterised by complex geomorphological patterns and various types of ecosystems, high spatial resolution information on the distribution and extent of the delta environments is necessary for a spatial assessment and accurate quantification of the biogeochemical processes that drive the emission of greenhouse gases from tundra soils. In this study, the first land cover classification for the entire Lena Delta based on Landsat 7 Enhanced Thematic Mapper (ETM+) images was conducted and used for the quantification of methane emissions from the delta ecosystems on the regional scale. The applied supervised minimum distance classification was very effective with the few ancillary data that were available for training site selection. Nine land cover classes of aquatic and terrestrial ecosystems in the wetland-dominated (72%) Lena Delta could be defined by this classification approach. The mean daily methane emission of the entire Lena Delta was calculated as 10.35 mg CH4/m**2/d. Taking our multi-scale approach into account, we find that the methane source strength of certain tundra wetland types is lower than previously calculated on coarser scales.
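
A minimum distance classifier of the kind applied here assigns each pixel to the class whose training-site mean spectrum is nearest. Below is a minimal NumPy sketch; the band values and class names are illustrative, not the study's training data.

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel to the class with the nearest spectral mean.

    pixels: (n_pixels, n_bands) array of, e.g., Landsat ETM+ reflectances.
    class_means: (n_classes, n_bands) array of training-site mean spectra.
    Returns an (n_pixels,) array of class indices.
    """
    # Euclidean distance from every pixel to every class mean
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return np.argmin(d, axis=1)

# Toy example: 3 bands, 2 classes (e.g., 'wet tundra' vs. 'water')
means = np.array([[0.12, 0.18, 0.25], [0.03, 0.05, 0.02]])
px = np.array([[0.11, 0.17, 0.24], [0.04, 0.06, 0.03]])
print(minimum_distance_classify(px, means))  # -> [0 1]
```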

Relevance: 30.00%

Abstract:

Coral reef maps at various spatial scales and extents are needed for mapping, monitoring, modelling, and management of these environments. High spatial resolution satellite imagery (pixel size <10 m), integrated with field survey data and processed with various mapping approaches, can provide these maps. These approaches have been accurately applied to single reefs (10-100 km**2), covering one high spatial resolution scene from which a single thematic layer (e.g. benthic community) is mapped. This article demonstrates how a hierarchical mapping approach can be applied to coral reefs from individual-reef to reef-system scales (10-1000 km**2) using object-based image classification of high spatial resolution images guided by ecological and geomorphological principles. The approach is demonstrated for three individual reefs (10-35 km**2) in Australia, Fiji, and Palau, and for three complex reef systems (300-600 km**2): one in the Solomon Islands and two in Fiji. Archived high spatial resolution images were pre-processed, and mosaics were created for the reef systems. Georeferenced benthic photo transect surveys were used to acquire cover information. Field and image data were integrated using an object-based image analysis approach that resulted in a hierarchically structured classification. Objects were assigned class labels based on the dominant benthic cover type, on location-relevant ecological and geomorphological principles, or on a combination thereof. This generated a hierarchical sequence of reef maps with increasing complexity in benthic thematic information: 'reef', 'reef type', 'geomorphic zone', and 'benthic community'. The overall accuracy of the 'geomorphic zone' classification for each of the six study sites was 76-82% using 6-10 mapping categories. For the 'benthic community' classification, the overall accuracy was 52-75%, with individual reefs having 14-17 categories and reef systems 20-30 categories. We show that an object-based classification of high spatial resolution imagery, guided by field data and ecological and geomorphological principles, can produce consistent, accurate benthic maps at four hierarchical spatial scales for coral reefs of various sizes and complexities.
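
As an illustrative sketch of the hierarchical labelling described (the class names below are invented placeholders; the study's catalogues hold 6-30 categories per level), each fine benthic label can be rolled up deterministically to its coarser levels:

```python
# Hypothetical hierarchy: each fine 'benthic community' label rolls up to
# a coarser 'geomorphic zone' and then to a 'reef type'. The actual class
# catalogues in the study are larger and site-specific.
HIERARCHY = {
    "coral/algae matrix": ("reef slope", "fringing reef"),
    "rubble with sparse coral": ("reef crest", "fringing reef"),
    "seagrass on sand": ("lagoon floor", "barrier reef"),
}

def roll_up(benthic_label: str):
    """Return the (geomorphic zone, reef type) pair for a benthic label."""
    return HIERARCHY[benthic_label]

print(roll_up("seagrass on sand"))  # -> ('lagoon floor', 'barrier reef')
```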

Relevance: 30.00%

Abstract:

The aim of this work is to evaluate the SEE sensitivity of a multi-core processor that implements ECC and parity in its cache memories. Two different application scenarios are studied. The first configures the multi-core in Asymmetric Multi-Processing mode running a memory-bound application, whereas the second uses the Symmetric Multi-Processing mode running a CPU-bound application. The experiments were validated through radiation ground testing performed with 14 MeV neutrons on the Freescale P2041 multi-core manufactured in 45 nm SOI technology. A deep analysis of the errors observed in the cache memories was carried out in order to reveal vulnerabilities in the cache protection mechanisms. Critical zones such as tag addresses were affected during the experiments. In addition, the results show that the sensitivity strongly depends on the application and the multi-processing mode used.
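
A standard way to normalise error counts from such neutron ground testing (the numbers below are hypothetical, not the paper's results) is the per-bit SEE cross-section: the ratio of observed events to the product of the integrated fluence and the number of exposed bits.

```python
def see_cross_section(n_events: int, fluence_n_cm2: float, n_bits: int) -> float:
    """Per-bit SEE cross-section in cm^2/bit.

    n_events: upsets observed during the run
    fluence_n_cm2: integrated neutron fluence (n/cm^2)
    n_bits: number of memory bits exposed to the beam
    """
    return n_events / (fluence_n_cm2 * n_bits)

# Hypothetical run: 42 upsets, 1e11 n/cm^2, 36 Mbit of cache under test
sigma = see_cross_section(42, 1e11, 36 * 2**20)
print(f"sigma = {sigma:.2e} cm^2/bit")
```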

Relevance: 30.00%

Abstract:

Terrestrial ecosystems, occupying more than 25% of the Earth's surface, can serve as 'biological valves' in regulating the anthropogenic emissions of atmospheric aerosol particles and greenhouse gases (GHGs) as responses to their surrounding environments. While the significance of quantifying the exchange rates of GHGs and atmospheric aerosol particles between the terrestrial biosphere and the atmosphere is hardly questioned in many scientific fields, progress in improving model predictability, data interpretation, or the combination of the two remains impeded by the lack of a precise framework elucidating their dynamic transport processes over a wide range of spatiotemporal scales. The difficulty in developing prognostic modeling tools to quantify the source or sink strength of these atmospheric substances is further magnified by the fact that the climate system is also sensitive to the feedback from terrestrial ecosystems, forming the so-called 'feedback cycle'. Hence, the emergent need is to reduce uncertainties when assessing this complex and dynamic feedback cycle, which is necessary to support decisions on mitigation and adaptation policies associated with human activities (e.g., anthropogenic emission controls and land use management) under current and future climate regimes. With the goal of improving predictions for the biosphere-atmosphere exchange of biologically active gases and atmospheric aerosol particles, the main focus of this dissertation is on revising and up-scaling the biotic and abiotic transport processes from leaf to canopy scales. The validity of previous modeling studies in determining the exchange rate of gases and particles is evaluated with detailed descriptions of their limitations. Mechanistic modeling approaches, along with empirical studies across different scales, are employed to refine the mathematical descriptions of surface conductance responsible for gas and particle exchanges as commonly adopted by all operational models. Specifically, how variation in horizontal leaf area density within the vegetated medium, leaf size, and leaf microroughness impacts the aerodynamic attributes, and thereby the ultrafine particle collection efficiency at the leaf/branch scale, is explored using wind tunnel experiments interpreted with a porous media model and a scaling analysis. A multi-layered and size-resolved second-order closure model, combined with particle flux and concentration measurements within and above a forest, is used to explore the particle transport processes within the canopy sub-layer and the partitioning of particle deposition onto the canopy medium and the forest floor. For gases, a modeling framework accounting for leaf-level boundary layer effects on the stomatal pathway for gas exchange is proposed and combined with sap flux measurements in a wind tunnel to assess how leaf-level transpiration varies with increasing wind speed. How exogenous environmental conditions and endogenous soil-root-stem-leaf hydraulic and eco-physiological properties impact the above- and below-ground water dynamics in the soil-plant system and shape plant responses to droughts is assessed by a porous media model that accommodates the transient water flow within the plant vascular system and is coupled with the aforementioned leaf-level gas exchange model and soil-root interaction model. It should be noted that tackling all aspects of the potential issues causing uncertainties in forecasting the feedback cycle between terrestrial ecosystems and the climate is unrealistic in a single dissertation, but further research questions and opportunities based on the foundation derived from this dissertation are also briefly discussed.
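
The dissertation's own formulations are multi-layered and size-resolved; as a minimal single-layer reference point (an illustrative sketch, not the dissertation's model), the widely used resistance analogy expresses the dry deposition velocity of a gas as the inverse of three resistances acting in series:

```python
def deposition_velocity(r_a: float, r_b: float, r_c: float) -> float:
    """Gas deposition velocity from the standard resistance analogy (m/s).

    r_a: aerodynamic resistance (turbulent transport toward the surface)
    r_b: quasi-laminar boundary-layer resistance
    r_c: surface (canopy/stomatal) resistance
    All resistances in s/m; the three pathways act in series.
    """
    return 1.0 / (r_a + r_b + r_c)

# Hypothetical daytime values over a forest canopy
print(f"v_d = {deposition_velocity(30.0, 20.0, 100.0) * 100:.2f} cm/s")
```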
