953 results for Taxi GPS data
Abstract:
The Tara Oceans Expedition (2009-2013) sampled the world's oceans on board a 36 m schooner, collecting environmental data and organisms from viruses to planktonic metazoans for later analysis using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present dataset contains navigation and meteorological data measured during one campaign of the Tara Oceans Expedition. Latitude and longitude were obtained from TSG data.
Abstract:
The metabolic rate of organisms may either be viewed as a basic property from which other vital rates and many ecological patterns emerge, one that follows a universal allometric mass scaling law; or it may be considered a property that emerges from the organism's adaptation to its environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from previous compilations by other authors. Data were read from tables or digitized from graphs. Only measurements made on individuals of known size, or on groups of individuals of similar and known size, were included. We show that clearance and respiration rates have life-form-dependent allometries with similar scaling but different elevations, such that the mass-specific rates converge on a rather narrow size-independent range. In contrast, ingestion and growth rates follow a near-universal taxa-independent ~3/4 mass scaling power law. We argue that the declining mass-specific clearance rates with size within taxa are related to the inherent decrease in feeding efficiency of any particular feeding mode. The transitions between feeding modes, and the simultaneous transitions in clearance and respiration rates, may then represent adaptations to the food environment and result from the optimization of tradeoffs that allow feeding and growth rates sufficient to balance mortality.
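The ~3/4 scaling reported here is a slope in log-log space, so an allometric exponent can be recovered by ordinary least squares on log-transformed mass and rate data. A minimal sketch in Python; the coefficient 0.5 and the mass range are illustrative values, not figures from this compilation:

```python
import math

def fit_power_law(masses, rates):
    """Least-squares fit of log(rate) = log(a) + b*log(mass).
    Returns (a, b); b is the allometric scaling exponent."""
    xs = [math.log(m) for m in masses]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic example: a rate following an exact 3/4 power law of body mass
masses = [10 ** k for k in range(-6, 3)]   # body mass (illustrative units)
rates = [0.5 * m ** 0.75 for m in masses]
a, b = fit_power_law(masses, rates)
```

On real compiled data the scatter around the fitted line, and differences in the intercept `a` between life forms, are what distinguish the taxa-dependent from the near-universal allometries described above.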
Abstract:
To identify the relationship between GPS scintillation at Natal-RN (Brazil) and geomagnetic disturbances of varying intensity, this work analysed ionospheric behaviour and magnetic indexes (Dst, AE and the Bz component of the interplanetary magnetic field) over different periods of the solar cycle between 2000 and 2014. Part of the data for this research originated at the UFRN observatory, from a GEC Plessey board connected to an ANP-C 114 antenna, modified by Cornell University's Space Plasma Physics group in order to run ScintMon, a GPS monitoring program. The study found several cases of scintillation inhibited after the main phase of magnetic storms, a fact that, along with others, corroborates the categorization of Aarons (1991) and the models of the disturbed dynamo (Bonelli, 2008) and over-shielding penetration defended by Kelley et al. (1979) and Abdu (2011) [4]. In addition, different morphologies were noted in these disruptions of the GPS signal depending on the preceding magnetic activity. A moderate relationship (R2 = 0.52) was also found between the Dst rate (over a specific time interval) and the average S4 index through a polynomial function. This finding, corroborating Ilma et al. (2012) [17], is important evidence that GPS scintillations are not directly controlled by the magnetic induction of storms. Finally, this relation proved useful as a means of partially predicting scintillations.
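The R2 = 0.52 quoted for the polynomial relation between the Dst rate and the mean S4 index is a coefficient of determination. A minimal sketch of how such a value is computed; the quadratic coefficients and data points below are hypothetical illustrations, not the study's fitted model:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Hypothetical quadratic model relating Dst rate (x) to mean S4
# (illustrative coefficients, not values from the study)
def s4_model(x):
    return 0.3 + 0.01 * x + 0.001 * x ** 2

dst_rate = [-30, -20, -10, 0, 10]
s4_obs = [1.05, 0.55, 0.28, 0.31, 0.52]
r2 = r_squared(s4_obs, [s4_model(x) for x in dst_rate])
```

An R2 of about 0.5, as reported, means the polynomial explains roughly half of the variance in mean S4, which is consistent with the conclusion that storms modulate, rather than directly control, the scintillation.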
Abstract:
Underwater photo-transect surveys were conducted on September 23-27, 2007 at different sections of the reef flat, reef crest and reef slope in Heron Reef. The survey was done by swimming along pre-defined transect sites and taking a picture of the bottom substrate, parallel to the bottom at a constant vertical distance (30 cm), every two to three metres. A total of 3,586 benthic photos were taken. A floating GPS setup connected to the swimmer/diver by a line enabled recording of the coordinates of the transect surveys. Approximation of the coordinates for each benthic photo was based on the photo timestamp and the GPS coordinate timestamp, using GPS Photo Link Software. Coordinates of each photo were interpolated by finding the GPS coordinates that were logged at a set time before and after the photo was captured. The output of this process was an ArcMap point shapefile, a Google Earth KML file and a thumbnail of each benthic photo taken. The data in the ArcMap shapefile and in the Google Earth KML file consisted of the approximated coordinate of each benthic photo taken during the survey. With the GPS Photo Link extension installed in the ArcMap environment, opening the ArcMap shapefile enables the thumbnail of the associated benthic cover photo to be displayed whenever the mouse hovers over a point on the transect. The extension is installed by downloading the GPSPhotoLink software from www.geospatialexperts.com and installing it as a trial version.
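The interpolation step described above, finding the GPS fixes logged just before and after each photo and blending them by timestamp, can be sketched as follows. The track coordinates and times are hypothetical, and the linear blend is a simplifying assumption about what GPS Photo Link does internally:

```python
from datetime import datetime, timedelta

def interpolate_position(track, photo_time):
    """Linearly interpolate (lat, lon) for a photo timestamp from the two
    GPS fixes logged immediately before and after it.
    track: list of (datetime, lat, lon) tuples sorted by time."""
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(track, track[1:]):
        if t0 <= photo_time <= t1:
            f = (photo_time - t0).total_seconds() / (t1 - t0).total_seconds()
            return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)
    raise ValueError("photo timestamp outside GPS track")

# Hypothetical GPS log and a photo taken midway between two fixes
start = datetime(2007, 9, 23, 10, 0, 0)
track = [
    (start, -23.4420, 151.9140),
    (start + timedelta(seconds=5), -23.4421, 151.9142),
]
lat, lon = interpolate_position(track, start + timedelta(seconds=2.5))
```

A photo timestamp halfway between two fixes lands halfway between their coordinates; this is why synchronising the camera clock to the GPS clock matters for the accuracy of the approximated positions.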
Abstract:
Underwater georeferenced photo-transect surveys were conducted on December 10-15, 2011 at various sections of the reef at Lizard Island, Great Barrier Reef. For this survey a snorkeler or diver swam over the bottom while taking photos of the benthos at a set height using a standard digital camera and towing a GPS in a surface float which logged the track every five seconds. A standard digital compact camera was placed in an underwater housing and fitted with a 16 mm lens, which provided a 1.0 m x 1.0 m footprint at 0.5 m height above the benthos. Horizontal distance between photos was estimated by three fin kicks of the survey diver/snorkeler, which corresponded to a surface distance of approximately 2.0 - 4.0 m. The GPS was placed in a dry bag and logged the position as it floated at the surface while being towed by the photographer. A total of 5,735 benthic photos were taken. The floating GPS setup connected to the swimmer/diver by a line enabled recording of the coordinates of each benthic photo (Roelfsema 2009). Approximation of the coordinates of each benthic photo was conducted based on the photo timestamp and GPS coordinate timestamp, using GPS Photo Link Software (www.geospatialexperts.com). Coordinates of each photo were interpolated by finding the GPS coordinates that were logged at a set time before and after the photo was captured. Benthic or substrate cover data were derived from each photo by randomly placing 24 points over each image using the Coral Point Count for Microsoft Excel program (Kohler and Gill, 2006). Each point was then assigned to 1 of 78 cover types, which represented the benthic feature beneath it. A benthic cover composition summary of the point scores for each photo was generated automatically by the CPCe program. The resulting benthic cover data of each photo were linked to GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 55 South.
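The 24-random-point cover estimate can be sketched as below. Here `photo_labels` is a hypothetical stand-in for the analyst's manual assignment of a cover type to each point in CPCe, and the two-cover "photo" is purely illustrative:

```python
import random
from collections import Counter

def sample_cover(photo_labels, width, height, n_points=24, seed=0):
    """Scatter n_points random points over a photo and tally the cover
    type under each one, returning the fractional composition.
    photo_labels(x, y) -> cover-type string (stands in for the manual
    point assignment done in CPCe)."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_points):
        x = rng.uniform(0.0, width)
        y = rng.uniform(0.0, height)
        counts[photo_labels(x, y)] += 1
    total = sum(counts.values())
    return {cover: n / total for cover, n in counts.items()}

# Toy photo footprint: left half coral, right half sand
composition = sample_cover(lambda x, y: "coral" if x < 0.5 else "sand",
                           width=1.0, height=1.0)
```

With only 24 points per photo the per-photo composition is a coarse estimate; the precision comes from aggregating many photos along each transect.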
Abstract:
Underwater georeferenced photo-transect surveys were conducted on October 3-7, 2012 at various sections of the reef and lagoon at Lizard Island, Great Barrier Reef. For this survey a snorkeler swam while taking photos of the benthos at a set distance from the benthos using a standard digital camera and towing a GPS in a surface float which logged the track every five seconds. A Canon G12 digital camera was placed in a Canon underwater housing and photos were taken at 1 m height above the benthos. Horizontal distance between photos was estimated by three fin kicks of the survey snorkeler, which corresponded to a surface distance of approximately 2.0 - 4.0 m. The GPS was placed in a dry bag and logged the position at the surface while being towed by the photographer (Roelfsema, 2009). A total of 1,265 benthic photos were taken. Approximation of the coordinates of each benthic photo was conducted based on the photo timestamp and GPS coordinate timestamp, using GPS Photo Link Software (www.geospatialexperts.com). Coordinates of each photo were interpolated by finding the GPS coordinates that were logged at a set time before and after the photo was captured. Benthic or substrate cover data were derived from each photo by randomly placing 24 points over each image using the Coral Point Count for Microsoft Excel program (Kohler and Gill, 2006). Each point was then assigned to 1 of 79 cover types, which represented the benthic feature beneath it. A benthic cover composition summary of the point scores for each photo was generated automatically by the CPCe program. The resulting benthic cover data of each photo were linked to GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 55 South.
Abstract:
An object-based image analysis (OBIA) approach was used to create a habitat map of the Lizard Reef. Briefly, georeferenced dive and snorkel photo-transect surveys were conducted at different locations surrounding Lizard Island, Australia. For the surveys, a snorkeler or diver swam over the bottom at a depth of 1-2 m in the lagoon, One Tree Beach and Research Station areas, and at 7 m depth in Watson's Bay, while taking photos of the benthos at a set height using a standard digital camera and towing a surface-float GPS which logged its track every five seconds. The camera lens provided a 1.0 m x 1.0 m footprint at 0.5 m height above the benthos. Horizontal distance between photos was estimated by fin kicks, and corresponded to a surface distance of approximately 2.0 - 4.0 m. Approximation of the coordinates of each benthic photo was done based on the photo timestamp and GPS coordinate timestamp, using GPS Photo Link Software (www.geospatialexperts.com). Coordinates of each photo were interpolated by finding the GPS coordinates that were logged at a set time before and after the photo was captured. Dominant benthic or substrate cover type was assigned to each photo by randomly placing 24 points over each image using the Coral Point Count for Microsoft Excel program (Kohler and Gill, 2006). Each point was then assigned a dominant cover type using a benthic cover type classification scheme containing nine first-level categories: seagrass high (>=70%), seagrass moderate (40-70%), seagrass low (<=30%), coral, reef matrix, algae, rubble, rock and sand. Benthic cover composition summaries of each photo were generated automatically in CPCe. The resulting benthic cover data for each photo were linked to GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 56 South. The OBIA class assignment followed a hierarchical scheme based on membership rules, with levels for "reef", "geomorphic zone" and "benthic community" (the categories above).
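The first-level category assignment from the percent-cover thresholds can be sketched as a simple rule cascade. Note that the scheme as stated leaves the 30-40% seagrass band unassigned; this sketch folds that band into "seagrass low", which is an assumption, not part of the published scheme:

```python
def first_level_class(composition):
    """Assign one of the nine first-level categories from a photo's
    percent-cover summary (values in percent). The 30-40% seagrass
    band, unassigned in the published scheme, is treated as
    'seagrass low' here (an assumption)."""
    sg = composition.get("seagrass", 0.0)
    if sg >= 70:
        return "seagrass high"
    if sg >= 40:
        return "seagrass moderate"
    if sg > 0:
        return "seagrass low"
    # No seagrass: fall back to the dominant remaining cover type
    others = {k: v for k, v in composition.items() if k != "seagrass"}
    return max(others, key=others.get)
```

The OBIA step then applies analogous membership rules hierarchically, first deciding "reef", then "geomorphic zone", and only then the benthic community class for each image object.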
Abstract:
Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels and heat pumps, as they have the potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, the requirement for maintenance management also increases. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which carry costs in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes, which determine the limits according to the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The degradation points identified for each case study by a PF, a GP and a hybrid (PF and GP combined) DbM implementation are assessed against known degradation points.
The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components which are in continuous operation. For components which have seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSC operation.
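The PF side of DbM, tracking a noisy degradation metric and raising maintenance when it crosses a limit, can be sketched with a minimal bootstrap particle filter. The random-walk process model, the noise levels and the limit of 1.0 are illustrative assumptions, not the thesis's tuned models:

```python
import math
import random

def particle_filter(observations, n_particles=500, process_sd=0.05,
                    obs_sd=0.2, seed=1):
    """Bootstrap particle filter for a scalar degradation metric.
    Process model: zero-drift random walk; observation model: Gaussian
    noise. Returns the filtered mean at each time step."""
    rng = random.Random(seed)
    particles = [0.0] * n_particles
    means = []
    for z in observations:
        # Propagate particles through the random-walk process model
        particles = [p + rng.gauss(0.0, process_sd) for p in particles]
        # Weight by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((z - p) / obs_sd) ** 2)
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Filtered estimate, then multinomial resampling
        means.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means

# Hypothetical metric drifting upward, with a maintenance limit of 1.0
data_rng = random.Random(0)
true_metric = [0.02 * t for t in range(100)]
observed = [x + data_rng.gauss(0.0, 0.2) for x in true_metric]
estimates = particle_filter(observed)
alarm_step = next((t for t, m in enumerate(estimates) if m > 1.0), None)
```

The filtered mean crosses the limit well before the raw observations settle above it, which is the property DbM exploits: maintenance is invoked at the detected degradation point rather than at a fixed inspection interval.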