825 results for Tracking and trailing.
Abstract:
1. Bee populations and other pollinators face multiple, synergistically acting threats, which have led to population declines, loss of local species richness and pollination services, and extinctions. However, our understanding of the degree, distribution and causes of declines is patchy, in part due to inadequate monitoring systems, with the challenge of taxonomic identification posing a major logistical barrier. Pollinator conservation would benefit from a high-throughput identification pipeline. 2. We show that the metagenomic mining and resequencing of mitochondrial genomes (mitogenomics) can be applied successfully to bulk samples of wild bees. We assembled the mitogenomes of 48 UK bee species and then shotgun-sequenced total DNA extracted from 204 whole bees that had been collected in 10 pan-trap samples from farms in England and been identified morphologically to 33 species. Each sample data set was mapped against the 48 reference mitogenomes. 3. The morphological and mitogenomic data sets were highly congruent. Out of 63 total species detections in the morphological data set, the mitogenomic data set made 59 correct detections (93.7% detection rate) and detected six more species (putative false positives). Direct inspection and an analysis with species-specific primers suggested that these putative false positives were most likely due to incorrect morphological IDs. Read frequency significantly predicted species biomass frequency (R2 = 24.9%). Species lists, biomass frequencies, extrapolated species richness and community structure were recovered with less error than in a metabarcoding pipeline. 4. Mitogenomics automates the onerous task of taxonomic identification, even for cryptic species, allowing the tracking of changes in species richness and distributions. A mitogenomic pipeline should thus be able to contain costs, maintain consistently high-quality data over long time series, incorporate retrospective taxonomic revisions and provide an auditable evidence trail. Mitogenomic data sets also provide estimates of species counts within samples and thus have potential for tracking population trajectories.
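To make the mapping-and-detection step concrete, here is a minimal Python sketch, assuming per-species mapped-read counts are already available; the species names, counts, detection threshold and regression values are all invented for illustration and do not reproduce the study's pipeline or data.

```python
# Hypothetical sketch of the mapping-based detection step: species names,
# read counts and biomass values are illustrative, not from the study.
import numpy as np

def detect_species(read_counts, min_reads=10):
    """Flag species whose mapped-read count exceeds a detection threshold."""
    return {sp: n for sp, n in read_counts.items() if n >= min_reads}

# Mapped reads per reference mitogenome for one pan-trap sample (made up).
reads = {"Bombus_terrestris": 5400, "Apis_mellifera": 120, "Lasioglossum_sp": 7}
print(detect_species(reads))  # Lasioglossum_sp falls below the threshold

# Read frequency vs. biomass frequency across species (illustrative values),
# mirroring the kind of fit reported in the abstract.
read_freq = np.array([0.40, 0.25, 0.20, 0.10, 0.05])
biomass_freq = np.array([0.35, 0.30, 0.15, 0.12, 0.08])
slope, intercept = np.polyfit(biomass_freq, read_freq, 1)
r2 = np.corrcoef(biomass_freq, read_freq)[0, 1] ** 2
print(f"slope={slope:.2f}, R^2={r2:.2f}")
```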
Abstract:
1. Species’ distributions are likely to be affected by a combination of environmental drivers. We used a data set of 11 million species occurrence records over the period 1970–2010 to assess changes in the frequency of occurrence of 673 macro-moth species in Great Britain. Groups of species with different predicted sensitivities showed divergent trends, which we interpret in the context of land-use and climatic changes. 2. A diversity of responses was revealed: 260 moth species declined significantly, whereas 160 increased significantly. Overall, frequencies of occurrence declined, mirroring trends in less species-rich, yet more intensively studied taxa. 3. Geographically widespread species, which were predicted to be more sensitive to land use than to climate change, declined significantly in southern Britain, where the cover of urban and arable land has increased. 4. Moths associated with low nitrogen and open environments (based on their larval host plant characteristics) declined most strongly, which is also consistent with a land-use change explanation. 5. Some moths that reach their northern (leading edge) range limit in southern Britain increased, whereas species restricted to northern Britain (trailing edge) declined significantly, consistent with a climate change explanation. 6. Not all species of a given type behaved similarly, suggesting that complex interactions between species’ attributes and different combinations of environmental drivers determine frequency of occurrence changes. 7. Synthesis and applications. Our findings are consistent with large-scale responses to climatic and land-use changes, with some species increasing and others decreasing. We suggest that land-use change (e.g. habitat loss, nitrogen deposition) and climate change are both major drivers of moth biodiversity change, acting independently and in combination. Importantly, the diverse responses revealed in this species-rich taxon show that multifaceted conservation strategies are needed to minimize negative biodiversity impacts of multiple environmental changes. We suggest that habitat protection, management and ecological restoration can mitigate combined impacts of land-use change and climate change by providing environments that are suitable for existing populations and also enable species to shift their ranges.
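A minimal sketch of how a frequency-of-occurrence trend of this kind can be estimated, assuming the occurrence records have already been reduced to per-year detection proportions for one species; the year grid and frequencies below are invented.

```python
# Minimal sketch of a frequency-of-occurrence trend for one species,
# assuming records reduced to per-year detection proportions (values invented).
import numpy as np

years = np.array([1970, 1980, 1990, 2000, 2010])
# Fraction of surveyed grid cells in which the species was recorded.
freq = np.array([0.42, 0.38, 0.33, 0.29, 0.24])

slope, intercept = np.polyfit(years, freq, 1)
print(f"trend: {slope * 10:+.3f} per decade")  # negative slope -> decline
```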
Abstract:
Background: Health care literature supports the development of accessible interventions that integrate behavioral economics, wearable devices, principles of evidence-based behavior change, and community support. However, there are limited real-world examples of large-scale, population-based, member-driven reward platforms. Consequently, a paucity of outcome data exists and health economic effects remain largely theoretical. To complicate matters, an emerging area of research is defining the role of Superusers, the small percentage of unusually engaged digital health participants who may influence other members. Objective: The objective of this preliminary study is to analyze descriptive data from GOODcoins, a self-guided, free-to-consumer engagement and rewards platform incentivizing walking, running and cycling. Registered members accessed the GOODcoins platform through PCs, tablets or mobile devices, and had the opportunity to sync wearables to track activity. Following registration, members were encouraged to join gamified group challenges and compare their progress with that of others. As members met challenge targets, they were rewarded with GOODcoins, which could be redeemed for planet- or people-friendly products. Methods: Outcome data were obtained from the GOODcoins custom SQL database. The reporting period was December 1, 2014 to May 1, 2015. Descriptive self-report data were analyzed using MySQL and MS Excel. Results: The study period includes data from 1298 users who were connected to an exercise tracking device. Females made up 52.6% (n=683) of the study population, 33.7% (n=438) were between the ages of 20-29, and 24.8% (n=322) were between the ages of 30-39. Of connected and active members, 77.5% (n=1006) met the daily recommended physical activity guideline of 30 minutes, with a total daily average activity of 107 minutes (95% CI 90, 124). Of all connected and active users, 96.1% (n=1248) listed walking as their primary activity. For members who exchanged GOODcoins, the mean balance was 4,000 (95% CI 3850, 4150) at time of redemption, and 50.4% (n=61) of exchanges were for fitness or outdoor products, while 4.1% (n=5) were for food-related items. Participants were most likely to complete challenges when rewards were between 201 and 300 GOODcoins. Conclusions: The purpose of this study is to form a baseline for future research. Overall, results indicate that challenges and incentives may be effective for connected and active members, and may play a role in achieving daily recommended activity guidelines. Registrants were typically younger, walking was the primary activity, and rewards were mainly exchanged for fitness or outdoor products. It remains to be determined whether members were already physically active at the time of registration and are representative of healthy adherers, or were previously inactive and were incentivized to change their behavior. As challenges are gamified, there is an opportunity to investigate the role of Superusers and healthy adherers, impacts on behavioral norms, and how cooperative games and incentives can be leveraged across stratified populations. Study limitations and future research agendas are discussed.
Abstract:
Retrograde transport of NF-κB from the synapse to the nucleus in neurons is mediated by the dynein/dynactin motor complex and can be triggered by synaptic activation. The calibre of axons is highly variable, ranging down to 100 nm, which complicates the investigation of transport processes in neurites of living neurons using conventional light microscopy. In this study we quantified for the first time the transport of the NF-κB subunit p65 using high-density single-particle tracking in combination with photoactivatable fluorescent proteins in living mouse hippocampal neurons. We detected an increase of the mean diffusion coefficient (Dmean) in neurites from 0.12 ± 0.05 µm2/s to 0.61 ± 0.03 µm2/s after stimulation with glutamate. We further observed that the relative amount of retrogradely transported p65 molecules is increased after stimulation. Glutamate treatment resulted in an increase of the mean retrograde velocity from 10.9 ± 1.9 to 15 ± 4.9 µm/s, whereas a velocity increase from 9 ± 1.3 to 14 ± 3 µm/s was observed for anterogradely transported p65. This study demonstrates for the first time that glutamate stimulation leads to an increased mobility of single NF-κB p65 molecules in neurites of living hippocampal neurons.
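Diffusion coefficients of the kind quoted above are conventionally obtained from the mean squared displacement of single-particle tracks, MSD(τ) = 4Dτ in two dimensions. The following is an illustrative sketch on a synthetic trajectory, not the authors' analysis code; the frame interval and track length are assumptions.

```python
# Illustrative estimate of a diffusion coefficient from one 2-D track via the
# mean squared displacement, MSD(tau) = 4*D*tau; the trajectory is synthetic.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.05                      # assumed frame interval in seconds
D_true = 0.12                  # um^2/s, matching the pre-stimulation value
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(500, 2))
track = np.cumsum(steps, axis=0)  # x, y positions in um

def msd(track, max_lag):
    """MSD for lags 1..max_lag, averaged over all start points."""
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag]) ** 2, axis=1))
                     for lag in range(1, max_lag + 1)])

lags = np.arange(1, 11) * dt
D_est = np.polyfit(lags, msd(track, 10), 1)[0] / 4.0  # slope = 4D in 2-D
print(f"D = {D_est:.3f} um^2/s")
```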
Abstract:
The challenge of moving past the classic Window Icons Menus Pointer (WIMP) interface, i.e. by turning it ‘3D’, has resulted in much research and development. To evaluate the impact of 3D on the ‘finding a target picture in a folder’ task, we built a 3D WIMP interface that allowed the systematic manipulation of visual depth, visual aids, and the semantic category distribution of targets versus non-targets, as well as the detailed measurement of lower-level stimulus features. Across two separate experiments, one large-sample web-based experiment to understand associations, and one in a controlled lab environment using eye tracking to understand user focus, we investigated how visual depth, use of visual aids, use of semantic categories, and lower-level stimulus features (i.e. contrast, colour and luminance) impact how successfully participants are able to search for, and detect, the target image. Moreover, in the lab-based experiment, we captured pupillometry measurements to allow consideration of the influence of increasing cognitive load as a result of either an increasing number of items on the screen or the inclusion of visual depth. Our findings showed that increasing the visible layers of depth, and inclusion of converging lines, did not impact target detection times, errors, or failure rates. Low-level features, including colour, luminance, and number of edges, did correlate with differences in target detection times, errors, and failure rates. Our results also revealed that semantic sorting algorithms significantly decreased target detection times. Increased semantic contrast between a target and its neighbours correlated with an increase in detection errors. Finally, pupillometric data did not provide evidence of any correlation between the number of visible layers of depth and pupil size; however, using structural equation modelling, we demonstrated that cognitive load does influence detection failure rates when there is luminance contrast between the target and its surrounding neighbours. Results suggest that WIMP interaction designers should consider stimulus-driven factors, which were shown to influence the efficiency with which a target icon can be found in a 3D WIMP interface.
Abstract:
The impact of the inter-El Niño (EN) variability on moisture availability over Southeastern South America (SESA) is investigated. An automatic tracking scheme was also used to analyze the properties of extratropical cyclones (system density, SD, and central pressure, CP) in this region. During the austral summer period from 1977-2000, the differences in the upper-level wave train anomaly composites seem to determine the rainfall composite differences. In fact, the positive rainfall anomalies over most of the SESA domain during the strong EN events are explained by an upper-level cyclonic center over the tropics and an anticyclonic center over the eastern subtropical area. This pattern seems to contribute to upward vertical motion at 500 hPa and reinforcement of the meridional moisture transport from the equatorial Atlantic Ocean and western Amazon basin to the SESA region. These features may contribute to the positive SD and negative CP anomalies, explaining part of the positive rainfall anomalies found there. On the other hand, negative rainfall anomalies are located in the northern part of SESA for the weak EN years when compared to those for the strong events, while positive anomalies, albeit less intense, are found in the southern part. This was associated with a weakening of the meridional moisture transport from the tropics to SESA, which seems to have contributed to smaller SD and CP anomalies over most of the subtropics when compared to the strong EN years.
Abstract:
The representation of interfaces by means of the algebraic moving-least-squares (AMLS) technique is addressed. This technique, in which the interface is represented by an unconnected set of points, is interesting for evolving fluid interfaces since there is no surface connectivity. The position of the surface points can thus be updated without concerns about the quality of any surface triangulation. We introduce a novel AMLS technique especially designed for evolving-interface applications that we denote RAMLS (for Robust AMLS). The main advantages with respect to previous AMLS techniques are: increased robustness, computational efficiency, and freedom from user-tuned parameters. Further, we propose a new front-tracking method based on the Lagrangian advection of the unconnected point set that defines the RAMLS surface. We assume that a background Eulerian grid is defined with some grid spacing h. The advection of the point set makes the surface evolve in time. The point cloud can be regenerated at any time (in particular, we regenerate it each time step) by intersecting the gridlines with the evolved surface, which guarantees that the density of points on the surface is always well balanced. The intersection algorithm is essentially a ray-tracing algorithm, well studied in computer graphics, in which a line (ray) is traced so as to detect all intersections with a surface. Also, the tracing of each gridline is independent and can thus be performed in parallel. Several tests are reported assessing first the accuracy of the proposed RAMLS technique, and then of the front-tracking method based on it. Comparison with previous Eulerian, Lagrangian and hybrid techniques encourages further development of the proposed method for fluid mechanics applications.
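A toy two-dimensional analogue of the point-regeneration step may help: the sketch below intersects the gridlines of a background grid with an implicit surface (a circle standing in for the evolved RAMLS surface) by bisection. The function names, grid extent and tolerance are invented for illustration.

```python
# Toy 2-D analogue of regenerating the point cloud by gridline intersection,
# assuming the evolved surface is available in implicit form (here a circle).
import numpy as np

def regenerate_points(phi, h, extent=1.5, tol=1e-10):
    """Intersect the gridlines of a grid with spacing h against the zero
    level of phi by bisection, returning the regenerated point cloud."""
    pts = []
    lines = np.arange(-extent, extent + h, h)
    for c in lines:
        # One pass per orientation: vertical line x=c, horizontal line y=c.
        for make in (lambda t: (c, t), lambda t: (t, c)):
            for a, b in zip(lines[:-1], lines[1:]):
                fa, fb = phi(*make(a)), phi(*make(b))
                if fa * fb < 0:                   # sign change: one crossing
                    while b - a > tol:
                        m = 0.5 * (a + b)
                        if fa * phi(*make(m)) <= 0:
                            b = m
                        else:
                            a, fa = m, phi(*make(m))
                    pts.append(make(0.5 * (a + b)))
    return np.array(pts)

circle = lambda x, y: x * x + y * y - 1.0   # unit circle as the "surface"
cloud = regenerate_points(circle, h=0.1)
print(len(cloud), "surface points; max |radius-1| =",
      np.abs(np.hypot(cloud[:, 0], cloud[:, 1]) - 1).max())
```

As in the abstract, each gridline is processed independently, so the loop over lines is trivially parallelizable.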
Abstract:
Purpose - The purpose of this paper is to develop a novel unstructured simulation approach for injection molding processes described by the Hele-Shaw model. Design/methodology/approach - The scheme involves dual dynamic meshes with active and inactive cells determined from an initial background pointset. The quasi-static pressure solution in each timestep for this evolving unstructured mesh system is approximated using a control volume finite element method formulation coupled to a corresponding modified volume of fluid method. The flow is considered to be isothermal and non-Newtonian. Findings - Supporting numerical tests and performance studies for polystyrene described by Carreau, Cross, Ellis and Power-law fluid models are conducted. Results for the present method are shown to be comparable to those from other methods for both Newtonian fluid and polystyrene fluid injected in different mold geometries. Research limitations/implications - With respect to the methodology, the background pointset implies a mesh that is dynamically reconstructed here, and there are a number of efficiency issues and improvements that would be relevant to industrial applications. For instance, one can use the pointset to construct special bases and invoke a so-called "meshless" scheme using the basis. This would require some interesting strategies to deal with the dynamic point enrichment of the moving front that could benefit from the present front treatment strategy. There are also issues related to mass conservation and fill-time errors that might be addressed by introducing suitable projections. The general question of "rate of convergence" of these schemes requires analysis. Numerical results here suggest first-order accuracy and are consistent with the approximations made, but theoretical results are not available yet for these methods. Originality/value - This novel unstructured simulation approach involves dual meshes with active and inactive cells determined from an initial background pointset: local active dual patches are constructed "on-the-fly" for each "active point" to form a dynamic virtual mesh of active elements that evolves with the moving interface.
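The fluid models named in the findings are standard shear-thinning viscosity laws. The sketch below gives textbook forms of three of them (the Ellis model, usually written in terms of shear stress, is omitted); the parameter values are placeholders, not the paper's polystyrene fits.

```python
# Standard shear-thinning viscosity laws of the kind named in the abstract;
# all parameter values are illustrative placeholders.
import numpy as np

def power_law(gdot, K=1e4, n=0.3):
    """eta = K * gdot^(n-1)"""
    return K * gdot ** (n - 1.0)

def carreau(gdot, eta0=1e4, eta_inf=0.0, lam=1.0, n=0.3):
    """eta = eta_inf + (eta0 - eta_inf) * (1 + (lam*gdot)^2)^((n-1)/2)"""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * gdot) ** 2) ** ((n - 1.0) / 2.0)

def cross(gdot, eta0=1e4, eta_inf=0.0, lam=1.0, m=0.7):
    """eta = eta_inf + (eta0 - eta_inf) / (1 + (lam*gdot)^m)"""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * gdot) ** m)

# Viscosity (Pa.s) at low, moderate and high shear rates (1/s).
for gdot in (0.1, 10.0, 1000.0):
    print(f"{gdot:7.1f}  {power_law(gdot):10.1f} {carreau(gdot):10.1f} {cross(gdot):10.1f}")
```

Note how the Carreau and Cross forms plateau at eta0 for small shear rates, while the pure power law diverges there; this is one reason the bounded models are preferred for polymer melts.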
Abstract:
The Main Injector Neutrino Oscillation Search (MINOS) experiment uses an accelerator-produced neutrino beam to perform precision measurements of the neutrino oscillation parameters in the "atmospheric neutrino" sector associated with muon neutrino disappearance. This long-baseline experiment measures neutrino interactions in Fermilab's NuMI neutrino beam with a near detector at Fermilab and again 735 km downstream with a far detector in the Soudan Underground Laboratory in northern Minnesota. The two detectors are magnetized steel-scintillator tracking calorimeters. They are designed to be as similar as possible in order to ensure that differences in detector response have minimal impact on the comparisons of event rates, energy spectra and topologies that are essential to MINOS measurements of oscillation parameters. The design, construction, calibration and performance of the far and near detectors are described in this paper.
Abstract:
We present a variable time step, fully adaptive in space, hybrid method for the accurate simulation of incompressible two-phase flows in the presence of surface tension in two dimensions. The method is based on the hybrid level set/front-tracking approach proposed in [H. D. Ceniceros and A. M. Roma, J. Comput. Phys., 205, 391-400, 2005]. Geometric, interfacial quantities are computed from front-tracking via the immersed-boundary setting, while the signed distance (level set) function, which is evaluated fast and to machine precision, is used as a fluid indicator. The surface tension force is obtained by employing the mixed Eulerian/Lagrangian representation introduced in [S. Shin, S. I. Abdel-Khalik, V. Daru and D. Juric, J. Comput. Phys., 203, 493-516, 2005], whose success in greatly reducing parasitic currents has been demonstrated. The use of our accurate fluid indicator together with effective Lagrangian marker control enhances this parasitic current reduction by several orders of magnitude. To resolve sharp gradients and salient flow features accurately and efficiently we employ dynamic, adaptive mesh refinements. This spatial adaptation is used in concert with a dynamic control of the distribution of the Lagrangian nodes along the fluid interface and a variable time step, linearly implicit time integration scheme. We present numerical examples designed to test the capabilities and performance of the proposed approach as well as three applications: the long-time evolution of a fluid interface undergoing Rayleigh-Taylor instability, an example of bubble ascending dynamics, and a drop impacting on a free interface whose dynamics we compare with both existing numerical and experimental data.
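As an aside on the fluid-indicator idea, a signed distance function is commonly turned into a smoothed Heaviside that blends material properties across the interface. The sketch below shows one standard such construction; the interface width, densities and sign convention are illustrative assumptions, not the paper's settings.

```python
# A common smoothed Heaviside built from a signed distance function, used to
# blend density/viscosity across an interface; 'eps' (half-width) is illustrative.
import numpy as np

def smoothed_heaviside(phi, eps):
    """0 inside (phi < -eps), 1 outside (phi > eps), smooth in between."""
    return np.where(phi < -eps, 0.0,
           np.where(phi > eps, 1.0,
                    0.5 * (1.0 + phi / eps + np.sin(np.pi * phi / eps) / np.pi)))

phi = np.linspace(-0.1, 0.1, 5)    # signed distance samples near the front
rho1, rho2 = 1000.0, 1.0           # e.g. liquid/gas densities (placeholders)
# Assumed convention: phi < 0 is the liquid side.
rho = rho2 + (rho1 - rho2) * (1.0 - smoothed_heaviside(phi, eps=0.05))
print(rho)
```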
Abstract:
I consider the case for genuinely anonymous web searching. Big data seems to have it in for privacy. The story is well known, particularly since the dawn of the web. Vastly more personal information, monumental and quotidian, is gathered than in the pre-digital days. Once gathered it can be aggregated and analyzed to produce rich portraits, which in turn permit unnerving prediction of our future behavior. The new information can then be shared widely, limiting prospects and threatening autonomy. How should we respond? Following Nissenbaum (2011) and Brunton and Nissenbaum (2011 and 2013), I will argue that the proposed solutions—consent, anonymity as conventionally practiced, corporate best practices, and law—fail to protect us against routine surveillance of our online behavior. Brunton and Nissenbaum rightly maintain that, given the power imbalance between data holders and data subjects, obfuscation of one’s online activities is justified. Obfuscation works by generating “misleading, false, or ambiguous data with the intention of confusing an adversary or simply adding to the time or cost of separating good data from bad,” thus decreasing the value of the data collected (Brunton and Nissenbaum, 2011). The phenomenon is as old as the hills. Natural selection evidently blundered upon the tactic long ago. Take a savory butterfly whose markings mimic those of a toxic cousin. From the point of view of a would-be predator the data conveyed by the pattern is ambiguous. Is the bug lunch or potential last meal? In the light of the steep costs of a mistake, the savvy predator goes hungry. Online obfuscation works similarly, attempting for instance to disguise the surfer’s identity (Tor) or the nature of her queries (Howe and Nissenbaum 2009). Yet online obfuscation comes with significant social costs. First, it implies free riding. If I’ve installed an effective obfuscating program, I’m enjoying the benefits of an apparently free internet without paying the costs of surveillance, which are shifted entirely onto non-obfuscators. Second, it permits sketchy actors, from child pornographers to fraudsters, to operate with near impunity. Third, online merchants could plausibly claim that, when we shop online, surveillance is the price we pay for convenience. If we don’t like it, we should take our business to the local brick-and-mortar and pay with cash. Brunton and Nissenbaum have not fully addressed the last two costs. Nevertheless, I think the strict defender of online anonymity can meet these objections. Regarding the third, the future doesn’t bode well for offline shopping. Consider music and books. Intrepid shoppers can still find most of what they want in a book or record store. Soon, though, this will probably not be the case. And then there are those who, for perfectly good reasons, are sensitive about doing some of their shopping in person, perhaps because of their weight or sexual tastes. I argue that consumers should not have to pay the price of surveillance every time they want to buy that catchy new hit, that New York Times bestseller, or a sex toy.
Abstract:
Planning policies in several European countries have aimed at hindering the expansion of out-of-town shopping centers. One argument for this is concern about the increase in transport and a resulting increase in environmental externalities such as CO2 emissions. This concern is weakly founded in science, as few studies have attempted to measure the CO2 emissions of shopping trips as a function of the location of the shopping centers. In this paper we conduct a counter-factual analysis comparing downtown, edge-of-town and out-of-town shopping. In this comparison we use GPS to track 250 consumers over a time-span of two months in a Swedish region. The GPS data enter Oguchi's formula to obtain shopping-trip-specific CO2 emissions. We find that consumers' out-of-town shopping would generate a 60 per cent excess of CO2 emissions, whereas downtown and edge-of-town shopping centers are comparable.
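For orientation, the sketch below shows the generic shape of such a computation: a GPS track is reduced to a travelled distance and multiplied by an emission factor. It deliberately does not reproduce Oguchi's formula, whose coefficients are not given in the abstract; the per-kilometre factor and the coordinates are invented.

```python
# Hedged sketch of turning a GPS track into a trip-level CO2 figure using a
# plain distance-times-factor proxy (not Oguchi's formula; values invented).
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def trip_co2_kg(track, g_per_km=170.0):   # placeholder fleet-average factor
    dist = sum(haversine_km(a, b) for a, b in zip(track, track[1:]))
    return dist * g_per_km / 1000.0

track = [(60.48, 15.42), (60.49, 15.44), (60.50, 15.47)]  # invented GPS fixes
print(f"{trip_co2_kg(track):.2f} kg CO2")
```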
Abstract:
Most previous studies have focused on entire trips in a geographic region, while a few have addressed trips induced by a city landmark. This paper therefore explores trips and their CO2 emissions induced by a shopping center from a time-space perspective, and their use in relocation planning. This is conducted by means of a case study in the city of Borlänge in mid-Sweden, where trips to the city's largest shopping mall, in its center, are examined. We use GPS tracking data of car trips that end and start at the shopping center. Thereafter, (1) we analyze the traffic emission patterns from a time-space perspective, where temporal patterns reveal hourly traffic-emission dynamics and spatial patterns uncover a heterogeneous distribution of traffic emissions across spatial areas and individual street segments. Further, (2) this study reports that most of the observed trips follow an optimal route in terms of CO2 emissions. In this respect, (3) we evaluate how well placed the current shopping center is through a comparison with two competing locations. We conclude that the two suggested locations, which are close to the current shopping center, do not show a significant improvement in terms of CO2 emissions.
Abstract:
The advancement of GPS technology enables GPS devices not only to be used as orientation and navigation tools, but also to track travelled routes. GPS tracking data provide essential information for a broad range of urban planning applications such as transportation routing and planning, traffic management and environmental control. This paper describes the processing of data collected by tracking the cars of 316 volunteers over a seven-week period. Detailed trip information is extracted, and the processed data are then connected to the underlying road network by means of maps. Geographical maps are applied to check how the car movements match the road network. The maps capture the complexity of the car movements in the urban area. The results show that 90% of the trips match the road network within a given tolerance.
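A minimal sketch of the kind of map matching described, assuming a local planar coordinate system in metres: each GPS fix is snapped to its nearest road segment and counted as a match when it lies within a tolerance. The road segments, fixes and tolerance below are all invented.

```python
# Minimal point-to-network matching: snap each GPS fix to the nearest road
# segment and accept it within a tolerance (all data invented, units metres).
import numpy as np

def project_to_segment(p, a, b):
    """Closest point on segment ab to p, and the distance to it."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    t = np.clip(np.dot(p - a, b - a) / np.dot(b - a, b - a), 0.0, 1.0)
    q = a + t * (b - a)
    return q, np.linalg.norm(p - q)

def match_rate(fixes, segments, tol=10.0):
    """Share of fixes lying within 'tol' metres of some road segment."""
    hits = sum(any(project_to_segment(p, a, b)[1] <= tol for a, b in segments)
               for p in fixes)
    return hits / len(fixes)

roads = [((0, 0), (100, 0)), ((100, 0), (100, 80))]   # two road segments
gps = [(5, 2), (60, -4), (100, 40), (30, 55)]          # the last fix is off-road
print(f"{match_rate(gps, roads):.0%} of fixes match within tolerance")
```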
Abstract:
Transportation is seen as one of the major sources of CO2 pollutants nowadays. The impact of increased transport in retailing should not be underestimated. Most previous studies have focused on transportation and the underlying trips in general, while very few studies have addressed the specific effects that, for instance, intra-city shopping trips generate. Furthermore, most of the existing methods used to estimate emissions are based on macro-data designed to generate national or regional inventory projections. There is a lack of studies using micro-data-based methods that are able to distinguish between driver behaviour and the locational effects induced by shopping trips, which is an important precondition for energy-efficient urban planning. The aim of this study is to implement a micro-data method to estimate and compare the CO2 emissions induced by intra-urban car travel to retail destinations for durable goods (DG) and non-durable goods (NDG). We estimate the emissions from the aspects of travel behaviour and store location. The study is conducted by means of a case study in the city of Borlänge, where GPS tracking data on intra-urban car travel are collected from 250 households. We find that a behavioural change during a trip towards CO2-optimal car travel has the potential to decrease emissions to 36% (DG) and 25% (NDG) of the emissions induced by car-based shopping trips today. There is also potential to reduce the CO2 emissions induced by intra-urban shopping trips due to poor store location by 54%, and if the consumer selected the closest of the 8 existing stores, the CO2 emissions would be reduced by 37% of the current emissions induced by NDG shopping trips.