62 results for Deployment depth
Abstract:
Owing to continuous advances in the computational power of handheld devices such as smartphones and tablet computers, it has become possible to perform Big Data operations, including modern data mining processes, onboard these small devices. A decade of research has proved the feasibility of what has been termed Mobile Data Mining, with a focus on a single mobile device running data mining processes. It was not until 2010, however, that the authors of this book initiated the Pocket Data Mining (PDM) project, exploiting the seamless communication among handheld devices performing data analysis tasks that were infeasible until recently. PDM is the process of collaboratively extracting knowledge from distributed data streams in a mobile computing environment. This book provides the reader with an in-depth treatment of this emerging area of research. Details of the techniques used and thorough experimental studies are given. More importantly, and exclusively to this book, the authors provide a detailed practical guide to the deployment of PDM in the mobile environment. An important extension to the basic implementation of PDM, dealing with concept drift, is also reported. In the era of Big Data, potential applications of paramount importance offered by PDM in a variety of domains, including security, business and telemedicine, are discussed.
Abstract:
Recent experimental evidence suggests a finer genetic, structural and functional subdivision of the layers which form a cortical column. The classical layer II/III (LII/III) of rodent neocortex integrates ascending sensory information with contextual cortical information for behavioral read-out. We systematically investigated to what extent regular-spiking supragranular pyramidal neurons, located at different depths within the cortex, show different input-output connectivity patterns. Combining glutamate uncaging with whole-cell recordings and biocytin filling, we revealed a novel cellular organization of LII/III: (i) “Lower LII/III” pyramidal cells receive very strong excitatory input from lemniscal LIV and much weaker input from paralemniscal LVa. They project to all layers of the home column, including a feedback projection to LIV, whereas transcolumnar projections are relatively sparse. (ii) “Upper LII/III” pyramidal cells also receive their strongest input from LIV but, in addition, very strong and dense excitatory input from LVa. They project extensively to LII/III as well as LVa and LVb of their home and neighboring columns. (iii) “Middle LII/III” pyramidal cells show an intermediate connectivity phenotype that stands in many ways in between the features described for lower versus upper LII/III. “Lower LII/III” segregates lemniscal information intracolumnarly and integrates it transcolumnarly, whereas “upper LII/III” seems to integrate lemniscal with paralemniscal information. This suggests a fine-grained functional subdivision of the supragranular compartment containing multiple circuits, without any obvious cytoarchitectonic or other structural or functional correlate of a laminar border in rodent barrel cortex.
Abstract:
Long Term Evolution (LTE)-based networks lack native support for Circuit Switched (CS) services. The Evolved Packet System (EPS), which includes the Evolved UMTS Terrestrial Radio Access Network (E-UTRAN) and the Evolved Packet Core (EPC), is a purely all-IP packet system. This introduces the problem of how to provide voice call support when a user is within an LTE network and how to ensure voice service continuity when the user moves out of the LTE coverage area. Different technologies have been proposed for providing voice to LTE users and ensuring that the service continues outside LTE networks. The aim of this paper is to analyze and evaluate the overall performance of these technologies, along with Single Radio Voice Call Continuity (SRVCC) Inter-RAT handover to Universal Terrestrial Radio Access Networks/GSM-EDGE Radio Access Networks (UTRAN/GERAN). The possible solutions for providing voice calls and service continuity over LTE-based networks are Circuit Switched Fallback (CSFB), Voice over LTE via Generic Access (VoLGA), Voice over LTE (VoLTE) based on IMS/MMTel with SRVCC, and Over The Top (OTT) services such as Skype. This paper focuses mainly on the 3GPP standard solutions for implementing voice over LTE. The paper compares various aspects of these solutions and suggests a possible roadmap that mobile operators can adopt to provide seamless voice over LTE.
Abstract:
Satellite data are increasingly used to provide observation-based estimates of the effects of aerosols on climate. The Aerosol-cci project, part of the European Space Agency's Climate Change Initiative (CCI), was designed to provide essential climate variables for aerosols from satellite data. Eight algorithms, developed for the retrieval of aerosol properties using data from AATSR (4), MERIS (3) and POLDER, were evaluated to determine their suitability for climate studies. The primary result from each of these algorithms is the aerosol optical depth (AOD) at several wavelengths, together with the Ångström exponent (AE), which describes the spectral variation of the AOD for a given wavelength pair. Other aerosol parameters which can possibly be retrieved from satellite observations are not considered in this paper. The AOD and AE (AE only for Level 2) were evaluated against independent collocated observations from the ground-based AERONET sun photometer network and against “reference” satellite data provided by MODIS and MISR. Tools used for the evaluation were developed for daily products as produced by the retrieval with a spatial resolution of 10 × 10 km² (Level 2) and daily or monthly aggregates (Level 3). These tools include statistics for L2 and L3 products compared with AERONET, as well as scoring based on spatial and temporal correlations. In this paper we describe their use in a round-robin (RR) evaluation of four months of data, one month for each season in 2008. The amount of data was restricted to four months because of the large effort made to improve the algorithms, and to evaluate the improvement and current status before larger data sets are processed. Evaluation criteria are discussed. Results presented show the current status of the European aerosol algorithms in comparison to both AERONET and MODIS and MISR data. The comparison leads to a preliminary conclusion that the scores are similar, including those for the references, but the coverage of AATSR needs to be enhanced and further improvements are possible for most algorithms. None of the algorithms, including the references, outperforms all others everywhere. AATSR data can be used for the retrieval of AOD and AE over land and ocean. PARASOL and one of the MERIS algorithms have been evaluated over ocean only, and both provide good results.
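The AE mentioned above is a simple two-point quantity: the negative slope of log(AOD) against log(wavelength) for a wavelength pair. A minimal illustrative sketch (the function name and example values are ours, not from the project):

```python
import math

def angstrom_exponent(aod_1, aod_2, wavelength_1, wavelength_2):
    """Ångström exponent for a wavelength pair: the negative slope of
    log(AOD) versus log(wavelength). Wavelengths in consistent units."""
    return -math.log(aod_1 / aod_2) / math.log(wavelength_1 / wavelength_2)

# Example: AOD of 0.20 at 440 nm and 0.10 at 870 nm gives AE ≈ 1.02.
ae = angstrom_exponent(0.20, 0.10, 440.0, 870.0)
```

Higher AE values generally indicate smaller (fine-mode) particles, which is why AE is reported alongside AOD for climate studies.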
Abstract:
A method has been developed to estimate aerosol optical depth (AOD) over land surfaces using high-spatial-resolution, hyperspectral, multiangle Compact High Resolution Imaging Spectrometer (CHRIS)/Project for On-Board Autonomy (PROBA) images. The CHRIS instrument is mounted aboard the PROBA satellite and provides up to 62 bands. The PROBA satellite allows pointing to obtain imagery from five different view angles within a short time interval. The method uses inversion of a coupled surface/atmosphere radiative transfer model and includes a general physical model of angular surface reflectance. An iterative process is used to determine the optimum AOD value, providing the best fit of the corrected reflectance values for a number of view angles and wavelengths with those predicted by the physical model. This method has previously been demonstrated on data from the Advanced Along-Track Scanning Radiometer and is extended here to the spectral and angular sampling of CHRIS/PROBA. The AOD values obtained from these observations are validated using ground-based sun-photometer measurements. Results from 22 image sets show an RMS error of 0.11 in AOD at 550 nm, which is reduced to 0.06 after an automatic screening procedure.
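The iterative inversion described above can be caricatured as a one-dimensional search: for each candidate AOD, atmospherically correct the multi-angle reflectances and score how well a smooth angular surface model fits them; the candidate giving the best fit is the estimate. A toy sketch only; the exponential path-length correction and the polynomial surface model below are placeholders, not the paper's coupled radiative transfer model:

```python
import numpy as np

def toy_correction(toa_refl, view_angles_deg, aod):
    # Placeholder correction: undo a simple path-length-dependent
    # extinction term (NOT the paper's coupled RT model).
    airmass = 1.0 / np.cos(np.radians(view_angles_deg))
    return toa_refl * np.exp(aod * airmass)

def surface_misfit(corrected, view_angles_deg):
    # Fit a low-order polynomial as a stand-in for the angular
    # surface-reflectance model; return the RMS residual.
    coeffs = np.polyfit(view_angles_deg, corrected, 2)
    resid = corrected - np.polyval(coeffs, view_angles_deg)
    return float(np.sqrt(np.mean(resid ** 2)))

def retrieve_aod(toa_refl, view_angles_deg, candidates=None):
    # One-dimensional search: the AOD whose corrected reflectances
    # best match the smooth surface model is the estimate.
    if candidates is None:
        candidates = np.linspace(0.0, 1.0, 101)
    scores = [surface_misfit(toy_correction(toa_refl, view_angles_deg, a),
                             view_angles_deg)
              for a in candidates]
    return float(candidates[int(np.argmin(scores))])
```

The real method fits across several wavelengths simultaneously and uses five CHRIS view angles, but the "guess AOD, correct, score the surface fit" loop has the same shape.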
Abstract:
Anticoagulant rodenticides have been known for over half a century as an effective and safe method of rodent control. However, anticoagulant resistance, discovered in 1958, has posed a very important problem for their long-term use. Laboratory tests provide the main method for identifying the different types of anticoagulant resistance, quantifying the magnitude of their effect and helping us to choose the best pest control strategy. The most important tests are the lethal feeding period (LFP) and blood clotting response (BCR) tests. These tests can now be used to quantify the likely effect of the resistance on treatment outcome by providing an estimate of the ‘resistance factor’. In 2004 the gene responsible for anticoagulant resistance (VKORC1) was identified and sequenced. As a result, a new molecular resistance-testing methodology has been developed, and a number of resistance mutations have been identified, particularly in Norway rats and house mice. Three mutations of the VKORC1 gene in Norway rats have been identified to date that confer a degree of resistance to bromadiolone and difenacoum sufficient to affect treatment outcome in the field.
Abstract:
The most popular endgame tables (EGTs) documenting ‘DTM’, Depth to Mate, in chess endgames are those of Eugene Nalimov, but these do not recognise the FIDE 50-move rule ‘50mr’. This paper marks the creation by the first author of EGTs for sub-6-man (s6m) chess and beyond which give DTM as affected by the ply count ‘pc’. The results are put into the context of previous work recognising the 50mr and are compared with the original unmoderated DTM results. The work is also notable for being the first EGT-generation work to use the functional programming language Haskell.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work, a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on each core's location within the system. Heterogeneity further increases with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work, loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
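The final step, predicting a deployment's cost from measured benchmark points with interpolation in between, can be sketched as follows. The benchmark numbers and the additive compute-plus-communication split are illustrative assumptions, not the paper's measured data:

```python
import numpy as np

# Hypothetical benchmark measurements (not from the paper): time per
# iteration for the loop-based array updates at several local grid sizes,
# and time per halo exchange at several message sizes.
compute_points = np.array([64.0, 128.0, 256.0, 512.0])   # local grid edge (cells)
compute_times = np.array([0.4, 1.7, 7.1, 29.0])          # ms per iteration
halo_points = np.array([256.0, 512.0, 1024.0, 2048.0])   # halo message size (bytes)
halo_times = np.array([0.05, 0.07, 0.12, 0.21])          # ms per exchange

def predict_time(local_edge, halo_bytes, exchanges_per_iter=4):
    """Predict per-iteration time for a deployment scenario by linearly
    interpolating between the measured benchmark points, then summing the
    compute and halo-exchange contributions."""
    t_compute = np.interp(local_edge, compute_points, compute_times)
    t_halo = np.interp(halo_bytes, halo_points, halo_times)
    return float(t_compute + exchanges_per_iter * t_halo)
```

Varying `local_edge` and `halo_bytes` then mimics exploring domain decompositions: a decomposition fixes the local problem size and halo size per task, and the model converts those into a predicted cost without rerunning the application.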
Abstract:
The Clouds, Aerosol, and Precipitation in the Marine Boundary Layer (CAP-MBL) deployment at Graciosa Island in the Azores generated a 21-month (April 2009–December 2010) comprehensive dataset documenting clouds, aerosols, and precipitation using the Atmospheric Radiation Measurement Program (ARM) Mobile Facility (AMF). The scientific aim of the deployment is to gain improved understanding of the interactions of clouds, aerosols, and precipitation in the marine boundary layer. Graciosa Island straddles the boundary between the subtropics and midlatitudes in the northeast Atlantic Ocean and consequently experiences a great diversity of meteorological and cloudiness conditions. Low clouds are the dominant cloud type, with stratocumulus and cumulus occurring regularly. Approximately half of all clouds contained precipitation detectable as radar echoes below the cloud base. Radar and satellite observations show that clouds with tops from 1 to 11 km contribute more or less equally to surface-measured precipitation at Graciosa. A wide range of aerosol conditions was sampled during the deployment, consistent with the diversity of sources indicated by back-trajectory analysis. Preliminary findings suggest important two-way interactions between aerosols and clouds at Graciosa, with aerosols affecting light precipitation and cloud radiative properties while being controlled in part by precipitation scavenging. The data from Graciosa are being compared with short-range forecasts made with a variety of models. A pilot analysis with two climate and two weather forecast models shows that they reproduce the observed time-varying vertical structure of lower-tropospheric cloud fairly well, but the cloud-nucleating aerosol concentrations less well. The Graciosa site has been chosen to be a permanent fixed ARM site, which became operational in October 2013.
Abstract:
The long duration of the 2010 Eyjafjallajökull eruption provided a unique opportunity to measure a widely dispersed volcanic ash cloud. Layers of volcanic ash were observed by the European Aerosol Research Lidar Network with a mean depth of 1.2 km and standard deviation of 0.9 km. In this paper we evaluate the ability of the Met Office's Numerical Atmospheric-dispersion Modelling Environment (NAME) to simulate the observed ash layers and examine the processes controlling their depth. NAME simulates distal ash layer depths exceptionally well with a mean depth of 1.2 km and standard deviation of 0.7 km. The dominant process determining the depth of ash layers over Europe is the balance between the vertical wind shear (which acts to reduce the depth of the ash layers) and vertical turbulent mixing (which acts to deepen the layers). Interestingly, differential sedimentation of ash particles and the volcano vertical emission profile play relatively minor roles.
Abstract:
Uncertainty of Arctic seasonal to interannual predictions arising from model errors and initial state uncertainty has been widely discussed in the literature, whereas the irreducible forecast uncertainty (IFU) arising from the chaoticity of the climate system has received less attention. However, IFU provides important insights into the mechanisms through which predictability is lost, and hence can inform prioritization of model development and of the deployment of observations. Here, we characterize how internal oceanic and surface atmospheric heat fluxes contribute to the IFU of Arctic sea ice and upper ocean heat content in an Earth system model by analyzing a set of idealized ensemble prediction experiments. We find that atmospheric and oceanic heat fluxes are often equally important for driving unpredictable Arctic-wide changes in sea ice and surface water temperatures, and hence contribute equally to IFU. Atmospheric surface heat flux tends to dominate Arctic-wide changes for lead times of up to a year, whereas oceanic heat flux tends to dominate regionally and on interannual time scales. There is in general a strong negative covariance between surface heat flux and ocean vertical heat flux at depth, and anomalies of lateral ocean heat transport are wind-driven, which suggests that the unpredictable oceanic heat flux variability is mainly forced by the atmosphere. These results are qualitatively robust across different initial states, but substantial variations in the amplitude of IFU exist. We conclude that both atmospheric variability and the initial state of the upper ocean are key ingredients for predictions of Arctic surface climate on seasonal to interannual time scales.
Abstract:
Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring, related to the re-stratification process of relatively deep mixed layers. The vertical resolution of profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m with a vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m−3 is used for the MLD estimation. Using the larger criterion (0.125 kg m−3) generally reduces the underestimation. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to the comparatively higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates the great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.
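Applied to a single profile, the density-threshold criterion above reduces to finding the first depth at which potential density exceeds its 10-m value by 0.03 kg m−3. A minimal sketch (the function name and the interpolation between levels are our choices; the profile used for illustration is invented):

```python
import numpy as np

def mixed_layer_depth(depth, sigma, threshold=0.03, ref_depth=10.0):
    """MLD by the density-threshold criterion: the depth at which potential
    density (sigma, kg m^-3) first exceeds its value at ref_depth (m) by
    `threshold`, with linear interpolation between discrete levels."""
    sigma_ref = np.interp(ref_depth, depth, sigma)
    excess = sigma - (sigma_ref + threshold)
    for i in range(1, len(depth)):
        if excess[i] >= 0.0 and excess[i - 1] < 0.0:
            # Linearly interpolate the depth of the zero crossing.
            frac = -excess[i - 1] / (excess[i] - excess[i - 1])
            return float(depth[i - 1] + frac * (depth[i] - depth[i - 1]))
    return float(depth[-1])  # mixed to the bottom of the profile
```

The abstract's resolution sensitivity follows directly from this sketch: with 25 m or 50 m level spacing, the interpolated crossing is biased shallow relative to the crossing found on a finely resolved profile, and a larger threshold (0.125 kg m−3) moves the crossing into a more strongly stratified part of the profile where the spacing matters less.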
Abstract:
The importance of managing land to optimise carbon sequestration for climate change mitigation is widely recognised, with grasslands being identified as having the potential to sequester additional carbon. However, most soil carbon inventories only consider surface soils, and most large-scale surveys group ecosystems into broad habitats without considering management intensity. Consequently, little is known about the quantity of deep soil carbon and its sensitivity to management. From a nationwide survey of grassland soils to 1 m depth, we show that carbon in grassland soils is vulnerable to management and that these management effects can be detected to considerable depth down the soil profile, albeit with decreasing significance with depth. Carbon concentrations in soil decreased as management intensity increased, but the greatest soil carbon stocks (accounting for bulk density differences) were at intermediate levels of management. Our study also highlights the considerable amount of carbon in sub-surface soil below 30 cm, which is missed by standard carbon inventories. We estimate grassland soil carbon in Great Britain to be 2097 Tg C to a depth of 1 m, with ~60% of this carbon being below 30 cm. Total stocks of soil carbon (t ha−1) to 1 m depth were 10.7% greater at intermediate relative to intensive management, which equates to 10.1 t ha−1 in surface soils (0–30 cm) and 13.7 t ha−1 in soils from 30–100 cm depth. Our findings highlight the existence of substantial carbon stocks at depth in grassland soils that are sensitive to management. This is of high relevance globally, given the extent of land cover and the large stocks of carbon held in temperate managed grasslands. Our findings have implications for the future management of grasslands for carbon storage and climate mitigation, and for global carbon models, which do not currently account for changes in soil carbon at depth with management.
Abstract:
The role of the local atmospheric forcing on the ocean mixed layer depth (MLD) over the global oceans is studied using ocean reanalysis data products and a single-column ocean model coupled to an atmospheric general circulation model. The focus of this study is on how the annual mean and the seasonal cycle of the MLD relate to various forcing characteristics in different parts of the world's oceans, and how anomalous variations in the monthly mean MLD relate to anomalous atmospheric forcings. By analysing both ocean reanalysis data and the single-column ocean model, regions with different dominant forcings and different mean and variability characteristics of the MLD can be identified. Many of the global oceans' MLD characteristics appear to be directly linked to different atmospheric forcing characteristics at different locations. Here, heating and wind stress are identified as the main drivers; in some, mostly coastal, regions the atmospheric salinity forcing also contributes. The annual mean MLD is more closely related to the annual mean wind stress, and the MLD seasonality is more closely related to the seasonality in heating. The single-column ocean model, however, also shows that the MLD characteristics over most global ocean regions, and in particular the tropics and subtropics, cannot be maintained by local atmospheric forcings alone, but are also a result of ocean dynamics that are not simulated in a single-column ocean model. Thus, lateral ocean dynamics are essential for correctly simulating the observed MLD.