941 results for lifetime-based garbage collection


Relevance:

30.00%

Publisher:

Abstract:

With hundreds of millions of users reporting locations and embracing mobile technologies, Location Based Services (LBSs) are raising new challenges. In this dissertation, we address three emerging problems in location services, where geolocation data plays a central role.

First, to handle the unprecedented growth of generated geolocation data, existing location services rely on geospatial database systems. However, their inability to leverage combined geographical and textual information in analytical queries (e.g., spatial similarity joins) remains an open problem. To address this, we introduce SpsJoin, a framework for computing spatial set-similarity joins. SpsJoin handles combined similarity queries that involve textual and spatial constraints simultaneously. LBSs use this system to tackle different types of problems, such as deduplication, geolocation enhancement, and record linkage. We define the spatial set-similarity join problem in the general case and propose an algorithm for its efficient computation. Our solution uses parallel computing with MapReduce to handle scalability issues in large geospatial databases.

Second, applications that use geolocation data are seldom concerned with ensuring the privacy of participating users. To motivate participation and address privacy concerns, we propose iSafe, a privacy-preserving algorithm for computing safety snapshots of co-located mobile devices as well as geosocial network users. iSafe combines geolocation data extracted from crime datasets and geosocial networks such as Yelp. To enhance iSafe's ability to compute safety recommendations even when crime information is incomplete or sparse, we need to identify relationships between Yelp venues and the crime indices at their locations. To achieve this, we use SpsJoin on two datasets (Yelp venues and geolocated businesses) to find venues that have not been reviewed and to compute the crime indices of their locations. Our results show a statistically significant dependence between location crime indices and Yelp features.

Third, review-centered LBSs (e.g., Yelp) are increasingly becoming targets of malicious campaigns that aim to bias the public image of the represented businesses. Although Yelp actively attempts to detect and filter fraudulent reviews, our experiments showed that Yelp is still vulnerable. Fraudulent LBS information also impacts the ability of iSafe to provide correct safety values. We take steps toward addressing this problem by proposing SpiDeR, an algorithm that takes advantage of the richness of information available in Yelp to detect abnormal review patterns. We also propose a fake venue detection solution that applies SpsJoin to Yelp and U.S. housing datasets. We validate the proposed solutions using ground truth data extracted from our experiments and reviews filtered by Yelp.
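The SpsJoin algorithm itself is not spelled out in this abstract. A minimal, naive sketch of the underlying operation, a spatial set-similarity join that pairs records satisfying both a distance threshold and a textual (Jaccard) similarity threshold, might look like this; the record layout, thresholds, and nested-loop strategy are all illustrative assumptions, not the dissertation's method:

```python
from math import hypot

def jaccard(a, b):
    """Jaccard similarity of two token sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def spatial_set_similarity_join(left, right, max_dist, min_sim):
    """Naive spatial set-similarity join: pair records whose locations are
    within max_dist AND whose token sets have Jaccard >= min_sim.
    Each record is (id, (x, y), tokens)."""
    matches = []
    for lid, (lx, ly), ltok in left:
        for rid, (rx, ry), rtok in right:
            if hypot(lx - rx, ly - ry) <= max_dist and jaccard(ltok, rtok) >= min_sim:
                matches.append((lid, rid))
    return matches
```

A scalable MapReduce version would replace the nested loop with partitioning (e.g., grouping candidates by spatial grid cell or by shared tokens) so that only nearby, plausibly similar pairs are compared.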


A prototype three-dimensional (3D) anode based on multiwall carbon nanotubes (MWCNTs), intended for Li-ion batteries (LIBs) with potential use in electric vehicles (EVs), was investigated. The unique 3D design of the anode allowed a much higher areal mass density of MWCNTs as active material, resulting in greater Li+ ion uptake compared to a conventional 2D counterpart. Furthermore, a 3D amorphous Si/MWCNT hybrid structure offered an enhanced electrochemical response (specific capacity 549 mAh g⁻¹). An anode stack was also fabricated to further increase the areal and volumetric mass density of MWCNTs. An areal mass density of 34.9 mg/cm2 was attained for the anode stack, about 13.4 times the single-layer value of 2.6 mg/cm2. The binder-assisted and hot-pressed anode stack yielded average reversible, stable gravimetric and volumetric specific capacities of 213 mAh g⁻¹ and 265 mAh/cm3, respectively (at 0.5C). Moreover, a large-scale patterned, flexible 3D MWCNT-graphene-polyethylene terephthalate (PET) anode structure was prepared. It delivered a reversible specific capacity of 153 mAh g⁻¹ at 0.17C and cycling stability of 130 mAh g⁻¹ over 50 cycles at 1.7C.
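As a quick consistency check on the reported areal mass densities (a back-of-the-envelope calculation, not from the dissertation): the stack value is roughly 13.4 times the single-layer value, i.e., about a 1,242% increase.

```python
single_layer = 2.6  # mg/cm^2, single-layer anode (reported)
stack = 34.9        # mg/cm^2, binder-assisted anode stack (reported)

ratio = stack / single_layer        # ~13.4x the single-layer value
percent_higher = (ratio - 1) * 100  # ~1,242% increase over the single layer
```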


The brain is one of the safe sanctuaries for HIV and, in turn, continuously supplies active virus to the periphery. Additionally, HIV infection of the brain results in several mild-to-severe neuro-immunological complications termed neuroAIDS. One-tenth of the HIV-infected population is addicted to recreational drugs such as opiates, alcohol, nicotine, and marijuana, which share common target areas in the brain with HIV. Notably, the intensity of neuropathogenesis is remarkably enhanced by exposure to recreational drugs during HIV infection. Current treatments to alleviate either the individual or the synergistic effects of drugs of abuse and HIV on neuronal modulation are less effective at the CNS level, largely because therapeutic molecules cannot cross the blood-brain barrier (BBB). Despite exciting advances in nanotechnology-based drug delivery, existing nanovehicles such as dendrimers, polymers, and micelles suffer from inadequate BBB penetration before the drugs are engulfed by reticuloendothelial system cells, as well as from uncertainty about if and when the nanocarrier reaches the brain. Therefore, to develop a fast, target-specific, safe, and effective approach for brain delivery of anti-addiction, anti-viral, and neuroprotective drugs, we exploited the potential of magnetic nanoparticles (MNPs), which in recent years have attracted significant attention in biomedical applications. We hypothesize that, under the influence of an external (non-invasive) magnetic force, MNPs can deliver these drugs across the BBB in a highly effective manner. Accordingly, in this dissertation, I delineate the pharmacokinetics and dynamics of MNP-bound anti-opioid, anti-HIV, and neuroprotective drugs for delivery to the brain. I have developed a novel liposome-based magnetized nanovehicle which, under the influence of external magnetic forces, can transmigrate and effectively deliver drugs across the BBB without compromising its integrity.
It is expected that the developed nanoformulations may be of high therapeutic significance for neuroAIDS and for drug addiction as well.


The presence of heavy metals, organic contaminants, and natural toxins in natural water bodies poses a serious threat to the environment and the health of living organisms. Therefore, there is a critical need to identify sustainable and environmentally friendly water treatment processes. In this dissertation, I focus on fundamental studies of advanced oxidation processes and magnetic nanomaterials as promising new technologies for water treatment. Advanced oxidation processes employ reactive oxygen species (ROS), which can lead to the mineralization of a number of pollutants and toxins. The rates of formation, steady-state concentrations, and kinetic parameters of hydroxyl radical and singlet oxygen produced by various TiO2 photocatalysts under UV or visible irradiation were measured using selective chemical probes. Hydroxyl radical is the dominant ROS, and its generation depends on experimental conditions. The optimal conditions for generation of hydroxyl radical by TiO2-coated glass microspheres were determined by response surface methodology and applied to the degradation of dimethyl phthalate. Singlet oxygen (1O2) also plays an important role in advanced oxidation processes, so the degradation of microcystin-LR (MC-LR) by rose bengal, a 1O2 sensitizer, was studied. The measured bimolecular reaction rate constant between MC-LR and 1O2 is ~10⁶ M⁻¹ s⁻¹, based on competition kinetics with furfuryl alcohol. A typical adsorbent requires separation after treatment, whereas magnetic iron oxides can be easily removed by a magnetic field. Maghemite and humic acid-coated magnetite (HA-Fe3O4) were synthesized, characterized, and applied to chromium(VI) removal. The adsorption of chromium(VI) by maghemite and HA-Fe3O4 follows pseudo-second-order kinetics. The adsorption of chromium(VI) by maghemite is accurately modeled using adsorption isotherms, and both solution pH and the presence of humic acid influence adsorption.
Humic acid-coated magnetite can adsorb chromium(VI) and reduce it to non-toxic chromium(III), and the reaction is not highly dependent on solution pH. The functional groups associated with humic acid act as ligands that lead to Cr(III) complexes via a coupled reduction-complexation mechanism. Extended X-ray absorption fine structure spectroscopy demonstrates that the Cr(III) in the Cr-loaded HA-Fe3O4 materials has six neighboring oxygen atoms in an octahedral geometry with an average bond length of 1.98 Å.
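The pseudo-second-order fit mentioned above is commonly done on the linearized form t/qt = 1/(k2·qe²) + t/qe, where qe is the equilibrium uptake and k2 the rate constant. A small sketch (not the dissertation's code; the data in the test are synthetic):

```python
def fit_pseudo_second_order(t, qt):
    """Least-squares fit of the linearized pseudo-second-order model
    t/qt = 1/(k2*qe**2) + t/qe on the points (t, t/qt); returns (qe, k2)."""
    y = [ti / qi for ti, qi in zip(t, qt)]
    n = len(t)
    tm, ym = sum(t) / n, sum(y) / n
    slope = sum((ti - tm) * (yi - ym) for ti, yi in zip(t, y)) \
            / sum((ti - tm) ** 2 for ti in t)
    intercept = ym - slope * tm
    qe = 1 / slope                  # equilibrium uptake: slope = 1/qe
    k2 = slope ** 2 / intercept     # intercept = 1/(k2*qe**2)
    return qe, k2
```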


A major portion of hurricane-induced economic loss originates from damage to building structures. Such damage is typically grouped into three main categories: exterior, interior, and contents damage. Although the latter two types, in most cases, cause more than 50% of the total loss, little has been done to investigate the physical damage process and unveil the interdependence of interior damage parameters. Building interior and contents damage is mainly due to wind-driven rain (WDR) intrusion through building envelope defects, breaches, and other functional openings. The limited research and the resulting knowledge gaps are in large part due to the complexity of damage phenomena during hurricanes and the lack of established measurement methodologies for quantifying rainwater intrusion. This dissertation focuses on devising methodologies for large-scale experimental simulation of tropical cyclone WDR and for measurement of rainwater intrusion, to acquire benchmark test-based data for the development of a hurricane-induced building interior and contents damage model. Target WDR parameters derived from tropical cyclone rainfall data were used to simulate WDR characteristics at the Wall of Wind (WOW) facility. The proposed WDR simulation methodology presents detailed procedures for selecting the type and number of nozzles, formulated based on a tropical cyclone WDR study. The simulated WDR was then used to experimentally investigate the mechanisms of rainwater deposition and intrusion in buildings. A test-based dataset of two rainwater intrusion parameters that quantify the distribution of directly impinging raindrops and surface-runoff rainwater over the building surface, the rain admittance factor (RAF) and the surface runoff coefficient (SRC), respectively, was developed using common shapes of low-rise buildings.
The dataset was applied to a newly formulated WDR estimation model to predict the volume of rainwater ingress through envelope openings such as wall and roof deck breaches and window sill cracks. Validation of the new model against experimental data indicated reasonable estimation of rainwater ingress through envelope defects and breaches during tropical cyclones. The WDR estimation model and the experimental dataset of WDR parameters developed in this dissertation can be used to enhance the prediction capabilities of existing interior damage models such as the Florida Public Hurricane Loss Model (FPHLM).
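The exact form of the WDR estimation model is not given in this abstract. A hypothetical sketch of how RAF and SRC might combine to estimate ingress through a single opening (the additive form, variable names, and units are assumptions for illustration only):

```python
def rainwater_ingress_liters(wdr_rate, raf, src, opening_area, catchment_area, hours):
    """Hypothetical ingress estimate: directly impinging rain admitted at the
    opening plus surface runoff collected from the wall area draining to it.
    wdr_rate in mm/h; areas in m^2; 1 mm of water over 1 m^2 = 1 L."""
    direct = raf * wdr_rate * opening_area * hours    # directly impinging raindrops
    runoff = src * wdr_rate * catchment_area * hours  # surface runoff reaching the opening
    return direct + runoff
```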


Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect of the design of practical systems, needed to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of this study, a proper model including all details of the components was required. The advances in EMC modeling were therefore reviewed, classifying analytical and numerical models. The selected approach was finite element (FE) modeling coupled with the distributed network method, used to model the converter's components and obtain the frequency behavioral model of the converter. The method can reveal the behavior of parasitic elements and higher resonances, which have critical impacts in studying EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, innovations in equivalent source modeling were introduced that decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A GA-based particle swarm optimization (PSO) method was used as the optimization process. Superposition and suppression of the fields when coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. Identification was implemented using an artificial neural network (ANN) for seventy different faulty cases.
The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
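The abstract names a GA-based PSO optimizer but gives no details of the hybrid. The PSO core alone (without the GA component, and with standard inertia/acceleration parameters chosen here for illustration) can be sketched as:

```python
import random

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm minimizer of f over box constraints `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    gi = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[gi][:], pbest_val[gi]  # swarm's best so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In a GA-hybrid variant, genetic operators (selection, crossover, mutation) would periodically recombine particles to keep the swarm diverse.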


Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from the Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks).
In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of them during execution.
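The learn-then-validate pattern behind KQguard can be illustrated with a tiny allowlist sketch. This is a schematic analogy in user-space Python, not the actual kernel implementation; the request fields (callback name, owning module) are assumptions:

```python
class KQGuardSketch:
    """Allowlist validation of kernel-queue requests: callback/module pairs
    observed during a training phase are accepted; anything else is rejected."""

    def __init__(self):
        self._allowed = set()

    def learn(self, callback, module):
        # Training phase: record a legitimate (callback, owning-module) pair.
        self._allowed.add((callback, module))

    def validate(self, callback, module):
        # Runtime: reject any KQ request not seen during training.
        return (callback, module) in self._allowed
```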


Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous waste landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires exploring a performance-based methodology for Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of the appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, the Davie Landfill was identified as the case study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data on leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater with reference to maximum contaminant levels (MCLs). In addition, the projected gas quantity was estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment.
These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions proposed. Based on the results of the PCC performance evaluation integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above MCLs and surveying of cap integrity should continue. The parameters that cause longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).


This dataset contains the collection of available published paired Uk'37 and TEX86 records spanning multi-millennial to multi-million-year time scales, as well as a collection of Mg/Ca-derived temperatures measured in parallel on surface- and subsurface-dwelling foraminifera, both used in the analyses of Ho and Laepple (Nature Geoscience, 2016). As the signal-to-noise ratios of proxy-derived Holocene temperatures are relatively low, we selected records that contain at least the last deglaciation (oldest sample >18 kyr BP).


This mixed-method case study explored the capacity challenges of neighbourhood-based community centres in the areas of governance and leadership, program delivery, financial management, and human resources. The study examined three community centres with multiple mandates (i.e., providing programs and services to individuals from pre-school to seniors in the social, educational, recreational, and health areas) and used three phases of data collection: 1) surveys with board members; 2) focus groups with all boards and staff; and 3) a document review examining pertinent organizational policies and procedures. Questions were aimed at understanding some of the challenges faced by staff and administrators of neighbourhood-based community centres, as there is a gap in the research in this particular area. The findings identified a number of related challenges facing non-profit organizations, specifically in the areas of funding and staffing, and showed how these challenges impact both day-to-day operations and longer-term sustainability. More research is needed with non-profit organizations that have these broader mandates and diverse operational challenges, and hence greater capacity-building challenges.


The MAREDAT atlas covers 11 types of plankton, ranging in size from bacteria to jellyfish. Together, these plankton groups determine the health and productivity of the global ocean and play a vital role in the global carbon cycle. Working within a uniform and consistent spatial and depth grid of the global ocean, the researchers compiled tens of thousands of data points to identify regions of plankton abundance and scarcity, as well as areas of data abundance and scarcity. At many of the grid points, the MAREDAT team accomplished the difficult conversion from abundance (numbers of organisms) to biomass (carbon mass of organisms). The MAREDAT atlas provides an unprecedented global data set for ecological and biogeochemical analysis and modeling, as well as a clear mandate for compiling additional existing data and for focusing future data-gathering efforts on key groups in key areas of the ocean. The present collection contains the original data sets used to compile the global distributions of diazotroph abundance, biomass, and nitrogen fixation rates.
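The abundance-to-biomass conversion mentioned above amounts, in its simplest form, to multiplying cell counts by an assumed per-cell carbon content. A one-line sketch (the per-cell carbon value used in the test is illustrative, not from MAREDAT, and real conversions are often group- and size-dependent):

```python
def abundance_to_biomass(cells_per_liter, carbon_pg_per_cell):
    """Convert abundance (cells/L) to carbon biomass (ug C/L); 1 ug = 1e6 pg."""
    return cells_per_liter * carbon_pg_per_cell / 1e6
```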


This data set comprises time series of aboveground community plant biomass (sown plant community, weed plant community, dead plant material, and unidentified plant material; all measured as dry weight) and species-specific biomass of the sown species from several experiments at the field site of a large grassland biodiversity experiment (the Jena Experiment; see further details below). Aboveground community biomass was normally harvested twice a year, just prior to mowing during peak standing biomass (generally in May and August; in 2002 only once, in September), on all experimental plots of the Jena Experiment. This was done by clipping the vegetation at 3 cm above ground in up to four rectangles of 0.2 x 0.5 m per large plot. The locations of these rectangles were assigned by random selection of new coordinates every year within the core area of the plots; the positions of the rectangles within plots were identical for all plots. The harvested biomass was sorted into categories: individual species for the sown plant species, weed plant species (species not sown at the particular plot), detached dead plant material (i.e., dead plant material in the data file), and remaining plant material that could not be assigned to any category (i.e., unidentified plant material in the data file). All biomass was dried to constant weight (70°C, >= 48 h) and weighed. Sown plant community biomass was calculated as the sum of the biomass of the individual sown species. The data for individual samples and the mean over samples are given for the biomass measures at the community level. Overall, analyses of the community biomass data have identified species richness as well as functional group composition as important drivers of a positive biodiversity-productivity relationship. The following datasets are contained in this collection:
1. Plant biomass from the Main Experiment: 82 grassland plots of 20 x 20 m were established from a pool of 60 species belonging to four functional groups (grasses, legumes, tall herbs, and small herbs). In May 2002, varying numbers of plant species from this pool were sown into the plots to create gradients of plant species richness (1, 2, 4, 8, 16, and 60 species) and functional richness (1, 2, 3, and 4 functional groups).
2. Plant biomass from the Dominance Experiment: 206 grassland plots of 3.5 x 3.5 m were established from a pool of 9 species that can be dominant in semi-natural grassland communities of the study region. In May 2002, varying numbers of plant species from this pool were sown into the plots to create a gradient of plant species richness (1, 2, 3, 4, 6, and 9 species).
3. Plant biomass from the monoculture plots: in the monoculture plots the sown plant community contains only a single species per plot, a different one for each plot; which species was sown in which plot is stated in the plot information table for monocultures (see further details below). The monoculture plots of 3.5 x 3.5 m were established for all 60 plant species of the Jena Experiment species pool, with two replicates per species, in May 2002, like the other experiments. All plots were maintained by bi-annual weeding and mowing.
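The sown-community-biomass calculation described above (sum over the sown species only, excluding weed, dead, and unidentified material) can be sketched as follows; the sample layout and species names are illustrative:

```python
def sown_community_biomass(sample, sown_species):
    """Sum dry-weight biomass over the sown species only; weed, dead, and
    unidentified categories in the sorted sample are excluded."""
    return sum(mass for species, mass in sample.items() if species in sown_species)
```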


Government communication is an important management tool during a public health crisis, but understanding its impact is difficult. Strategies may be adjusted in reaction to developments on the ground and it is challenging to evaluate the impact of communication separately from other crisis management activities. Agent-based modeling is a well-established research tool in social science to respond to similar challenges. However, there have been few such models in public health. We use the example of the TELL ME agent-based model to consider ways in which a non-predictive policy model can assist policy makers. This model concerns individuals’ protective behaviors in response to an epidemic, and the communication that influences such behavior. Drawing on findings from stakeholder workshops and the results of the model itself, we suggest such a model can be useful: (i) as a teaching tool, (ii) to test theory, and (iii) to inform data collection. We also plot a path for development of similar models that could assist with communication planning for epidemics.
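To give a flavor of what an agent-based model of protective behavior looks like, here is a toy sketch. It is not the TELL ME model; the behavioral rule, parameters, and the use of prevalence as a proxy for communicated risk are all invented for illustration:

```python
import random

def toy_behavior_abm(n=500, beta=0.3, comm=0.5, steps=30, seed=1):
    """Toy epidemic/behavior model: each step, agents may adopt protective
    behavior in proportion to communicated risk (here, current prevalence),
    and protection halves their per-step infection probability."""
    rng = random.Random(seed)
    infected = [False] * n
    infected[0] = True          # seed case
    protected = [False] * n
    for _ in range(steps):
        prevalence = sum(infected) / n
        for i in range(n):
            if not protected[i] and rng.random() < comm * prevalence:
                protected[i] = True            # responds to risk communication
            if not infected[i]:
                p = beta * prevalence * (0.5 if protected[i] else 1.0)
                if rng.random() < p:
                    infected[i] = True
    return sum(infected), sum(protected)
```

Even a toy like this illustrates the paper's point: the model is a tool for reasoning about how communication parameters interact with behavior, not a predictor of outcomes.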


BACKGROUND: The neonatal and pediatric antimicrobial point prevalence survey (PPS) of the Antibiotic Resistance and Prescribing in European Children project (http://www.arpecproject.eu/) aims to standardize a method for surveillance of antimicrobial use in children and neonates admitted to the hospital within Europe. This article describes the audit criteria used and reports overall country-specific proportions of antimicrobial use. An analytical review of methodologies for measuring antimicrobial use is also presented.

METHODS: A 1-day PPS on antimicrobial use in hospitalized children was organized in September 2011, using a previously validated and standardized method. The survey included all inpatient pediatric and neonatal beds and identified all children receiving an antimicrobial treatment on the day of survey. Mandatory data were age, gender, (birth) weight, underlying diagnosis, antimicrobial agent, dose and indication for treatment. Data were entered through a web-based system for data-entry and reporting, based on the WebPPS program developed for the European Surveillance of Antimicrobial Consumption project.

RESULTS: There were 2760 and 1565 pediatric versus 1154 and 589 neonatal inpatients reported among 50 European (n = 14 countries) and 23 non-European hospitals (n = 9 countries), respectively. Overall, pediatric and neonatal antibiotic use was significantly higher in non-European hospitals (43.8%; 95% confidence interval [CI]: 41.3-46.3% and 39.4%; 95% CI: 35.5-43.4%) than in European hospitals (35.4%; 95% CI: 33.6-37.2% and 21.8%; 95% CI: 19.4-24.2%). Proportions of antibiotic use were highest in hematology/oncology wards (61.3%; 95% CI: 56.2-66.4%) and pediatric intensive care units (55.8%; 95% CI: 50.3-61.3%).

CONCLUSIONS: An Antibiotic Resistance and Prescribing in European Children standardized web-based method for a 1-day PPS was successfully developed and conducted in 73 hospitals worldwide. It offers a simple, feasible and sustainable way of data collection that can be used globally.
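The intervals reported above are consistent with the usual normal-approximation confidence interval for a proportion. The European pediatric figure can be checked directly (assuming the 35.4% is computed over the 2760 European pediatric inpatients reported):

```python
from math import sqrt

def proportion_ci(p_hat, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    se = sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# European pediatric inpatients: 35.4% antibiotic use among n = 2760
lo, hi = proportion_ci(0.354, 2760)   # reproduces the reported 33.6-37.2%
```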


The development of diverse hardware platforms is proceeding on two fronts: on one side, the push for exascale performance for big data processing and management; on the other, mobile and embedded devices for data collection and human-machine interaction. This has driven a highly hierarchical evolution of programming models. GVirtuS is a general virtualization system, developed in 2009 and first introduced in 2010, that enables a completely transparent layer between GPUs and VMs. This paper presents the latest achievements and developments of GVirtuS, which now supports CUDA 6.5, memory management, and scheduling. Thanks to the new and improved remoting capabilities, GVirtuS now enables GPU sharing among physical and virtual machines based on x86 and ARM CPUs, on local workstations, computing clusters, and distributed cloud appliances.