88 results for lifetime-based garbage collection
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications. Examples of such applications are network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on the fly or containing conflicting information, and deal with rapidly changing contexts while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered. Events are no longer instantaneous; instead, a duration is associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is argued to provide a faster means of converging time and is hence better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty in event detection. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network.
The scheme taps into in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
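The abstract names belief theory but does not give the fusion rule. As one hedged illustration of belief-theoretic consensus among participating entities, Dempster's rule of combination can merge per-entity belief masses about a composite event; the frame, entity reports, and mass values below are invented for illustration, not taken from the dissertation.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset of hypotheses -> mass)
    via Dempster's rule, renormalizing away conflicting mass."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict  # non-conflicting mass used for normalization
    return {s: v / k for s, v in combined.items()}

# Two network entities report belief about composite event E over {E, notE}
E, NE = frozenset({"E"}), frozenset({"notE"})
THETA = E | NE  # total ignorance
m1 = {E: 0.7, THETA: 0.3}           # entity 1: fairly confident E occurred
m2 = {E: 0.6, NE: 0.1, THETA: 0.3}  # entity 2: weaker, slightly conflicting
fused = dempster_combine(m1, m2)    # consensus belief after fusion
```

Fusing the two reports concentrates mass on E while discounting the small amount of conflicting evidence, which mirrors the consensus-with-conflict behavior the scheme requires.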
Abstract:
The increasing need for computational power in areas such as weather simulation, genomics, and Internet applications has led to the sharing of geographically distributed and heterogeneous resources from commercial data centers and scientific institutions. Research in the areas of utility, grid, and cloud computing, together with improvements in network and hardware virtualization, has resulted in methods to locate and use resources to rapidly provision virtual environments in a flexible manner, while lowering costs for consumers and providers. However, there is still a lack of methodologies to enable efficient and seamless sharing of resources among institutions. In this work, we concentrate on the problem of executing parallel scientific applications across distributed resources belonging to separate organizations. Our approach can be divided into three main parts. First, we define and implement an interoperable grid protocol to distribute job workloads among partners with different middleware and execution resources. Second, we research and implement different policies for virtual resource provisioning and job-to-resource allocation, taking advantage of their cooperation to improve execution cost and performance. Third, we explore the consequences of on-demand provisioning and allocation for the problem of site selection for the execution of parallel workloads, and propose new strategies to reduce job slowdown and overall cost.
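To make the site-selection trade-off between cost and slowdown concrete, here is a minimal greedy sketch; the cost model, field names, and prices are hypothetical and stand in for the dissertation's actual provisioning and allocation policies, which are richer than this.

```python
def select_site(job_cpus, job_hours, sites):
    """Greedy site selection: among sites with enough free CPUs, pick the
    one minimizing estimated monetary cost plus a queue-wait penalty
    (a hypothetical cost model, for illustration only)."""
    feasible = [s for s in sites if s["free_cpus"] >= job_cpus]
    if not feasible:
        return None  # no site can host the job right now
    def estimated_cost(s):
        run_cost = job_hours * job_cpus * s["price_per_cpu_hour"]
        wait_penalty = s["queue_wait_hours"]  # proxy for job slowdown
        return run_cost + wait_penalty
    return min(feasible, key=estimated_cost)

sites = [
    {"name": "siteA", "free_cpus": 64, "price_per_cpu_hour": 0.10, "queue_wait_hours": 2.0},
    {"name": "siteB", "free_cpus": 16, "price_per_cpu_hour": 0.05, "queue_wait_hours": 0.5},
]
best = select_site(8, 4, sites)  # an 8-CPU, 4-hour parallel job
```

Even this toy model shows why cooperation matters: a cheaper partner site with a short queue can beat a larger but busier one, reducing both job slowdown and overall cost.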
Abstract:
With hundreds of millions of users reporting locations and embracing mobile technologies, Location Based Services (LBSs) are raising new challenges. In this dissertation, we address three emerging problems in location services, where geolocation data plays a central role. First, to handle the unprecedented growth of generated geolocation data, existing location services rely on geospatial database systems. However, their inability to leverage combined geographical and textual information in analytical queries (e.g. spatial similarity joins) remains an open problem. To address this, we introduce SpsJoin, a framework for computing spatial set-similarity joins. SpsJoin handles combined similarity queries that involve textual and spatial constraints simultaneously. LBSs use this system to tackle different types of problems, such as deduplication, geolocation enhancement and record linkage. We define the spatial set-similarity join problem in a general case and propose an algorithm for its efficient computation. Our solution utilizes parallel computing with MapReduce to handle scalability issues in large geospatial databases. Second, applications that use geolocation data are seldom concerned with ensuring the privacy of participating users. To motivate participation and address privacy concerns, we propose iSafe, a privacy preserving algorithm for computing safety snapshots of co-located mobile devices as well as geosocial network users. iSafe combines geolocation data extracted from crime datasets and geosocial networks such as Yelp. In order to enhance iSafe's ability to compute safety recommendations, even when crime information is incomplete or sparse, we need to identify relationships between Yelp venues and crime indices at their locations. To achieve this, we use SpsJoin on two datasets (Yelp venues and geolocated businesses) to find venues that have not been reviewed and to further compute the crime indices of their locations. 
Our results show a statistically significant dependence between location crime indices and Yelp features. Third, review-centered LBSs (e.g., Yelp) are increasingly becoming targets of malicious campaigns that aim to bias the public image of represented businesses. Although Yelp actively attempts to detect and filter fraudulent reviews, our experiments showed that Yelp is still vulnerable. Fraudulent LBS information also impacts the ability of iSafe to provide correct safety values. We take steps toward addressing this problem by proposing SpiDeR, an algorithm that takes advantage of the richness of information available in Yelp to detect abnormal review patterns. We propose a fake-venue detection solution that applies SpsJoin on Yelp and U.S. housing datasets. We validate the proposed solutions using ground-truth data extracted by our experiments and reviews filtered by Yelp.
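The core operation behind SpsJoin, a join constrained simultaneously by textual set similarity and spatial proximity, can be sketched with a naive nested loop; the real system uses MapReduce with filtering for scalability, and the thresholds and records below are illustrative only.

```python
import math

def jaccard(a, b):
    """Set-similarity measure over token sets (1.0 for two empty sets)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def sps_join(r_recs, s_recs, sim_t=0.5, dist_t=0.5):
    """Naive spatial set-similarity join: emit (r_id, s_id) pairs whose
    locations are within dist_t AND whose token sets have Jaccard >= sim_t.
    Records are (id, tokens, x, y) tuples."""
    out = []
    for rid, rtoks, rx, ry in r_recs:
        for sid, stoks, sx, sy in s_recs:
            if math.hypot(rx - sx, ry - sy) <= dist_t and \
               jaccard(rtoks, stoks) >= sim_t:
                out.append((rid, sid))
    return out

venues = [(1, ["joes", "pizza"], 25.77, -80.19)]
records = [(9, ["joes", "pizza", "inc"], 25.77, -80.19),
           (10, ["cafe"], 26.10, -80.10)]
pairs = sps_join(venues, records)  # only the nearby, similarly named pair
```

Applications such as deduplication and record linkage fall out directly: matched pairs are candidate duplicates, and the spatial constraint prunes same-name businesses in different cities.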
Abstract:
A prototype three-dimensional (3D) anode, based on multiwall carbon nanotubes (MWCNTs), for Li-ion batteries (LIBs) with potential use in electric vehicles (EVs) was investigated. The unique 3D design of the anode allowed a much higher areal mass density of MWCNTs as active material, resulting in greater Li+ ion uptake compared to a conventional 2D counterpart. Furthermore, the 3D amorphous Si/MWCNT hybrid structure offered an enhanced electrochemical response (specific capacity 549 mAh/g). An anode stack was also fabricated to further increase the areal and volumetric mass density of MWCNTs. An areal mass density of 34.9 mg/cm2 was attained for the anode stack, 1,342% of the single-layer value of 2.6 mg/cm2. Furthermore, the binder-assisted and hot-pressed anode stack yielded average reversible, stable gravimetric and volumetric specific capacities of 213 mAh/g and 265 mAh/cm3, respectively (at 0.5C). Moreover, a large-scale patterned, flexible 3D MWCNT-graphene-polyethylene terephthalate (PET) anode structure was prepared. It generated a reversible specific capacity of 153 mAh/g at 0.17C and cycling stability of 130 mAh/g over 50 cycles at 1.7C.
Abstract:
The brain is one of the safe sanctuaries for HIV and, in turn, continuously supplies active virus to the periphery. Additionally, HIV infection in the brain results in several mild-to-severe neuro-immunological complications termed neuroAIDS. One-tenth of the HIV-infected population is addicted to recreational drugs such as opiates, alcohol, nicotine, and marijuana, which share common target areas in the brain with HIV. Interestingly, the intensity of neuropathogenesis is remarkably enhanced by exposure to recreational drugs during HIV infection. Current treatments to alleviate either the individual or the synergistic effects of abusive drugs and HIV on neuronal modulation are less effective at the CNS level, largely due to the impermeability of therapeutic molecules across the blood-brain barrier (BBB). Despite exciting advances in nanotechnology for drug delivery, existing nanovehicles such as dendrimers, polymers, and micelles suffer from a lack of adequate BBB penetrability before the drugs are engulfed by reticuloendothelial system cells, as well as from uncertainty about if and when the nanocarrier reaches the brain. Therefore, in order to develop a fast, target-specific, safe, and effective approach for brain delivery of anti-addiction, anti-viral, and neuroprotective drugs, we exploited the potential of magnetic nanoparticles (MNPs), which in recent years have attracted significant attention in biomedical applications. We hypothesize that, under the influence of an external (non-invasive) magnetic force, MNPs can deliver these drugs across the BBB in a most effective manner. Accordingly, in this dissertation, I delineated the pharmacokinetics and dynamics of MNP-bound anti-opioid, anti-HIV, and neuroprotective drugs for delivery to the brain. I have developed a novel liposome-based magnetized nanovehicle which, under the influence of external magnetic forces, can transmigrate and effectively deliver drugs across the BBB without compromising its integrity.
It is expected that the developed nanoformulations may be of high therapeutic significance for neuroAIDS and for drug addiction as well.
Abstract:
The presence of heavy metals, organic contaminants, and natural toxins in natural water bodies poses a serious threat to the environment and the health of living organisms. Therefore, there is a critical need to identify sustainable and environmentally friendly water treatment processes. In this dissertation, I focus on fundamental studies of advanced oxidation processes and magnetic nanomaterials as promising new technologies for water treatment. Advanced oxidation processes employ reactive oxygen species (ROS), which can lead to the mineralization of a number of pollutants and toxins. The rates of formation, steady-state concentrations, and kinetic parameters of hydroxyl radical and singlet oxygen produced by various TiO2 photocatalysts under UV or visible irradiation were measured using selective chemical probes. Hydroxyl radical is the dominant ROS, and its generation depends on experimental conditions. The optimal conditions for the generation of hydroxyl radical by TiO2-coated glass microspheres were determined by response surface methodology and applied to the degradation of dimethyl phthalate. Singlet oxygen (1O2) also plays an important role in advanced oxidation processes, so the degradation of microcystin-LR (MC-LR) by rose bengal, a 1O2 sensitizer, was studied. The measured bimolecular reaction rate constant between MC-LR and 1O2 is ∼10^6 M^-1 s^-1, based on competition kinetics with furfuryl alcohol. A typical adsorbent requires separation after treatment, whereas magnetic iron oxides can be easily removed by a magnetic field. Maghemite and humic acid-coated magnetite (HA-Fe3O4) were synthesized, characterized, and applied for chromium(VI) removal. The adsorption of chromium(VI) by maghemite and HA-Fe3O4 follows a pseudo-second-order kinetic process. The adsorption of chromium(VI) by maghemite is accurately modeled using adsorption isotherms, and solution pH and the presence of humic acid influence adsorption.
Humic acid-coated magnetite can adsorb chromium(VI) and reduce it to non-toxic chromium(III), and the reaction is not highly dependent on solution pH. The functional groups associated with humic acid act as ligands, leading to Cr(III) complexes via a coupled reduction-complexation mechanism. Extended X-ray absorption fine structure spectroscopy demonstrates that the Cr(III) in the Cr-loaded HA-Fe3O4 materials has six neighboring oxygen atoms in an octahedral geometry with an average bond length of 1.98 Å.
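For reference, the pseudo-second-order kinetic model invoked above is conventionally written as follows (standard textbook form; the fitted parameter values for these particular adsorbents are not reproduced here):

```latex
\frac{dq_t}{dt} = k_2\,(q_e - q_t)^2
\qquad\Longrightarrow\qquad
\frac{t}{q_t} = \frac{1}{k_2\,q_e^{2}} + \frac{t}{q_e}
```

where $q_t$ and $q_e$ are the amounts adsorbed at time $t$ and at equilibrium (mg/g) and $k_2$ is the pseudo-second-order rate constant; a linear plot of $t/q_t$ versus $t$ is the usual evidence that the process follows this model.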
Abstract:
A major portion of hurricane-induced economic loss originates from damage to building structures. Such damage is typically grouped into three main categories: exterior, interior, and contents damage. Although the latter two types of damage in most cases cause more than 50% of the total loss, little has been done to investigate the physical damage process and unveil the interdependence of interior damage parameters. Building interior and contents damage is mainly due to wind-driven rain (WDR) intrusion through building envelope defects, breaches, and other functional openings. The limited research and resulting knowledge gaps are in large part due to the complexity of damage phenomena during hurricanes and the lack of established measurement methodologies to quantify rainwater intrusion. This dissertation focuses on devising methodologies for large-scale experimental simulation of tropical cyclone WDR and measurement of rainwater intrusion, to acquire benchmark test-based data for the development of a hurricane-induced building interior and contents damage model. Target WDR parameters derived from tropical cyclone rainfall data were used to simulate WDR characteristics at the Wall of Wind (WOW) facility. The proposed WDR simulation methodology presents detailed procedures for selecting the type and number of nozzles, formulated based on a tropical cyclone WDR study. The simulated WDR was later used to experimentally investigate the mechanisms of rainwater deposition/intrusion in buildings. A test-based dataset of two rainwater intrusion parameters that quantify the distribution of directly impinging raindrops and surface-runoff rainwater over the building surface, namely the rain admittance factor (RAF) and the surface runoff coefficient (SRC), was developed using common shapes of low-rise buildings.
The dataset was applied to a newly formulated WDR estimation model to predict the volume of rainwater ingress through envelope openings such as wall and roof deck breaches and window sill cracks. Validation of the new model using experimental data indicated reasonable estimation of rainwater ingress through envelope defects and breaches during tropical cyclones. The WDR estimation model and the experimental dataset of WDR parameters developed in this dissertation can be used to enhance the prediction capabilities of existing interior damage models such as the Florida Public Hurricane Loss Model (FPHLM).
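One plausible reading of how the two measured parameters feed an ingress estimate (a hedged sketch, not the dissertation's exact formulation): with wind-driven rain intensity $R_{\mathrm{wdr}}$, RAF scales the directly impinging rain reaching a deficient envelope area $A_d$, while SRC scales the surface runoff arriving at it from the contributing area $A_r$ above:

```latex
Q_{\mathrm{ingress}} \;\approx\;
\underbrace{\mathrm{RAF}\cdot R_{\mathrm{wdr}}\cdot A_d}_{\text{direct impingement}}
\;+\;
\underbrace{\mathrm{SRC}\cdot R_{\mathrm{wdr}}\cdot A_r}_{\text{surface runoff}}
```

Any such decomposition makes explicit why both parameters must be measured separately: an opening low on a wall can receive far more runoff-driven water than impinging rain alone would suggest.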
Abstract:
Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect of the design of practical systems, needed to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of the study, a proper model, including all the details of the components, was required. Therefore, advances in EMC modeling were surveyed, classifying analytical and numerical models. The selected approach was finite element (FE) modeling, coupled with the distributed network method, to generate models of the converter's components and obtain the frequency behavioral model of the converter. The method has the ability to reveal the behavior of parasitic elements and higher resonances, which have critical impacts on EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovations in equivalent source modeling were made to decrease the simulation time dramatically. Several models were designed in this study, and the voltage-current cube model and the wire model gave the best results. A GA-based PSO method is used as the optimization process. Superposition and suppression of the fields when coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. Identification was implemented using an ANN for seventy different fault cases.
The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray-field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
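The optimization step above names a GA-based PSO. As a point of reference, here is a minimal plain particle swarm optimizer; the dissertation's variant adds genetic operators not shown here, and the sphere objective stands in for the actual equivalent-source fitting objective.

```python
import random

random.seed(1)  # reproducible demo run

def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization: each particle tracks its
    personal best; the swarm tracks a global best; velocities blend
    inertia, cognitive pull, and social pull."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the sphere function as a stand-in for the model-fitting objective
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```

In a GA-based hybrid, crossover and mutation steps are typically interleaved with the velocity updates to keep swarm diversity, which is what makes the combined method attractive for multimodal field-fitting problems.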
Abstract:
Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks).
In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of these integrity properties during execution.
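To make "detecting malware by invariant violation" concrete, here is a toy sketch of one classic kernel data invariant: in a doubly linked list, every node reachable forward must satisfy node.next.prev == node. DKOM-style rootkits that hide a process by unlinking it from the forward chain break exactly this property. The classes and the unlinking step below are illustrative, not the thesis's actual kernel structures.

```python
class Proc:
    """Toy stand-in for a kernel process descriptor in a doubly linked list."""
    def __init__(self, pid):
        self.pid = pid
        self.next = None
        self.prev = None

def link(procs):
    """Wire up forward and backward pointers for a list of descriptors."""
    for a, b in zip(procs, procs[1:]):
        a.next, b.prev = b, a

def check_list_invariant(head):
    """Data-invariant check in the spirit of an integrity monitor:
    every forward link must be mirrored by the matching back link."""
    node = head
    while node and node.next:
        if node.next.prev is not node:
            return False  # invariant violated: someone tampered with the list
        node = node.next
    return True

procs = [Proc(pid) for pid in (1, 2, 3)]
link(procs)
ok_before = check_list_invariant(procs[0])

# Simulate a rootkit hiding pid 2 by unlinking it from the forward chain only
procs[0].next = procs[2]
ok_after = check_list_invariant(procs[0])
```

A periodic sweep of such checks over kernel memory flags tampering without knowing anything about the specific malware, which is the point of integrity-based detection over misuse-based signatures.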
Abstract:
Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires exploration of a performance-based methodology for Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of an appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, the Davie Landfill was identified as the case study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used for the assessment of site conditions to project PCC needs. The available data on leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater in reference to maximum contaminant levels (MCLs). In addition, the projected gas quantity was estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment.
These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were proposed. Based on the results of the PCC performance evaluation integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above MCLs and surveying of cap integrity should be continued. The parameters that cause longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
Abstract:
Elemental analysis can become an important piece of evidence assisting in the solution of a case. The work presented in this dissertation aims to evaluate the evidential value of the elemental composition of three particular matrices: ink, paper, and glass. In the first part of this study, the analytical performance of LIBS and LA-ICP-MS methods was evaluated for paper, writing inks, and printing inks. A total of 350 ink specimens were examined, including black and blue gel inks, ballpoint inks, inkjets, and toners originating from several manufacturing sources and/or batches. The paper collection set consisted of over 200 paper specimens originating from 20 different paper sources produced by 10 different plants. Micro-homogeneity studies show smaller variation of elemental composition within a single source (i.e., sheet, pen, or cartridge) than the observed variation between different sources (i.e., brands, types, batches). Significant and detectable differences in the elemental profiles of the inks and paper were observed between samples originating from different sources (discrimination of 87-100% of samples, depending on the sample set under investigation and the method applied). These results support the use of elemental analysis, using LA-ICP-MS and LIBS, for the examination of documents, and provide additional discrimination to the techniques currently used in document examination. In the second part of this study, a direct comparison between four analytical methods (µ-XRF, solution-ICP-MS, LA-ICP-MS, and LIBS) was conducted for glass analysis using interlaboratory studies.
The data provided by 21 participants were used to assess the performance of the analytical methods in associating glass samples from the same source and differentiating samples from different sources, as well as the use of different match criteria: confidence intervals (±6s, ±5s, ±4s, ±3s, ±2s), a modified confidence interval, t-tests (sequential univariate, p=0.05 and p=0.01), a t-test with Bonferroni correction (for multivariate comparisons), range overlap, and Hotelling's T2 test. Error rates (Type 1 and Type 2) are reported for each of these match criteria and depend on the heterogeneity of the glass sources, the repeatability of the analytical measurements, and the number of elements measured. The study provided recommendations for analytical performance-based parameters for µ-XRF and LA-ICP-MS, as well as the best-performing match criteria for both analytical techniques, which can now be applied by forensic glass examiners.
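Among the interval-based match criteria compared above, the ±4s rule is straightforward to state in code: for every measured element, the questioned sample's mean must fall within the known sample's mean plus or minus four standard deviations. The element names and concentration values below are made up for illustration.

```python
import statistics

def match_4s(known, questioned):
    """±4s interval match criterion: return True only if, for every
    element, the questioned mean lies within known mean ± 4*stdev.
    known/questioned map element name -> list of replicate measurements."""
    for elem, reps in known.items():
        mean = statistics.mean(reps)
        s = statistics.stdev(reps)
        q_mean = statistics.mean(questioned[elem])
        if not (mean - 4 * s <= q_mean <= mean + 4 * s):
            return False  # exclusion on any single element fails the match
    return True

# Hypothetical replicate measurements (arbitrary concentration units)
known = {"Sr": [88.1, 88.4, 88.2], "Zr": [40.0, 40.3, 40.1]}
same_src = {"Sr": [88.3, 88.2], "Zr": [40.2, 40.1]}
diff_src = {"Sr": [95.0, 94.8], "Zr": [40.1, 40.2]}
```

Widening the interval (±5s, ±6s) trades Type 1 errors (false exclusions) for Type 2 errors (false inclusions), which is exactly the trade-off the reported error rates quantify.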
Abstract:
The overall purpose of this collected-papers dissertation was to examine the utility of a cognitive apprenticeship-based instructional coaching (CAIC) model for improving the science teaching efficacy beliefs (STEB) of preservice and inservice elementary teachers. Many of these teachers perceive science as a difficult subject and feel inadequately prepared to teach it. However, teacher efficacy beliefs have been noted as the strongest indicator of teacher quality, the variable most highly correlated with student achievement outcomes. The literature is scarce on strong, evidence-based theoretical models for improving STEB. This dissertation comprises two studies. STUDY #1 was a sequential explanatory mixed-methods study investigating the impact of a reformed CAIC elementary science methods course on the STEB of 26 preservice teachers. Data were collected using the Science Teaching Efficacy Belief Instrument (STEBI-B) and from six post-course interviews. A statistically significant increase in STEB was observed in the quantitative strand. The qualitative data suggested that the preservice teachers perceived all of the CAIC methods as influential, but the significance of each method depended on their unique needs and abilities. STUDY #2 was a participatory action research case study exploring the utility of a CAIC professional development program for improving the STEB of five Bahamian inservice teachers and their competency in implementing an inquiry-based curriculum. Data were collected from pre- and post-interviews and two focus group interviews. Overall, the inservice teachers perceived the intervention as highly effective. The scaffolding and coaching were the CAIC methods portrayed as most influential in developing their STEB, highlighting the importance of interpersonal relationships in successful instructional coaching programs.
The teachers also described the CAIC approach as integral in supporting their learning to implement the new inquiry-based curriculum. The overall findings hold important implications for science education reform, including the potential to influence how preservice teacher training and inservice teacher professional development in science are perceived and implemented. Additionally, given the noteworthy results obtained over relatively short durations, CAIC interventions may also provide an effective means of improving preservice and inservice teachers' STEB more expeditiously than traditional approaches.
Abstract:
The examination of workplace aggression as a global construct conceptualization has gained considerable attention over the past few years as organizations work to better understand and address the occurrence and consequences of this challenging construct. The purpose of this dissertation is to build on previous efforts to validate the appropriateness and usefulness of a global conceptualization of the workplace aggression construct. This dissertation is divided into two parts: Part 1 utilized a confirmatory factor analysis approach to assess the existence of workplace aggression as a global construct; Part 2 utilized a series of correlational analyses to examine the relationship between a selection of commonly experienced individual strain-based outcomes and the global construct conceptualization assessed in Part 1. Participants were a diverse sample of 219 working individuals from Amazon's Mechanical Turk participant pool. Results of Part 1 did not show support for a one-factor global construct conceptualization of workplace aggression. However, support was shown for a higher-order five-factor model, suggesting that it may be possible to conceptualize workplace aggression as an overarching construct made up of separate workplace aggression constructs. Results of Part 2 showed support for the relationships between an existing global workplace aggression conceptualization and a series of strain-based outcomes. Utilizing correlational analyses, additional post-hoc analyses showed that individual factors such as emotional intelligence and personality are related to the experience of workplace aggression.
Further, utilizing moderated regression analysis, the results demonstrated that individuals experiencing high levels of workplace aggression reported higher job satisfaction when they felt strongly that the aggressive act was highly visible and, similarly, when they felt that there was a clear intent to cause harm. Overall, the findings of this dissertation support the need to simplify the current measurement of workplace aggression. Future research should continue to examine workplace aggression in an effort to shed additional light on the structure and usefulness of this complex construct.