990 results for Android (Electronic resource)
Abstract:
With the development of electronic devices, more and more mobile clients are connected to the Internet, and they generate massive amounts of data every day. We live in an age of "Big Data", generating data on the order of hundreds of millions of records daily. By analyzing these data and making predictions, we can produce better development plans. Because traditional computation frameworks cannot meet this demand, Hadoop was put forward. The paper first introduces the background and development status of Hadoop, compares MapReduce in Hadoop 1.0 with YARN in Hadoop 2.0, and analyzes the advantages and disadvantages of each. Because resource management is the core role of YARN, the paper then studies the resource allocation module, including resource management, the resource allocation algorithm, the resource preemption model, and the whole resource scheduling process from requesting resources to completing allocation. It also introduces and compares the FIFO Scheduler, Capacity Scheduler, and Fair Scheduler. The main work of this paper is to study and analyze YARN's Dominant Resource Fairness (DRF) algorithm and to propose a maximum-resource-utilization algorithm based on DRF. The paper also suggests an improvement to an unreasonable aspect of the resource preemption model. Emphasizing "fairness" during resource allocation is the core concept of YARN's DRF algorithm. Because a cluster serves multiple users and multiple resource types, each user's resource request is also multi-dimensional. The DRF algorithm divides a user's requested resources into a dominant resource and normal resources: for a given user, the dominant resource is the one whose share of the cluster is highest among all requested resources, and the others are normal resources. DRF requires the dominant resource shares of all users to be equal. But in cases where users' dominant resource demands differ greatly, emphasizing "fairness" is not suitable and cannot improve the resource utilization of the cluster. By analyzing these cases, this thesis proposes a new allocation algorithm based on DRF. The new algorithm still takes "fairness" into consideration, but it is no longer the main principle; maximizing resource utilization is the main principle and goal. Comparing the results of DRF and the new DRF-based algorithm shows that the new algorithm achieves higher resource utilization than DRF. The last part of the thesis sets up a YARN environment and uses the Scheduler Load Simulator (SLS) to simulate a cluster environment.
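The allocation policy described above can be illustrated with a short sketch. The following Python fragment is not YARN code; the cluster capacities and per-task demands are invented values used only to show how DRF repeatedly grants resources to the user whose dominant share is currently lowest.

    # Minimal sketch of Dominant Resource Fairness (DRF); values are illustrative.
    capacity = {"cpu": 9.0, "mem": 18.0}              # total cluster resources
    demands = {"A": {"cpu": 1.0, "mem": 4.0},         # per-task demand of user A
               "B": {"cpu": 3.0, "mem": 1.0}}         # per-task demand of user B

    allocated = {u: {r: 0.0 for r in capacity} for u in demands}
    used = {r: 0.0 for r in capacity}

    def dominant_share(user):
        # The dominant share is the largest fraction of any one resource
        # type that the user currently holds.
        return max(allocated[user][r] / capacity[r] for r in capacity)

    def fits(user):
        return all(used[r] + demands[user][r] <= capacity[r] for r in capacity)

    while True:
        candidates = [u for u in demands if fits(u)]
        if not candidates:
            break
        u = min(candidates, key=dominant_share)       # serve lowest dominant share
        for r in capacity:
            allocated[u][r] += demands[u][r]
            used[r] += demands[u][r]

    for u in demands:
        print(u, allocated[u], "dominant share = %.2f" % dominant_share(u))

With these numbers, user A (memory-dominant) ends up with three tasks and user B (CPU-dominant) with two, and both dominant shares converge near 2/3 - the equal-dominant-share outcome that the thesis takes as its baseline before relaxing fairness in favor of utilization.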
Abstract:
Nowadays there is almost no crime committed without a trace of digital evidence, and since the advanced functionality of today's mobile devices can be exploited to assist in crime, the need for mobile forensics is imperative. Many of the mobile applications available today, including internet browsers, request the user's permission to access their current location when in use. This geolocation data is subsequently stored and managed by that application's underlying database files. If recovered from a device during a forensic investigation, such GPS evidence and track points could hold major evidentiary value for a case. The aim of this paper is to examine and compare the extent to which geolocation data is available from the iOS and Android operating systems. We focus particularly on geolocation data recovered from internet browsing applications, comparing the native Safari and Browser apps with Google Chrome, downloaded onto both platforms. All browsers were used over a period of several days at various locations to generate comparable test data for analysis. Results show considerable differences not only in the storage locations and formats, but also in the amount of geolocation data stored by different browsers and on different operating systems.
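Since browser geolocation records of the kind examined here commonly reside in SQLite database files, a generic triage script can help locate candidate tables in a recovered file. The sketch below is an illustration only: the file name and the latitude/longitude column names are assumptions, not the actual Safari, Browser, or Chrome schemas analyzed in the paper.

    # Scan a recovered SQLite file for tables that look like geolocation stores.
    import sqlite3

    DB_PATH = "recovered_browser_data.db"   # hypothetical path to an extracted file

    con = sqlite3.connect(DB_PATH)
    cur = con.cursor()
    cur.execute("SELECT name FROM sqlite_master WHERE type='table'")
    for (table,) in cur.fetchall():
        cols = {row[1].lower() for row in cur.execute(f"PRAGMA table_info({table})")}
        if {"latitude", "longitude"} <= cols:          # candidate GPS track table
            print("possible geolocation table:", table)
            for row in cur.execute(f"SELECT * FROM {table} LIMIT 5"):
                print("  ", row)                       # sample rows for manual review
    con.close()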
Abstract:
Despite efforts to better manage biosolids field application programs, biosolids managers still lack efficient and reliable tools to apply large quantities of material while avoiding odor complaints. The objectives of this research were to determine the capabilities of an electronic nose in supporting process monitoring of biosolids production and to compare the odor characteristics of biosolids produced through thermal-hydrolysis anaerobic digestion (TH-AD) with those of alkaline stabilization in the plant, under storage, and in the field. A method to quantify key odorants was developed, and full-scale sampling and laboratory simulations were performed. The portable electronic nose (PEN3) was tested for its ability to distinguish alkali dosages in the biosolids production process. The frequency of recognition of unknown samples was tested, achieving a highest accuracy of 81.1%. This work exposed the need for a different and more sensitive electronic nose to ensure its applicability at full scale for this process. GC-MS results were consistent with those reported in the literature and helped to elucidate the behavior of the pattern recognition of the PEN3. Odor characterization of TH-AD and alkaline stabilized biosolids was achieved using olfactometry measurements and GC-MS. The dilution-to-threshold of TH-AD biosolids increased under storage conditions, but no correlation was found with the target compounds. The presence of furan and three methylated homologues in TH-AD biosolids was reported for the first time, suggesting that these compounds are produced during the thermal hydrolysis process; however, additional research is needed to fully describe the formation of these compounds and the increase in odors. Alkaline stabilized biosolids showed similar odor concentrations that did not increase under storage, but the 'fishy' odor from trimethylamine emissions made them more offensive and unpleasant than TH-AD biosolids. Alkaline stabilized biosolids showed a spike in sulfur compounds and trimethylamine after 3 days of field application when the alkali addition was not sufficient to meet regulatory standards. Concentrations of target compounds from field application of TH-AD biosolids gradually decreased to below the odor threshold after 3 days. This work increased the scientific understanding of the odor characteristics and behavior of two types of biosolids and of the application of electronic noses in environmental engineering.
Abstract:
This phenomenological study explored how HR professionals who identified themselves as facilitators of strategic HRD (SHRD) perceived the experience of being an organizational agent-downsizing survivor. Criterion and snowball sampling were used to recruit 15 participants for this study. A semi-structured interview guide was used to interview participants. Creswell's (2007) simplified version of Moustakas's (1994) Modification of the Stevick-Colaizzi-Keen Method of Analysis of Phenomenological Data was used to analyze the data. Four main themes and corresponding sub-themes emerged from an inductive data analysis. The four main themes were a) the emotionality of downsizing, b) feeling responsible, c) choice and control, and d) possibilities for growth. Participants perceived downsizing as an emotional organizational change event that required them to manage their own emotions while helping others do the same. They performed their roles within an organizational atmosphere that was perceived as chaotic and filled with apprehension, shock, and a sense of ongoing loss, sadness, and grieving. They sometimes experienced guilt and doubt and felt deceptive for having to keep secrets from others when planning for downsizing. Participants felt a strong sense of responsibility to protect employees emotionally, balance employee and organizational interests, and try to ensure the best outcomes for both. Often, being there for others meant that they put on their game faces and took care of themselves last. Participants spoke of the importance of choosing one's attitude, being proactive rather than reactive, and finding ways to regain control in the midst of organizational crisis. They also perceived that, although downsizing was emotionally difficult to go through, it provided possibilities for self, employee, and organizational growth.
Abstract:
The increasing need for computational power in areas such as weather simulation, genomics, and Internet applications has led to the sharing of geographically distributed and heterogeneous resources from commercial data centers and scientific institutions. Research in the areas of utility, grid, and cloud computing, together with improvements in network and hardware virtualization, has resulted in methods to locate and use resources to rapidly provision virtual environments in a flexible manner, while lowering costs for consumers and providers. However, there is still a lack of methodologies to enable efficient and seamless sharing of resources among institutions. In this work, we concentrate on the problem of executing parallel scientific applications across distributed resources belonging to separate organizations. Our approach can be divided into three main parts. First, we define and implement an interoperable grid protocol to distribute job workloads among partners with different middleware and execution resources. Second, we study and implement different policies for virtual resource provisioning and job-to-resource allocation, taking advantage of their cooperation to improve execution cost and performance. Third, we explore the consequences of on-demand provisioning and allocation on the problem of site selection for the execution of parallel workloads, and propose new strategies to reduce job slowdown and overall cost.
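As a rough illustration of the third point, a site-selection policy can trade estimated slowdown against monetary cost and fall back to on-demand provisioning when no site looks acceptable. The Python sketch below is a toy heuristic with invented weights and numbers, not the strategies evaluated in this work.

    # Toy site-selection heuristic for a parallel job (illustrative only).
    from dataclasses import dataclass

    @dataclass
    class Site:
        name: str
        est_wait_s: float          # estimated queue wait for this job
        cost_per_cpu_hour: float   # 0.0 for owned grid resources
        free_cpus: int

    def score(site, job_cpus, runtime_s, w_time=0.7, w_cost=0.3):
        if site.free_cpus < job_cpus:
            return float("inf")                        # job cannot start here now
        slowdown = (site.est_wait_s + runtime_s) / runtime_s
        cost = site.cost_per_cpu_hour * job_cpus * runtime_s / 3600.0
        return w_time * slowdown + w_cost * cost

    def select_site(sites, job_cpus, runtime_s, max_score=10.0):
        best = min(sites, key=lambda s: score(s, job_cpus, runtime_s))
        if score(best, job_cpus, runtime_s) > max_score:
            return "provision-on-demand"               # no site acceptable: rent VMs
        return best.name

    sites = [Site("grid-A", 600.0, 0.0, 64), Site("cloud-B", 30.0, 0.09, 256)]
    print(select_site(sites, job_cpus=32, runtime_s=1800.0))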
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they are actually experiencing; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
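The general idea of model-driven VM sizing can be sketched as follows: fit a regression model (here a Support Vector Machine regressor, one of the two tools named above) on measured performance versus resource allocation, then pick the cheapest candidate size whose predicted performance meets the SLA. The training data, cost function, and parameters below are synthetic placeholders, not the thesis's models.

    # Sketch of performance-model-driven VM sizing with an SVM regressor.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # (cpu_shares, mem_gb) -> measured response time in ms (synthetic data).
    X = np.array([[1, 1], [1, 2], [2, 2], [2, 4], [4, 4], [4, 8], [8, 8]])
    y = np.array([420.0, 350.0, 260.0, 210.0, 150.0, 120.0, 90.0])

    model = make_pipeline(StandardScaler(), SVR(C=100.0, epsilon=5.0))
    model.fit(X, y)

    SLA_MS = 200.0
    candidates = [(c, m) for c in (1, 2, 4, 8) for m in (1, 2, 4, 8)]
    feasible = [(c, m) for c, m in candidates
                if model.predict([[c, m]])[0] <= SLA_MS]
    if feasible:
        # "Cheapest" under a toy price of one unit per CPU plus 0.5 per GB.
        best = min(feasible, key=lambda cm: cm[0] + 0.5 * cm[1])
        print("smallest SLA-compliant size (cpu, mem_gb):", best)
    else:
        print("no candidate size meets the SLA")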
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization, and the resource management of such services are becoming increasingly important to deliver the user-desired Quality of Service. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-sourced Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze the spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands 18.91% more accurately and allocate resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% of resource usage compared to traditional peak-load-based resource allocation.
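A Top-k Spatial Boolean Query of the kind sksOpen serves combines a keyword predicate with a nearest-neighbor ranking. The sketch below shows the semantics with a linear scan over invented data; it is not the sksOpen engine, which relies on proper spatial indexing rather than scanning.

    # Semantics of a top-k spatial Boolean query (illustrative linear scan).
    import heapq
    import math

    objects = [
        {"id": 1, "lat": 25.76, "lon": -80.19, "tags": {"restaurant", "cuban"}},
        {"id": 2, "lat": 25.79, "lon": -80.13, "tags": {"hotel", "beach"}},
        {"id": 3, "lat": 25.77, "lon": -80.20, "tags": {"restaurant", "seafood"}},
    ]

    def matches(tags, must_have, must_not):
        return must_have <= tags and not (must_not & tags)

    def top_k(objs, qlat, qlon, must_have, must_not=frozenset(), k=2):
        def dist(o):                                   # planar approximation
            return math.hypot(o["lat"] - qlat, o["lon"] - qlon)
        hits = (o for o in objs if matches(o["tags"], must_have, must_not))
        return heapq.nsmallest(k, hits, key=dist)

    print(top_k(objects, 25.76, -80.19, must_have={"restaurant"}, k=2))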
Abstract:
Two single-crystalline surfaces of Au vicinal to the (111) plane were modified with Pt and studied using scanning tunneling microscopy (STM) and X-ray photoemission spectroscopy (XPS) in an ultra-high vacuum environment. The vicinal surfaces studied are Au(332) and Au(887), and different Pt coverages (θPt) were deposited on each surface. From STM images we determine that Pt deposits on both surfaces as nanoislands with heights ranging from 1 ML to 3 ML depending on θPt. On both surfaces the early growth of Pt ad-islands occurs at the lower part of the step edge, with Pt ad-atoms being incorporated into the steps in some cases. XPS results indicate that partial alloying of Pt occurs at the interface at room temperature and at all coverages, as suggested by the negative chemical shift of the Pt 4f core line, indicating an upward shift of the d-band center of the alloyed Pt. XPS also detects a segregated Pt phase, especially at higher coverages. Sample annealing indicates that the temperature rise promotes further incorporation of Pt atoms into the Au substrate, as supported by STM and XPS results. Additionally, the catalytic activity of different PtAu systems reported in the literature for some electrochemical reactions is discussed in light of our findings.
Abstract:
Resource specialisation, although a fundamental component of ecological theory, is employed in disparate ways. Most definitions derive from simple counts of resource species. We build on recent advances in ecophylogenetics and null model analysis to propose a concept of specialisation that comprises affinities among resources as well as their co-occurrence with consumers. In the distance-based specialisation index (DSI), specialisation is measured as relatedness (phylogenetic or otherwise) of resources, scaled by the null expectation of random use of locally available resources. Thus, specialists use significantly clustered sets of resources, whereas generalists use over-dispersed resources. Intermediate species are classed as indiscriminate consumers. The effectiveness of this approach was assessed with differentially restricted null models, applied to a data set of 168 herbivorous insect species and their hosts. Incorporation of plant relatedness and relative abundance greatly improved specialisation measures compared to taxon counts or simpler null models, which overestimate the fraction of specialists, a problem compounded by insufficient sampling effort. This framework disambiguates the concept of specialisation with an explicit measure applicable to any mode of affinity among resource classes, and is also linked to ecological and evolutionary processes. This will enable a more rigorous deployment of ecological specialisation in empirical and theoretical studies.
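A conceptual sketch of such an index (not the authors' implementation) compares the observed mean pairwise distance among the resources a consumer uses with a null distribution obtained by resampling resources in proportion to their local abundance; the standardized difference then separates clustered (specialist), over-dispersed (generalist), and indiscriminate use. The distance matrix and abundances below are invented.

    # DSI-like calculation: observed mean pairwise distance (MPD) of used hosts
    # versus an abundance-weighted null model (all numbers are invented).
    import numpy as np

    rng = np.random.default_rng(0)

    hosts = ["plantA", "plantB", "plantC", "plantD"]
    abund = np.array([0.4, 0.3, 0.2, 0.1])            # local relative abundances
    D = np.array([[0, 2, 8, 9],                       # pairwise host distances
                  [2, 0, 8, 9],
                  [8, 8, 0, 3],
                  [9, 9, 3, 0]], dtype=float)

    def mpd(idx):
        sub = D[np.ix_(idx, idx)]
        return sub[np.triu_indices(len(idx), k=1)].mean()

    used = [0, 1]                                     # hosts used by one herbivore
    obs = mpd(used)
    null = np.array([mpd(rng.choice(len(hosts), size=len(used),
                                    replace=False, p=abund))
                     for _ in range(999)])
    ses = (obs - null.mean()) / null.std()            # standardized effect size
    # Strongly negative: clustered resources (specialist); strongly positive:
    # over-dispersed (generalist); near zero: indiscriminate consumer.
    print("observed MPD = %.2f, SES = %.2f" % (obs, ses))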
Abstract:
The overall prevalence of infertility has been estimated at 3.5-16.7% in developing countries and 6.9-9.3% in developed countries. Furthermore, according to reports from some regions of sub-Saharan Africa, the prevalence rate is 30-40%. The consequences of infertility and how it affects the lives of women in resource-poor settings, particularly in developing countries, have become an important issue in reproductive health. In some societies, the inability to fulfill the desire to have children makes life difficult for the infertile couple. In many regions, infertility is considered a tragedy that affects not only the infertile couple or woman, but the entire family. This is a position paper encompassing a review of the needs of low-income infertile couples, mainly those living in developing countries, regarding access to infertility care, including ART and initiatives to provide ART at low or affordable cost. Information was gathered from the MEDLINE, CENTRAL, POPLINE, EMBASE, LILACS, and ICTRP databases using the key words: infertility, low income, assisted reproductive technologies, affordable cost, low cost. There are few initiatives geared toward implementing ART procedures at low, or at least affordable, cost in low-income populations. Nevertheless, recent studies point to possibilities for new low-cost initiatives that can help millions of couples fulfill their desire to have a biological child. It is necessary for healthcare professionals and policymakers to take these new initiatives into account in order to implement ART in resource-constrained settings.
Abstract:
INTRODUCTION: Open access publishing is becoming increasingly popular within the biomedical sciences. SciELO, the Scientific Electronic Library Online, is a digital library covering a selected collection of Brazilian scientific journals, many of which provide open access to full-text articles. This library includes a number of dental journals, some of which may include reports of clinical trials in English, Portuguese and/or Spanish. Thus, SciELO could play an important role as a source of evidence for dental healthcare interventions, especially if it yields a sizeable number of high-quality reports. OBJECTIVE: The aim of this study was to identify reports of clinical trials by handsearching dental journals that are accessible through SciELO, and to assess the overall quality of these reports. MATERIAL AND METHODS: Electronic versions of six Brazilian dental journals indexed in SciELO were handsearched at www.scielo.br in September 2008. Reports of clinical trials were identified and classified as controlled clinical trials (CCTs - prospective, experimental studies comparing 2 or more healthcare interventions in human beings) or randomized controlled trials (RCTs - a random allocation method is clearly reported), according to Cochrane eligibility criteria. Criteria to assess methodological quality included: method of randomization, concealment of treatment allocation, blinded outcome assessment, handling of withdrawals and losses, and whether an intention-to-treat analysis had been carried out. RESULTS: The search retrieved 33 CCTs and 43 RCTs. A majority of the reports provided no description of either the method of randomization (75.3%) or concealment of the allocation sequence (84.2%). Participants and outcome assessors were reported as blinded in only 31.2% of the reports. Withdrawals and losses were clearly described in only 6.5% of the reports, and none mentioned an intention-to-treat analysis or any similar procedure. CONCLUSIONS: The results of this study indicate that a substantial number of reports of trials and systematic reviews are available in the dental journals listed in SciELO, and that these could provide valuable evidence for clinical decision making. However, the quality of a number of these reports is of some concern, and improvement in the conduct and reporting of these trials could be achieved if authors adhered to internationally accepted guidelines, e.g. the CONSORT statement.
Abstract:
The present study compared the accuracy of three electronic apex locators (EALs) - Elements Diagnostic®, Root ZX® and Apex DSP® - in the presence of different irrigating solutions (0.9% saline solution and 1% sodium hypochlorite). The electronic measurements were carried out by three examiners using twenty extracted human permanent maxillary central incisors. A size 10 K-file was introduced into the root canals until reaching the 0.0 mark and was subsequently retracted to the 1.0 mark. The gold standard (GS) measurement was obtained by combining visual and radiographic methods and was set 1 mm short of the apical foramen. Electronic length values within ± 0.5 mm of the GS were considered accurate measurements. Intraclass correlation coefficients (ICCs) were used to verify inter-examiner agreement. The comparison among the EALs was performed using the McNemar and Kruskal-Wallis tests (p < 0.05). The ICCs were generally high, ranging from 0.8859 to 0.9657. Similar percentages of electronic measurements within ± 0.5 mm of the GS were obtained with the Elements Diagnostic® and Root ZX® EALs (p > 0.05), independent of the irrigating solution used. The measurements taken with these two EALs were more accurate than those taken with the Apex DSP®, regardless of the irrigating solution used (p < 0.05). It was concluded that the Elements Diagnostic® and Root ZX® apex locators are able to locate the cementum-dentine junction more precisely than the Apex DSP®, and that the presence of irrigating solutions does not interfere with the performance of the EALs.
Abstract:
The n→π* absorption transition of formaldehyde in water is analyzed using combined and sequential classical Monte Carlo (MC) simulations and quantum mechanics (QM) calculations. MC simulations generate the liquid solute-solvent structures for subsequent QM calculations. Using time-dependent density functional theory with a localized set of Gaussian basis functions (TD-DFT/6-311++G(d,p)), calculations are made on statistically relevant configurations to obtain the average solvatochromic shift. All results presented here use the electrostatic embedding of the solvent. The statistically converged average shift of 2300 cm^-1 is compared with previously available theoretical results. The effective dipole moment of the hydrogen-bonded shell is analyzed, along with how it could be held responsible for the polarization of the solvent molecules in the outer solvation shells.
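To show what "statistically converged" means in this sequential MC/QM procedure, the sketch below tracks the cumulative average of per-configuration shifts; the values are randomly generated placeholders, not the TD-DFT results of the paper.

    # Convergence check for an averaged solvatochromic shift (synthetic values).
    import numpy as np

    rng = np.random.default_rng(1)
    shifts = rng.normal(loc=2300.0, scale=400.0, size=100)   # cm^-1, hypothetical

    cum_avg = np.cumsum(shifts) / np.arange(1, len(shifts) + 1)
    stderr = shifts.std(ddof=1) / np.sqrt(len(shifts))

    print("running average after 25/50/100 configurations: "
          "%.0f / %.0f / %.0f cm^-1" % (cum_avg[24], cum_avg[49], cum_avg[99]))
    print("final estimate: %.0f +/- %.0f cm^-1" % (cum_avg[-1], stderr))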
Abstract:
A new tetraruthenated copper(II)-tetra(3,4-pyridyl)porphyrazine species, [CuTRPyPz]4+, has been synthesized and fully characterized by means of analytical, spectroscopic and electrochemical techniques. This π-conjugated system contrasts with the related meso-tetrapyridylporphyrins by exhibiting strong electronic interaction between the coordinated peripheral complexes and the central ring. Based on favorable π-stacking and electrostatic interactions, layer-by-layer assembled films were successfully generated from the appropriate combination of [CuTRPyPz]4+ with copper(II)-tetrasulfonated phthalocyanine, [CuTSPc]4-. Their conducting and electrocatalytic properties were investigated by means of impedance spectroscopy and rotating disc voltammetry, exhibiting metallic behavior near the Ru(III/II) redox potential, as well as enhanced catalytic activity for the oxidation of nitrite and sulphite ions.
Abstract:
Objective: Although some scientific information on electronic body protectors in taekwondo is available, no research has been done to assess the impact of kicks in a competitive situation. The purpose of this study, then, was to assess the energy absorbed by these protectors from kicks performed in an actual taekwondo competition. Methods: Subjects consisted of junior (14-17 years) and senior (≥ 18 years) male taekwondo-in who participated in an open tournament. Data on the energy imparted by valid kicks, in joules (J), were collected from a public visual electronic monitor. Results: Energy was higher for the seniors: 264.31 ± 56.63 J versus 224.38 ± 48.23 J for the juniors (η² = 0.121). The seniors scored lower in percent impact, but the effect was trivial: 123.46 ± 24.77% versus 136.70 ± 26.33% (η² = 0.087). Conclusions: The difference between senior and junior taekwondo-in in absolute energy generated was small, while the difference in relative energy impact was trivial, in favour of the junior taekwondo athletes.