983 results for Data compression


Relevance: 20.00%

Abstract:

This work aims to promote integrity in autonomous perceptual systems, with a focus on outdoor unmanned ground vehicles equipped with a camera and a 2D laser range finder. A method to check for inconsistencies between the data provided by these two heterogeneous sensors is proposed and discussed. First, uncertainties in the estimated transformation between the laser and camera frames are evaluated and propagated up to the projection of the laser points onto the image. Then, for each acquired laser scan and camera image pair, the information at corners of the laser scan is compared with the content of the image, resulting in a likelihood of correspondence. The result of this process is then used to validate segments of the laser scan that are found to be consistent with the image, while inconsistent segments are rejected. Experimental results illustrate how this technique can improve the reliability of perception in challenging environmental conditions, such as in the presence of airborne dust.
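
As a rough illustration of the validation step described above, the sketch below projects laser points into the image under assumed pinhole intrinsics/extrinsics and scores each scan corner by the edge strength found within its propagated pixel uncertainty. All function and parameter names are hypothetical, not taken from the paper.

```python
import numpy as np

def project_points(points_laser, R, t, K):
    """Project Nx3 laser points (a 2D scan has z = 0) into the image
    via assumed pinhole extrinsics (R, t) and intrinsics K."""
    cam = (R @ points_laser.T + t.reshape(3, 1)).T   # laser frame -> camera frame
    uv = (K @ cam.T).T                               # camera frame -> pixel homogeneous
    return uv[:, :2] / uv[:, 2:3]                    # perspective division

def corner_likelihood(image_gray, uv, sigma_px):
    """Score each projected scan corner by the strongest image edge inside a
    window sized by its propagated projection uncertainty (sigma_px pixels),
    as a crude stand-in for the paper's likelihood of correspondence."""
    gy, gx = np.gradient(image_gray.astype(float))
    edge = np.hypot(gx, gy)
    scores = []
    for (u, v), s in zip(uv, sigma_px):
        r = max(1, int(np.ceil(2 * s)))              # 2-sigma search window
        u0, v0 = int(round(u)), int(round(v))
        win = edge[max(v0 - r, 0):v0 + r + 1, max(u0 - r, 0):u0 + r + 1]
        scores.append(win.max() if win.size else 0.0)
    return np.asarray(scores)

# Scan segments whose corners score above a threshold would be validated;
# inconsistent segments would be rejected.
```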

Relevance: 20.00%

Abstract:

Server consolidation using virtualization has become an important technique for improving the energy efficiency of data centers, and virtual machine placement is at its core. Many approaches to virtual machine placement have been proposed in recent years, but existing approaches consider only the energy consumed by the physical machines in a data center and ignore the energy consumed by its communication network. That network energy consumption is not trivial, and should therefore be considered in virtual machine placement. In our preliminary research, we proposed a genetic algorithm for a new virtual machine placement problem that accounts for the energy consumption of both the physical machines and the communication network in a data center. Aiming to improve the performance and efficiency of that genetic algorithm, this paper presents a hybrid genetic algorithm for the energy-efficient virtual machine placement problem. Experimental results show that the hybrid genetic algorithm significantly outperforms the original genetic algorithm, and that it is scalable.
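
A minimal sketch of the idea, assuming a simple additive energy model (host power plus a cost for traffic crossing physical machines) and a greedy local-search step as the "hybrid" component; the encoding and operators below are illustrative, not the paper's actual algorithm.

```python
import random

def total_energy(placement, vm_load, pm_idle, pm_busy, traffic):
    """Assumed energy model: each used PM costs idle power plus
    load-proportional power; each VM pair exchanging traffic across
    different PMs adds network energy."""
    pm_util = {}
    for vm, pm in enumerate(placement):
        pm_util[pm] = pm_util.get(pm, 0.0) + vm_load[vm]
    host = sum(pm_idle + pm_busy * u for u in pm_util.values())
    net = sum(t for (a, b), t in traffic.items() if placement[a] != placement[b])
    return host + net

def hybrid_ga(n_vms, n_pms, fitness, pop=30, gens=200):
    """GA over placements (VM index -> PM index), with a greedy
    local-search move after mutation as the 'hybrid' step."""
    population = [[random.randrange(n_pms) for _ in range(n_vms)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[:pop // 2]
        children = []
        for _ in range(pop - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_vms)
            child = a[:cut] + b[cut:]                # one-point crossover
            child[random.randrange(n_vms)] = random.randrange(n_pms)  # mutation
            vm = random.randrange(n_vms)             # local search: best PM for one VM
            child[vm] = min(range(n_pms),
                            key=lambda p: fitness(child[:vm] + [p] + child[vm + 1:]))
            children.append(child)
        population = parents + children
    return min(population, key=fitness)
```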

Relevance: 20.00%

Abstract:

OBJECTIVES: Four randomized phase II/III trials investigated the addition of cetuximab to platinum-based, first-line chemotherapy in patients with advanced non-small cell lung cancer (NSCLC). A meta-analysis was performed to examine the benefit/risk ratio for the addition of cetuximab to chemotherapy. MATERIALS AND METHODS: The meta-analysis included individual patient efficacy data from 2018 patients and individual patient safety data from 1970 patients comprising respectively the combined intention-to-treat and safety populations of the four trials. The effect of adding cetuximab to chemotherapy was measured by hazard ratios (HRs) obtained using a Cox proportional hazards model and odds ratios calculated by logistic regression. Survival rates at 1 year were calculated. All applied models were stratified by trial. Tests on heterogeneity of treatment effects across the trials and sensitivity analyses were performed for all endpoints. RESULTS: The meta-analysis demonstrated that the addition of cetuximab to chemotherapy significantly improved overall survival (HR 0.88, p=0.009, median 10.3 vs 9.4 months), progression-free survival (HR 0.90, p=0.045, median 4.7 vs 4.5 months) and response (odds ratio 1.46, p<0.001, overall response rate 32.2% vs 24.4%) compared with chemotherapy alone. The safety profile of chemotherapy plus cetuximab in the meta-analysis population was confirmed as manageable. Neither trials nor patient subgroups defined by key baseline characteristics showed significant heterogeneity for any endpoint. CONCLUSION: The addition of cetuximab to platinum-based, first-line chemotherapy for advanced NSCLC significantly improved outcome for all efficacy endpoints with an acceptable safety profile, indicating a favorable benefit/risk ratio.
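
The two effect estimators named in the abstract can be sketched with standard tooling. The snippet below assumes a patient-level DataFrame with hypothetical column names and recent versions of the lifelines and statsmodels packages; it is not the meta-analysis code used in the study.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter      # Cox proportional hazards model
import statsmodels.api as sm           # logistic regression

# df is assumed to hold one row per patient with hypothetical columns:
# 'time' (months), 'event' (1 = death), 'response' (1 = tumor response),
# 'cetuximab' (1 = chemo + cetuximab arm), 'trial' (stratification factor).
def pooled_effects(df: pd.DataFrame):
    cph = CoxPHFitter()
    cph.fit(df[['time', 'event', 'cetuximab', 'trial']],
            duration_col='time', event_col='event',
            strata=['trial'])                          # model stratified by trial
    hr = cph.hazard_ratios_['cetuximab']               # pooled hazard ratio

    X = sm.add_constant(df[['cetuximab']])             # odds ratio for response
    orr_or = np.exp(sm.Logit(df['response'], X).fit(disp=0).params['cetuximab'])
    return hr, orr_or
```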

Relevance: 20.00%

Abstract:

Modern health information systems can generate several exabytes of patient data, the so-called "Health Big Data", per year. Many health managers and experts believe that with these data it is possible to easily discover useful knowledge to improve health policies, increase patient safety and eliminate redundancies and unnecessary costs. The objective of this paper is to discuss the characteristics of Health Big Data as well as the challenges and solutions for Health Big Data Analytics (BDA) – the process of extracting knowledge from sets of Health Big Data – and to design and evaluate a pipelined framework for use as a guideline/reference in health BDA.
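
A pipelined BDA framework of the kind described can be sketched as a chain of stages; the stage names and wiring below are illustrative assumptions, not the framework proposed in the paper.

```python
from typing import Callable, Iterable, List

# A stage consumes a stream of records and yields a transformed stream,
# assuming the usual BDA shape: acquire -> clean -> integrate -> analyse.
Stage = Callable[[Iterable], Iterable]

def run_pipeline(records: Iterable, stages: List[Stage]) -> Iterable:
    """Thread patient records through each stage in order."""
    for stage in stages:
        records = stage(records)
    return records

# Trivial placeholder stages, purely to show the wiring.
deidentify = lambda rs: ({k: v for k, v in r.items() if k != 'name'} for r in rs)
drop_empty = lambda rs: (r for r in rs if r)

result = list(run_pipeline([{'name': 'x', 'hr': 72}], [deidentify, drop_empty]))
```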

Relevance: 20.00%

Abstract:

This paper uses innovative content analysis techniques to map how the death of Oscar Pistorius' girlfriend, Reeva Steenkamp, was framed in Twitter conversations. Around 1.5 million posts from a two-week timeframe are analyzed with a combination of syntactic and semantic methods. This analysis is grounded in the frame analysis perspective and differs from sentiment analysis. Instead of looking for explicit evaluations, such as "he is guilty" or "he is innocent", the results show how opinions can be identified through complex articulations of more implicit symbolic devices, such as repeatedly mentioned examples and metaphors. Different frames are adopted by users as more information about the case is revealed: from a more episodic one, heavily used at the very beginning, to more systemic approaches highlighting the association of the event with urban violence, gun control issues, and violence against women. A detailed timeline of the discussions is provided.
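
One crude way to approximate the frame tracking described above is to count tweets matching per-frame keyword lexicons day by day. The lexicons below are invented for illustration and are far simpler than the paper's combination of syntactic and semantic methods.

```python
from collections import Counter

# Hypothetical frame lexicons; the paper's symbolic devices (examples,
# metaphors) are richer and were identified with more than keyword matching.
FRAMES = {
    'episodic':        {'shooting', 'bathroom', 'door'},
    'urban_violence':  {'crime', 'intruder', 'burglar'},
    'gun_control':     {'gun', 'firearm', 'license'},
    'gender_violence': {'domestic', 'femicide', 'abuse'},
}

def frame_counts_by_day(tweets):
    """tweets: iterable of (timestamp, text) pairs, timestamp having .date().
    Returns {day: Counter(frame -> matching tweet count)} for a timeline."""
    timeline = {}
    for ts, text in tweets:
        tokens = set(text.lower().split())
        counts = timeline.setdefault(ts.date(), Counter())
        for frame, lexicon in FRAMES.items():
            if tokens & lexicon:
                counts[frame] += 1
    return timeline
```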

Relevance: 20.00%

Abstract:

After nearly fifteen years of the open access (OA) movement and its hard-fought struggle for a more open scholarly communication system, publishers are realizing that business models can be both open and profitable. Making journal articles available under an OA license is becoming an accepted strategy for maximizing the value of content to both research communities and the businesses that serve them. The first post in this two-part series celebrating Data Innovation Day looks at the role that data innovation is playing in the shift to open access for journal articles.

Relevance: 20.00%

Abstract:

Recent studies have linked the ability of novice (CS1) programmers to read and explain code with their ability to write code. This study extends earlier work by asking CS2 students to explain object-oriented data structures problems that involve recursion. Results show a strong correlation between ability to explain code at an abstract level and performance on code writing and code reading test problems for these object-oriented data structures problems. The authors postulate that there is a common set of skills concerned with reasoning about programs that explains the correlation between writing code and explaining code. The authors suggest that an overly exclusive emphasis on code writing may be detrimental to learning to program. Non-code writing learning activities (e.g., reading and explaining code) are likely to improve student ability to reason about code and, by extension, improve student ability to write code. A judicious mix of code-writing and code-reading activities is recommended.
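
The reported relationship is, in essence, a correlation between per-student scores on code-explaining and code-writing tasks. A minimal sketch with invented scores (not the study's data):

```python
from scipy.stats import pearsonr

# Hypothetical per-student scores: quality of "explain in plain English"
# answers (0-5) and marks on code-writing problems (0-100).
explain_scores = [3, 4, 2, 5, 4, 1, 5, 3]
writing_scores = [55, 70, 40, 90, 75, 30, 85, 60]

r, p = pearsonr(explain_scores, writing_scores)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```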

Relevance: 20.00%

Abstract:

Road networks are a national critical infrastructure. Road assets need to be monitored and maintained efficiently as their condition deteriorates over time. The condition of one such asset, road pavement, plays a major role in road network maintenance programmes. Pavement condition depends upon many factors, such as pavement type, traffic and environmental conditions. This paper presents a data analytics case study for assessing the factors affecting the pavement deflection values measured by the traffic speed deflectometer (TSD) device. The analytics process includes acquisition and integration of data from multiple sources, data pre-processing, mining useful information from the data, and utilising the data mining outputs for knowledge deployment. Data mining techniques are able to show how TSD outputs vary across different roads and under different traffic and environmental conditions. The generated data mining models map the TSD outputs to classes and define correction factors for each class.
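
The "map outputs to classes, then derive a per-class correction factor" step can be sketched with off-the-shelf clustering; the features, class count and reference deflections below are assumptions, not the study's actual mining models.

```python
import numpy as np
from sklearn.cluster import KMeans

def class_correction_factors(X, tsd, reference, n_classes=4):
    """X: per-record road/traffic/environment feature array;
    tsd: TSD deflection measurements (np.array);
    reference: trusted deflections for the same records (np.array).
    Groups records into classes and derives a multiplicative
    correction factor per class."""
    labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(X)
    factors = {}
    for c in range(n_classes):
        m = labels == c
        factors[c] = np.mean(reference[m]) / np.mean(tsd[m])
    return labels, factors

# Corrected deflection for record i would then be tsd[i] * factors[labels[i]].
```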

Relevance: 20.00%

Abstract:

The quality of the data collection methods selected and the integrity of the data collected are integral to the success of a study. This chapter focuses on data collection and study validity. After reading the chapter, readers should be able to define types of data collection methods in quantitative research; list advantages and disadvantages of each method; discuss factors related to internal and external validity; critically evaluate data collection methods; and discuss the need to operationalise variables of interest for data collection.

Relevance: 20.00%

Abstract:

A spatial process observed over a lattice or a set of irregular regions is usually modeled using a conditionally autoregressive (CAR) model. The neighborhoods within a CAR model are generally formed deterministically using the inter-distances or boundaries between the regions. An extension of the CAR model is proposed in this article in which the selection of the neighborhood depends on unknown parameter(s). This extension is called a Stochastic Neighborhood CAR (SNCAR) model. The resulting model shows flexibility in accurately estimating covariance structures for data generated from a variety of spatial covariance models. Specific examples are illustrated using data generated from some common spatial covariance functions, as well as real data concerning radioactive contamination of the soil in Switzerland after the Chernobyl accident.
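
For readers unfamiliar with CAR models, the sketch below builds the usual proper-CAR precision matrix Q = tau * (D - rho * W) with a neighbourhood W that depends on a range parameter delta. In the SNCAR model such a parameter would be treated as unknown and estimated; here it is fixed purely for illustration.

```python
import numpy as np

def car_precision(coords, delta, tau=1.0, rho=0.95):
    """Precision matrix Q = tau * (D - rho * W) of a proper CAR model whose
    neighbourhood W contains regions within distance delta of each other;
    delta plays the role of the (here fixed) neighbourhood parameter."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    W = ((d > 0) & (d <= delta)).astype(float)   # adjacency within range delta
    D = np.diag(W.sum(axis=1))                   # diagonal of neighbour counts
    return tau * (D - rho * W)

coords = np.random.default_rng(0).uniform(0, 10, size=(50, 2))
Q = car_precision(coords, delta=2.0)             # a different delta, a different model
```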

Relevance: 20.00%

Abstract:

Environmental monitoring is becoming critical as human activity and climate change place greater pressures on biodiversity, leading to an increasing need for data to make informed decisions. Acoustic sensors can help collect data across large areas for extended periods, making them attractive for environmental monitoring. However, managing and analysing large volumes of environmental acoustic data is a great challenge and is consequently hindering the effective utilization of the big datasets collected. This paper presents an overview of our current techniques for collecting, storing and analysing large volumes of acoustic data efficiently, accurately, and cost-effectively.

Relevance: 20.00%

Abstract:

The Australian masonry standard allows either prism tests or correction factors based on block height and mortar thickness to evaluate masonry compressive strength. The correction factor ensures that taller units with conventional 10 mm mortar joints are not disadvantaged by the size effect. In recent times, 2-4 mm thick, high-adhesive mortars and H blocks with only the mid-web shell have come into use in masonry construction. H blocks and thinner, higher-adhesive mortars have renewed interest in the compression behaviour of hollow concrete masonry, which is therefore revisited in this paper. This paper presents an experimental study carried out to examine the effects of the thickness of mortar joints, the type of mortar adhesive and the presence of web shells in hollow concrete masonry prisms under axial compression. A non-contact digital image correlation technique was used to measure the deformation of the prisms and was found adequate for determining the strain field of the loaded face shells subjected to axial compression. It is found that the absence of end web shells lowers the compressive strength and stiffness of the prisms, and that the thinner, higher-adhesive mortars increase the compressive strength and stiffness while lowering the Poisson's ratio. © Institution of Engineers Australia, 2013.

Relevance: 20.00%

Abstract:

This chapter addresses data modelling as a means of promoting statistical literacy in the early grades. Consideration is first given to the importance of increasing young children’s exposure to statistical reasoning experiences and how data modelling can be a rich means of doing so. Selected components of data modelling are then reviewed, followed by a report on some findings from the third-year of a three-year longitudinal study across grades one through three.

Relevance: 20.00%

Abstract:

A variety of sustainable development research efforts and related activities are attempting to reconcile the issues of conserving our natural resources without limiting economic motivation, while also improving our social equity and quality of life. Land use/land cover change, occurring on a global scale, is an aggregate of local land use decisions and profoundly impacts our environment. It is therefore the local decision-making process that should be the eventual target of many of the ongoing data collection and research efforts that strive toward supporting a sustainable future. Satellite imagery is a primary source upon which to build a core data set for use by researchers in analyzing this global change. A process is necessary to link global change research, utilizing satellite imagery, to the local land use decision-making process. One example of this is the NASA-sponsored Regional Data Center (RDC) prototype. The RDC approach is an attempt to integrate science and technology at the community level. The anticipated result of this complex interaction between the research and decision-making communities will be realized in the form of long-term benefits to the public.

Relevance: 20.00%

Abstract:

Historically, it appears that some of the WRCF have survived because i) they lack a sufficient quantity of commercially valuable species; ii) they are located in remote or inaccessible areas; or iii) they have been protected as national parks and sanctuaries. Forests will be protected when the people deciding their fate conclude that conserving them is more beneficial than clearing them, e.g. because conservation generates higher incomes or has cultural or social value. If this is not the case, forests will continue to be cleared and converted. In the future, the WRCF may be protected only by focused attention. Future policy options may include strategies for strong protection measures, the raising of public awareness about the value of forests, and concerted actions for reducing pressure on forest lands by providing alternatives to forest exploitation to meet the growing demand for forest products. Many areas with low population densities offer an opportunity for conservation if appropriate steps are taken now by national governments and the international community. This opportunity must be founded upon increased public and government awareness that forests have vast importance for human welfare and for ecosystem services such as biodiversity, watershed protection, and carbon balance. Also paramount to this opportunity is increased scientific understanding of forest dynamics and the technical capability to install global observation and assessment systems. High-resolution satellite data such as Landsat 7 and other technologically advanced satellite programs will provide unprecedented monitoring options for governing authorities. Technological innovation can contribute to the way forests are protected. The use of satellite imagery for regular monitoring and the Internet for information dissemination provides effective tools for raising worldwide awareness of the significance of forests and the intrinsic value of nature.