5 results for: call data, paradata, CATI, calling time, call scheduler, random assignment

at DigitalCommons@University of Nebraska - Lincoln


Relevance:

100.00%

Publisher:

Abstract:

Hundreds of terabytes of CMS (Compact Muon Solenoid) data are accumulated for storage each day at the University of Nebraska-Lincoln, one of the eight US CMS Tier-2 sites. Managing this data involves retaining useful CMS data sets and clearing storage space for newly arriving data by deleting less useful data sets. This important task is currently done manually and requires a large amount of time. The overall objective of this study was to develop a methodology to help identify the data sets to delete when storage space is required. CMS data is stored using HDFS (Hadoop Distributed File System), whose logs record file access operations. Hadoop MapReduce was used to feed the information in these logs to Support Vector Machines (SVMs), a machine learning algorithm applicable to classification and regression, which is used in this thesis to develop a classifier. The time taken to classify data sets with this method depends on the size of the input HDFS log file, since the Hadoop MapReduce algorithms used here run in O(n) time. The SVM methodology produces a list of data sets for deletion along with their respective sizes. This methodology was also compared with a heuristic called Retention Cost, computed from the size of a data set and the time since its last access, to help decide how useful the data set is. The accuracies of both were compared by calculating the percentage of data sets predicted for deletion that were accessed at a later time. Our methodology using SVMs proved to be more accurate than the Retention Cost heuristic, and could be used to solve similar problems involving other large data sets.
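The Retention Cost heuristic that the abstract uses as a baseline can be sketched in a few lines. The abstract only names the two inputs (data set size and time since last access), so the product form below, the field names, and the sample numbers are illustrative assumptions, not the thesis's exact formula:

```python
from datetime import datetime, timedelta

def retention_cost(size_gb, last_access, now):
    """Heuristic score: larger, longer-idle data sets score higher and
    become deletion candidates first. The product form is an assumption;
    the abstract only names the two inputs."""
    idle_days = (now - last_access).days
    return size_gb * idle_days

def deletion_candidates(datasets, now, space_needed_gb):
    """Rank data sets by retention cost (descending) and pick enough of
    them to free at least the requested amount of space."""
    ranked = sorted(
        datasets,
        key=lambda d: retention_cost(d["size_gb"], d["last_access"], now),
        reverse=True,
    )
    chosen, freed = [], 0.0
    for d in ranked:
        if freed >= space_needed_gb:
            break
        chosen.append(d["name"])
        freed += d["size_gb"]
    return chosen, freed

# Hypothetical data sets: name, size, and last-access timestamp.
now = datetime(2013, 6, 1)
datasets = [
    {"name": "A", "size_gb": 500.0, "last_access": now - timedelta(days=200)},
    {"name": "B", "size_gb": 120.0, "last_access": now - timedelta(days=5)},
    {"name": "C", "size_gb": 300.0, "last_access": now - timedelta(days=90)},
]
names, freed = deletion_candidates(datasets, now, space_needed_gb=600.0)
print(names, freed)  # → ['A', 'C'] 800.0
```

The accuracy measure described in the abstract would then count how many of the chosen data sets were accessed again afterward, which is exactly where the SVM classifier is reported to beat this heuristic.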

Relevance:

40.00%

Publisher:

Abstract:

Consider a wavelength-routed optical network in which nodes, i.e., multiwavelength cross-connect switches (XCSs), are connected by fiber to form an arbitrary physical topology. A new call is admitted into the network only if an all-optical lightpath can be established between the call's source and destination nodes. Wavelength converters are assumed absent in this work.
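With no wavelength converters, a lightpath must occupy the same wavelength on every fiber link of its route (the wavelength-continuity constraint). A minimal sketch of that admission check, assuming a hypothetical per-link table of free wavelengths and a first-fit assignment policy (the abstract does not specify the assignment rule):

```python
def admissible_wavelengths(route_links, free):
    """Wavelengths free on every link of the route. Without converters,
    only these can carry an end-to-end all-optical lightpath."""
    sets = [free[link] for link in route_links]
    return set.intersection(*sets) if sets else set()

def admit_call(route_links, free):
    """Admit the call iff some wavelength is free end to end; take the
    lowest-indexed one (first-fit, an assumed policy) and mark it busy
    on every link. Returns the wavelength, or None if blocked."""
    common = admissible_wavelengths(route_links, free)
    if not common:
        return None  # call blocked
    w = min(common)
    for link in route_links:
        free[link].discard(w)
    return w

# Hypothetical 4-wavelength system: free wavelengths per fiber link.
free = {("s", "x"): {0, 1, 3}, ("x", "d"): {1, 2, 3}}
route = [("s", "x"), ("x", "d")]
w = admit_call(route, free)
print(w)  # intersection is {1, 3}; first-fit picks 1
```

Note that wavelength 0 is free on the first link and 2 on the second, yet neither can carry the call: continuity, not per-link availability, decides admission.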

Relevance:

40.00%

Publisher:

Abstract:

The Wildlife Master (WM) Program in Colorado was modeled after the highly successful Master Gardener volunteer program. In 10 highly populated suburban counties with large rural areas surrounding the Denver Metro Area, Colorado State University (CSU) Cooperative Extension natural resources agents train, supervise, and manage these volunteers in the identification, referral, and resolution of wildlife damage issues. High-quality, research-based training is provided by university faculty and other professionals in public health, animal damage control, wildlife management, and animal behavior. Inquiries are handled mainly by telephone: calls from concerned residents are forwarded to WMs, who provide general information about human-wildlife conflicts and possible ways to resolve complaints. Each volunteer serves a minimum of 14 days on phone duty annually, calling in from a remote location to a voice mail system from which phone messages can be conveniently retrieved. Response time per call is generally less than 24 hours. During 2004, more than 2,000 phone calls, e-mail messages, and walk-in requests for assistance were fielded by 100 Cooperative Extension WMs. Calls fielded by volunteers in one county increased five-fold during the past five years, from 100 calls to over 500 calls annually. Valued at approximately $18.00 per volunteer hour, the leveraged value of each WM was about $450 in 2005, based on 25 hours of service and training. The estimated value of the program to Colorado in 2004 was over $45,000 of in-kind service, or about one full-time-equivalent faculty member. This paper describes the components of Colorado's WM Program, with guides to the set-up of similar programs in other states.

Relevance:

40.00%

Publisher:

Abstract:

One problem with the component-based software development approach is that once software modules are reused over generations of products, they form legacy structures that can be challenging to understand, making validation of these systems difficult. Tools and methodologies that enable engineers to see the interactions of these software modules will therefore enhance their ability to make such systems more dependable. To address this need, we propose SimSight, a framework that captures dynamic call graphs in Simics, a widely adopted commercial full-system simulator. Simics simulates complete computer systems: it performs nearly the same tasks as a real system, but at much lower speed and with far greater execution observability. We have implemented SimSight to generate dynamic call graphs of statically and dynamically linked functions in an x86/Linux environment. A case study illustrates how SimSight can be used to identify sources of software errors. We then evaluate its performance using 12 integer programs from the SPEC CPU2006 benchmark suite.
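The core data structure here, a dynamic call graph, can be sketched independently of Simics. Assuming the simulator hook emits a trace of function enter/exit events (the event names and trace below are illustrative, not SimSight's actual interface), a call graph with invocation counts falls out of a simple stack walk:

```python
from collections import defaultdict

def build_call_graph(events):
    """Build a dynamic call graph from a trace of ('enter', fn) /
    ('exit', fn) events, the kind a full-system simulator hook could
    emit. Edges are (caller, callee) pairs with invocation counts."""
    stack = []                 # current call stack of function names
    edges = defaultdict(int)   # (caller, callee) -> number of calls
    for kind, fn in events:
        if kind == "enter":
            if stack:
                edges[(stack[-1], fn)] += 1
            stack.append(fn)
        elif kind == "exit":
            assert stack and stack[-1] == fn, "unbalanced trace"
            stack.pop()
    return dict(edges)

# Hypothetical trace: main calls parse, then run, which calls parse again.
trace = [
    ("enter", "main"), ("enter", "parse"), ("exit", "parse"),
    ("enter", "run"), ("enter", "parse"), ("exit", "parse"),
    ("exit", "run"), ("exit", "main"),
]
g = build_call_graph(trace)
print(g)  # → {('main', 'parse'): 1, ('main', 'run'): 1, ('run', 'parse'): 1}
```

Because every edge records who actually called whom at run time, such a graph exposes the cross-module interactions in legacy code that static inspection alone can miss.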

Relevance:

40.00%

Publisher:

Abstract:

In the first paper presented to you today by Dr. Spencer, an expert in the field of animal biology and an official authority at the same time, you heard about the requirements imposed on a chemical before it will ever be accepted, past the different official hurdles, as a proven tool in wildlife management. Many characteristics have to be known and highly sophisticated tests have to be run. In many instances the governmental agency maintains its own screening, testing, or analytical programs according to standard procedures. It would be impossible, however, for reasons of economy and time, to work out all the necessary data themselves. These agencies therefore depend largely on the information furnished by the individual industry, which naturally has to be established as conscientiously as possible. This, among other things, Dr. Spencer has made very clear; and this is also what causes quite a few headaches for the individual industry. But I am certainly not speaking only for myself in saying that industry fully realizes its important role in developing materials for vertebrate control and the responsibilities lying therein. This type of work - better said, cooperative work with the official institutions - is, however, only one part, and for the most part the smallest part, of the work which industry devotes to the development of compounds for pest control. It actually concerns only those very few compounds which are already known to be effective. But how do we learn about their properties in the first place? How does industry make its selection from the many thousands of compounds synthesized each year? This, by far, creates the biggest problems, at least from the scientific and technical standpoint. Let us rest here for a short while and think about the possible ways of screening and selecting effective compounds. Basically there are two different ways.
One is the empirical way of screening as large a number of compounds as possible, on the supposition that as the number of trials grows, the chances for a "hit" increase too. You can also call this type of approach the statistical or analytical one: the mass screening of new, mostly unknown candidate materials. This type of testing can only be performed by a producer of many new materials, that is, by big industries. It requires a tremendous investment in personnel, time, and equipment, and is based on highly simplified but indicative test methods, the results of which must be reliable and representative for practical purposes. The other extreme is the intellectual way of theorizing effective chemical configurations. Defenders of this method claim to be able, now or later, to predict biological effectiveness on the basis of the chemical structure or certain groups within it. Certain prior experience is necessary, that is, knowledge of the importance of certain molecular requirements; then the detection of new and effective complete molecules is a matter of coordination to be performed by smart people or computers. You can also call this method the synthetic or coordinative method.