889 results for call data, paradata, CATI, calling time, call scheduler, random assignment
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Consider a wavelength-routed optical network in which nodes, i.e., multiwavelength cross-connect switches (XCSs), are connected by fiber to form an arbitrary physical topology. A new call is admitted into the network if an all-optical lightpath can be established between the call's source and destination nodes. Wavelength converters are assumed absent in this work.
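The admission rule above amounts to a wavelength-continuity check: with no converters, a single wavelength must be free on every link of some source-destination path. Below is a minimal sketch of that check under assumed data structures; the path list, per-link wavelength sets, and the name find_lightpath are illustrative, not taken from the paper.

```python
def find_lightpath(paths, free_wavelengths, src, dst):
    """Return (path, wavelength) admitting the call, or None if it is blocked.

    paths: dict mapping (src, dst) -> list of candidate paths, each path a
           list of links (illustrative structure).
    free_wavelengths: dict mapping link -> set of currently free wavelengths.
    """
    for path in paths.get((src, dst), []):
        # Wavelength-continuity constraint: the same wavelength must be free
        # on every link of the path, since no converters are assumed.
        common = set.intersection(*(free_wavelengths[link] for link in path))
        if common:
            return path, min(common)  # e.g., first-fit wavelength assignment
    return None  # no continuous wavelength available: block the call

# Tiny usage example with a single candidate path A-B-C
paths = {("A", "C"): [[("A", "B"), ("B", "C")]]}
free = {("A", "B"): {0, 1}, ("B", "C"): {1, 2}}
print(find_lightpath(paths, free, "A", "C"))  # ([('A', 'B'), ('B', 'C')], 1)
```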
Abstract:
The Wildlife Master (WM) Program in Colorado was modeled after the highly successful Master Gardener volunteer program. In 10 highly populated suburban counties with large rural areas surrounding the Denver Metro Area, Colorado State University (CSU) Cooperative Extension Natural Resources agents train, supervise and manage these volunteers in the identification, referral, and resolution of wildlife damage issues. High quality, research-based training is provided by university faculty and other professionals in public health, animal damage control, wildlife management and animal behavior. Inquiries are responded to mainly via telephone. Calls by concerned residents are forwarded to WMs who provide general information about human-wildlife conflicts and possible ways to resolve complaints. Each volunteer serves a minimum of 14 days on phone duty annually, calling in from a remote location to a voice mail system from which phone messages can be conveniently retrieved. Response time per call is generally less than 24 hours. During 2004, more than 2,000 phone calls, e-mail messages and walk-in requests for assistance were fielded by 100 cooperative extension WMs. Calls fielded by volunteers in one county increased five-fold during the past five years, from 100 calls to over 500 calls annually. Valued at the rate of approximately $18.00 per volunteer hour, the leveraged value of each WM was about $450 in 2005, based on 25 hours of service and training. The estimated value of the program to Colorado in 2004 was over $45,000 of in-kind service, or about one full-time equivalent faculty member. This paper describes components of Colorado’s WM Program, with guides to the set-up of similar programs in other states.
Abstract:
One problem with using a component-based software development approach is that once software modules are reused over generations of products, they form legacy structures that can be challenging to understand, making these systems difficult to validate. Therefore, tools and methodologies that enable engineers to see the interactions of these software modules will enhance their ability to make these software systems more dependable. To address this need, we propose SimSight, a framework to capture dynamic call graphs in Simics, a widely adopted commercial full-system simulator. Simics is a software system that simulates complete computer systems; thus, it performs nearly identical tasks to a real system but at a much lower speed while providing greater execution observability. We have implemented SimSight to generate dynamic call graphs of statically and dynamically linked functions in an x86/Linux environment. A case study illustrates how we can use SimSight to identify sources of software errors. We then evaluate its performance using 12 integer programs from the SPEC CPU2006 benchmark suite.
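The abstract does not detail SimSight's internals; as a rough illustration of what capturing a dynamic call graph involves, the sketch below accumulates caller-callee edge counts from a stream of observed call events. The event format and function names are assumptions, not the tool's actual interface.

```python
from collections import defaultdict

def build_dynamic_call_graph(call_events):
    """Accumulate a dynamic call graph from (caller, callee) events.

    call_events: iterable of (caller, callee) pairs, e.g. as they might be
    observed by instrumenting call instructions in a full-system simulator.
    Returns a dict: caller -> {callee: number of observed calls}.
    """
    graph = defaultdict(lambda: defaultdict(int))
    for caller, callee in call_events:
        graph[caller][callee] += 1
    return {caller: dict(edges) for caller, edges in graph.items()}

# Hypothetical trace: main calls parse twice, parse calls strtol once
events = [("main", "parse"), ("parse", "strtol"), ("main", "parse")]
print(build_dynamic_call_graph(events))
# {'main': {'parse': 2}, 'parse': {'strtol': 1}}
```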
Abstract:
In the first paper presented to you today by Dr. Spencer, an expert in the field of Animal Biology and at the same time an official authority, you heard about the requirements imposed on a chemical in order to pass the different official hurdles before it will ever be accepted as a proven tool in wildlife management. Many characteristics have to be known and highly sophisticated tests have to be run. In many instances the governmental agency maintains its own screening, testing or analytical programs according to standard procedures. It would be impossible, however, for economic and time reasons, to work out all the necessary data themselves. They therefore depend largely on the information furnished by the individual industry, which naturally has to be established as conscientiously as possible. This, among other things, Dr. Spencer has made very clear; and this is also what causes quite a few headaches for the individual industry, but I am certainly not speaking only for myself in saying that Industry fully realizes its important role in developing materials for vertebrate control and the responsibilities that lie in this. This type of work - better to say, cooperative work with the official institutions - is, however, only one part, and for the most part the smallest part, of the work which Industry devotes to the development of compounds for pest control. It actually refers only to those very few compounds which are known to be effective. But how does one get to know about their properties in the first place? How does Industry make the selection from the many thousands of compounds synthesized each year? This, by far, creates the biggest problems, at least from the scientific and technical standpoint. Let us rest here for a short while and think about the possible ways of screening and selecting effective compounds. Basically there are two different ways. One is the empirical way of screening as large a number of compounds as possible, under the supposition that with the number of trials the chances for a "hit" increase, too. You can also call this type of approach the statistical or analytical one: the mass screening of new, mostly unknown candidate materials. This type of testing can only be performed by a producer of many new materials, that is, by big industries. It requires a tremendous investment in personnel, time and equipment, and is based on highly simplified but indicative test methods, the results of which have to be reliable and representative for practical purposes. The other extreme is the intellectual way of theorizing about effective chemical configurations. Defenders of this method claim to be able, now or eventually, to predict biological effectiveness on the basis of the chemical structure or of certain groups within it. Some prior experience is necessary, that is, knowledge of the importance of certain molecular requirements; the detection of new and effective complete molecules is then a matter of coordination to be performed by smart people or computers. You can also call this method the synthetic or coordinative method.
Abstract:
Background: To optimize patient functioning, rehabilitation professionals often rely on measurements of functioning as well as on classifications. Although the International Classification of Diseases (ICD) and the International Classification of Functioning, Disability and Health (ICF) are used, their joint use has yet to become an established practice. To encourage their joint use in daily practice, the World Health Organization (WHO) has invited all rehabilitation practitioners worldwide to support the ICD-11 revision process by identifying the ICF categories that correspond to specific rehabilitation-relevant health conditions. The first step in completing this task, generating the list of these health conditions, was taken at a February 2012 workshop in Sao Paulo, Brazil. Objectives: The objectives of this paper are to present the results of the Sao Paulo workshop and to invite practitioners to participate in the ICD-ICF joint use initiative. Discussion: Alternating plenary and small working group sessions were held, and 103 rehabilitation-relevant health conditions were identified. With this list available, WHO, together with the International Society of Physical and Rehabilitation Medicine (ISPRM), is reaching out to clinicians of all rehabilitation disciplines to take on the challenge of identifying the ICF categories for at least one of the health conditions listed.
Abstract:
A new species of Pseudopaludicola is described from the Cerrado of southeastern Brazil. The new taxon is distinguished from the P. pusilla species group by the absence of both T-shaped terminal phalanges and expanded toe tips, and is promptly distinguished from all 13 recognized taxa currently assigned to Pseudopaludicola by possessing long (117-187 ms), non-pulsed advertisement calls emitted in isolation rather than in regular call series.
Abstract:
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves were then calculated, consisting of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff(max)) at q not equal to 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiff(max) values were capable of distinguishing the HRV groups (p-values of 5.10 x 10^-3, 1.11 x 10^-7, and 5.50 x 10^-7 for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistently with the concept of physiologic complexity, and suggest a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
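For background, the sketch below shows conventional sample entropy, which the q-parameterized qSampEn above generalizes. The m and r defaults are illustrative choices; the Tsallis-generalized logarithm and the surrogate-series step described in the abstract are not reproduced here.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Conventional SampEn(m, r) of a 1-D series x.

    Counts pairs of templates of length m (B) and m+1 (A) whose Chebyshev
    distance stays below r times the series standard deviation, and returns
    -ln(A / B). The q-generalized variant mentioned above replaces the
    natural logarithm with its Tsallis counterpart (not shown here).
    """
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist < tol)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A regular signal typically yields lower SampEn than uncorrelated noise
rng = np.random.default_rng(0)
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))
print(sample_entropy(rng.normal(size=500)))
```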
Abstract:
This article is the first part of an ongoing ergonomic work analysis of the emergency services call center operated by the Fire Department of the Military Police of Sao Paulo. The final objective of the research is to identify the prescribed task, the real work executed, and the strategies used by workers to meet the demands of the job. Starting from the identification of the tasks and activities involved, this article analyzes the work of the emergency services call center, which is of vital importance to the organizational structure, since it is the starting point of the process that results in fulfilling the corporation's mission.
Abstract:
We describe a new species of the Bokermannohyla circumdata group from the Estacao de Pesquisa e Desenvolvimento Ambiental Galheiro (EPDA-Galheiro) (19°12'S, 47°08'W), Municipality of Perdizes, State of Minas Gerais, a mid-altitude (approximately 850 m above sea level) riparian forest environment in the Cerrado of southeastern Brazil. Bokermannohyla napolii sp. nov. is allied to the large-sized species of the group and is diagnosed on the basis of adult morphology/morphometrics and, mainly, vocalizations. Adult specimens of the new species are most closely related to those of B. luctuosa and B. circumdata, but can be differentiated from the former by having the distal subarticular tubercle of finger III bifid/divided in males and that of finger IV bifid/divided in males and females, and from both B. luctuosa and B. circumdata by a distinctive advertisement call structure. We also provide bioacoustic data on seven other species of the genus, including the previously unknown advertisement calls of B. circumdata and B. carvalhoi, and re-descriptions of the advertisement calls of B. luctuosa, B. ibitiguara, B. nanuzae, B. sazimai, and B. hylax.
Abstract:
Background: A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyze gene regulatory interactions using the Boolean network model and time-series data. The Boolean network considered here is restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full search. The problem is modeled as a Constraint Satisfaction Problem (CSP), and CSP techniques are used to solve it. Results: We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed on HeLa cells. The results show that some interactions can be fully or, at least, partially determined under the Boolean model considered. Conclusions: The proposed algorithm can be used as a first step in the detection of gene/protein interactions. It is able to infer gene relationships from time-series data of gene expression, and this inference process can be aided by available a priori knowledge.
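The paper's restricted function class and CSP encoding are not given in the abstract; the sketch below conveys the underlying consistency idea with a brute-force search over small regulator sets. Function and variable names are illustrative only.

```python
from itertools import combinations

def consistent_regulators(series, target, max_inputs=2):
    """Enumerate candidate regulator sets for one target gene.

    series: list of binary state vectors (time-ordered observations).
    target: index of the gene whose Boolean update rule is sought.
    Returns the regulator sets for which SOME Boolean function reproduces
    every observed transition (a brute-force stand-in for the CSP search;
    the paper's restricted function class is not modeled here).
    """
    n = len(series[0])
    transitions = list(zip(series[:-1], series[1:]))
    consistent = []
    for k in range(1, max_inputs + 1):
        for regs in combinations(range(n), k):
            # A Boolean function exists iff equal regulator patterns never
            # map to different next-states of the target gene.
            table, ok = {}, True
            for prev, nxt in transitions:
                key = tuple(prev[r] for r in regs)
                if table.setdefault(key, nxt[target]) != nxt[target]:
                    ok = False
                    break
            if ok:
                consistent.append(regs)
    return consistent

# Toy series over 3 genes: gene 2 follows gene 0 with one step of delay.
# Several regulator sets remain consistent, illustrating that interactions
# may be only partially determined from short time series.
series = [(1, 0, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1)]
print(consistent_regulators(series, target=2))
```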
Abstract:
Background: A common approach for time series gene expression data analysis includes the clustering of genes with similar expression patterns throughout time. Clustered gene expression profiles point to the joint contribution of groups of genes to a particular cellular process. However, since genes belong to intricate networks, other features, besides comparable expression patterns, should provide additional information for the identification of functionally similar genes. Results: In this study we perform gene clustering through the identification of Granger causality between and within sets of time series gene expression data. Granger causality is based on the idea that the cause of an event cannot come after its consequence. Conclusions: This kind of analysis can be used as a complementary approach for functional clustering, wherein genes are clustered not solely on the basis of their expression similarity but also on their topological proximity, defined according to the intensity of Granger causality among them.
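As a rough illustration of the pairwise building block implied above (not the paper's actual estimator), the sketch below scores whether the past of one expression series improves a least-squares prediction of another; a proper analysis would use lag selection and an F-test over multiple lags.

```python
import numpy as np

def granger_improvement(x, y, lag=1):
    """Crude Granger-style score: does the past of x help predict y?

    Fits y[t] on its own past (restricted model) and on its own past plus
    the past of x (full model), both by least squares, and returns the
    relative reduction in residual sum of squares. A positive value
    suggests x "Granger-causes" y.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    Y = y[lag:]
    past_y = np.column_stack([y[lag - k - 1:len(y) - k - 1] for k in range(lag)])
    past_x = np.column_stack([x[lag - k - 1:len(x) - k - 1] for k in range(lag)])
    ones = np.ones((len(Y), 1))

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
        resid = Y - design @ beta
        return float(resid @ resid)

    rss_restricted = rss(np.hstack([ones, past_y]))
    rss_full = rss(np.hstack([ones, past_y, past_x]))
    return (rss_restricted - rss_full) / rss_restricted

# Toy example: y lags x by one step, so x should "cause" y but not vice versa
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = np.roll(x, 1) + 0.1 * rng.normal(size=200)
print(granger_improvement(x, y), granger_improvement(y, x))
```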
Abstract:
Introduction and Objectives: With population ageing, there is a growing number of people who have several comorbidities and use a variety of drugs. These factors lead to a greater predisposition to adverse drug events, as well as to medication errors. The clinical pharmacist is the health professional best placed to address these issues. The aims of this study were to analyze the profile of medication reconciliation and to assess the role of the clinical pharmacist in medication adherence. Material and Methods: Prospective observational cohort study conducted from January to March 2013 at the Surgical Clinic of the University Hospital of the University of Sao Paulo. 117 admitted patients - over the age of 18 years, under continuous medication use, and with length of hospitalization up to 120 h - were included. Discrepancies were classified as intentional/unintentional and according to their risk of causing harm, and interventions were divided into accepted/not accepted. Medication adherence was measured by the Morisky questionnaire. Results and Conclusions: Only 30% of hospital prescriptions showed no discrepancies between the medications the patient was using at home and those prescribed at the hospital, and more than one third of these had the potential to cause moderate discomfort or clinical deterioration. One third of the total discrepancies were classified as unintentional. About 90% of the interventions were accepted by the medical staff. In addition, about 63% of patients had poor adherence to drug therapy. The study revealed the importance of medication reconciliation at patient admission, ensuring greater safety and therapeutic efficacy during hospitalization, and of counseling the patient at discharge to assure the safety of therapy.
Abstract:
This work proposes a system for the classification of industrial steel pieces by means of a magnetic nondestructive testing device. The proposed classification system has two main stages: an online stage and an off-line optimization stage. In the online stage, the system classifies inputs and saves misclassification information for posterior analysis. In the off-line optimization stage, the topology of a Probabilistic Neural Network is optimized by a feature selection algorithm combined with the Probabilistic Neural Network itself to increase the classification rate. The proposed feature selection algorithm searches the signal spectrogram by combining three basic elements: a Sequential Forward Selection algorithm, a Feature Cluster Grow algorithm with classification-rate gradient analysis, and a Sequential Backward Selection. In addition, a trash-data recycling algorithm is proposed to obtain optimal feedback samples selected from the misclassified ones.
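The abstract names a Sequential Forward Selection step; the sketch below is a generic greedy forward selection over an assumed scoring function (e.g., the classification rate of the Probabilistic Neural Network on the selected spectrogram features), not the paper's exact procedure.

```python
def sequential_forward_selection(score, n_features, max_features):
    """Greedy forward feature selection.

    score: callable taking a tuple of feature indices and returning a
           classification rate for that feature subset (assumed here to be
           supplied by the classifier under optimization).
    Adds, at each step, the feature that most improves the score, and stops
    when no addition helps or max_features is reached.
    """
    selected, best_score = (), float("-inf")
    while len(selected) < max_features:
        candidates = [
            (score(selected + (f,)), selected + (f,))
            for f in range(n_features) if f not in selected
        ]
        step_score, step_set = max(candidates)
        if step_score <= best_score:
            break  # no improvement: stop (a backward pass could follow here)
        best_score, selected = step_score, step_set
    return selected, best_score

# Toy scoring function rewarding features 2 and 5 (purely illustrative)
toy_score = lambda feats: sum(1.0 for f in feats if f in (2, 5)) - 0.1 * len(feats)
print(sequential_forward_selection(toy_score, n_features=8, max_features=4))
```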