819 results for Task-based information access
Abstract:
Landowners and agencies have expressed difficulty finding hunters willing to harvest the female portion of ungulate populations, and likewise, hunters have expressed difficulty gaining access to private lands. Since 2003, the Montana “DoeCowHunt” website (www.doecowhunt.montana.edu) has provided an avenue to improve hunter-landowner contact and wild ungulate population management. A product of the Montana State University Extension Wildlife Program, this website provides a means for hunters and landowners in Montana to contact each other by listing contact information (email address, physical address, and telephone number) for the purpose of harvesting antlerless ungulates. In the first year, over 10,000 users visited the site. Of those who registered, 11 were landowners and 1,334 were hunters. An evaluation survey yielded a 40% response rate and indicated that the average registered landowner had 20 hunter contacts. Many landowners contacted hunters through the website without registering or listing their own contact information on the site.
Abstract:
Observability measures how well a computer system supports accurately capturing, analyzing, and presenting (collectively, observing) internal information about the system. Observability frameworks play important roles in program understanding, troubleshooting, performance diagnosis, and optimization. However, traditional solutions are either expensive or coarse-grained, which compromises their utility for today’s increasingly complex software systems. New solutions are emerging for VM-based languages because language VMs have full control over program execution. Existing solutions of this kind, nonetheless, still lack flexibility, incur high overhead, or provide limited context information for developing powerful dynamic analyses. In this thesis, we present a VM-based infrastructure, called the marker tracing framework (MTF), to address these deficiencies and provide better observability for VM-based languages. MTF serves as a solid foundation for fine-grained, low-overhead program instrumentation. Specifically, MTF allows analysis clients to: 1) define custom events with rich semantics; 2) specify precisely the program locations where the events should trigger; and 3) adaptively enable or disable the instrumentation at runtime. In addition, MTF-based analysis clients are more powerful because they have access to all information available to the VM. To demonstrate the utility and effectiveness of MTF, we present two analysis clients: 1) dynamic typestate analysis with adaptive online program analysis (AOPA); and 2) selective probabilistic calling context analysis (SPCC). In addition, we evaluate the runtime performance of MTF and the typestate client with the DaCapo benchmarks.
The results show that: 1) MTF has acceptable runtime overhead when tracing moderate numbers of marker events; 2) AOPA is highly effective in reducing the event frequency for dynamic typestate analysis; and 3) language VMs can be exploited to offer greater observability.
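The three client capabilities listed above (custom events, precise trigger locations, and runtime enable/disable) can be sketched in miniature. This is a purely illustrative toy in Python, not the actual MTF API; all names (`MarkerTracer`, `define_event`, `hit`, and the read-after-close example) are hypothetical.

```python
# Toy sketch of a marker-event instrumentation interface (illustrative only).
class MarkerTracer:
    def __init__(self):
        self.handlers = {}    # program location -> list of (event name, callback)
        self.enabled = set()  # event names currently enabled

    def define_event(self, name, location, callback):
        """Register a custom event that fires at a given program location."""
        self.handlers.setdefault(location, []).append((name, callback))
        self.enabled.add(name)

    def set_enabled(self, name, on):
        """Adaptively enable or disable instrumentation at runtime."""
        (self.enabled.add if on else self.enabled.discard)(name)

    def hit(self, location, context):
        """Called when execution reaches an instrumented location."""
        for name, cb in self.handlers.get(location, []):
            if name in self.enabled:
                cb(context)

# Usage: a typestate-style check for reading a (hypothetical) closed file.
tracer = MarkerTracer()
violations = []
tracer.define_event("read-after-close", "File.read",
                    lambda ctx: violations.append(ctx) if ctx["closed"] else None)
tracer.hit("File.read", {"closed": True})    # recorded as a violation
tracer.set_enabled("read-after-close", False)
tracer.hit("File.read", {"closed": True})    # ignored: event disabled
```

The last two calls illustrate the adaptive aspect: once the event is disabled, reaching the same location records nothing, which is how an AOPA-style client could shed tracing overhead after it has seen enough events.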
Abstract:
Objective: To analyze drug prescriptions for insulin and oral antidiabetic drugs in type 1 and type 2 diabetes mellitus patients seen in the Brazilian Public Healthcare System (Unified Health System - SUS) in Ribeirao Preto, SP, Brazil. Subjects and methods: All patients with diabetes seen in the SUS in the western district of Ribeirao Preto, SP, Brazil between March 2006 and February 2007 were included in the study. Results: A total of 3,982 patients were identified. Mean age was 60.6 years, and 61.0% were female. Sixty percent of the patients were treated with monotherapy. Doses of oral antidiabetic drugs were lower in monotherapy than in polytherapy. Ten patients received doses of glibenclamide or metformin above the recommended maximums, and doses were not reduced in elderly patients. Conclusion: Monotherapy with oral antidiabetic drugs was the predominant approach, and doses were not individualized according to age. Arq Bras Endocrinol Metab. 2012;56(2):120-7
Abstract:
We extend the random permutation model to obtain the best linear unbiased estimator of a finite population mean accounting for auxiliary variables under simple random sampling without replacement (SRS) or stratified SRS. The proposed method provides a systematic design-based justification for well-known results involving common estimators derived under minimal assumptions that do not require specification of a functional relationship between the response and the auxiliary variables.
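To make concrete the kind of auxiliary-variable adjustment the abstract refers to, here is a minimal sketch of one such well-known estimator, the difference estimator under SRS without replacement. It is an illustration of the general idea only, not the paper's derivation; the function name and toy numbers are made up.

```python
# Difference estimator of a finite-population mean: correct the sample mean
# of y by the observed imbalance in an auxiliary variable x whose population
# mean is known.
def difference_estimator(y_sample, x_sample, x_pop_mean):
    """Estimate the population mean of y using auxiliary variable x."""
    n = len(y_sample)
    y_bar = sum(y_sample) / n
    x_bar = sum(x_sample) / n
    return y_bar + (x_pop_mean - x_bar)  # shift y_bar by the x imbalance

# Toy data: y tracks x closely, so adjusting for x sharpens the estimate.
y = [10.2, 12.1, 9.8]
x = [10.0, 12.0, 10.0]
est = difference_estimator(y, x, x_pop_mean=11.0)
```

Note that no functional relationship between y and x is fitted here, which mirrors the abstract's point that such results can be justified under minimal, design-based assumptions.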
Abstract:
Landfarm soils are employed in industrial and petrochemical residue bioremediation. This process induces selective pressure toward microorganisms capable of degrading toxic compounds. Detailed description of the taxa in these environments is difficult due to a lack of knowledge of the culture conditions required for unknown microorganisms. A metagenomic approach permits the identification of organisms without the need for culture. However, a DNA extraction step is first required, which can bias taxonomic representativeness and interfere with cloning steps by co-extracting interfering substances. We developed a simplified DNA extraction procedure coupled with metagenomic DNA amplification in an effort to overcome these limitations. The amplified sequences were used to generate a metagenomic data set, and taxonomic and functional representativeness were evaluated in comparison with a data set built with DNA extracted by conventional methods. The simplified and optimized RAPD-based method for accessing metagenomic information provides better representativeness of the taxonomic and metabolic aspects of environmental samples.
Abstract:
Nowadays, competitiveness introduces new behaviors and often leaves companies in an uncomfortable position, unable to adapt to environmental requirements. A growing number of challenges associated with the control of information can be seen in organizations with engineering activities, particularly the growing amount of information subject to continuous change. The innovative performance of an organization is directly proportional to its ability to manage information. Thus, the importance of information management is recognized in the search for more competent ways to face current demands. The purpose of this article was to analyze information-dependent processes in technology-based companies through the four major stages of information management. The comparative case method and qualitative research were used. The research was conducted in nine technology-based companies that were incubated at, or had recently graduated from, the Technological Park of Sao Carlos, in the state of Sao Paulo. Among the main results, it was found that information management and its procedures were more conscious and structured in the graduated companies than in the incubated ones.
Abstract:
Ultrasonography has an inherent noise pattern, called speckle, which is known to hamper object recognition for both humans and computers. Speckle noise is produced by the mutual interference of a set of scattered wavefronts. Depending on the phase of the wavefronts, the interference may be constructive or destructive, resulting in brighter or darker pixels, respectively. We propose a filter that minimizes noise fluctuation while simultaneously preserving local gray-level information. It is based on steps that attenuate the destructive and constructive interference present in ultrasound images. This filter, called the interference-based speckle filter followed by anisotropic diffusion (ISFAD), was developed to remove speckle texture from B-mode ultrasound images while preserving the edges and the gray level of the region. The ISFAD performance was compared with that of 10 other filters. The evaluation was based on their application to images simulated by Field II (developed by Jensen et al.), and the proposed filter presented the greatest structural similarity, 0.95. Functional improvement of the segmentation task was also measured by comparing true-positive and false-positive rates and accuracy. Across three different segmentation techniques, ISFAD also presented the best accuracy rate (greater than 90% for structures with well-defined borders). (E-mail: fernando.okara@gmail.com) (C) 2012 World Federation for Ultrasound in Medicine & Biology.
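The interference model described above (constructive interference brightening pixels, destructive interference darkening them) is commonly formalized as multiplicative noise. The following toy sketch simulates that model and shows why averaging out the fluctuation recovers the local gray level; it is illustrative only and does not reproduce the ISFAD filter itself. The uniform noise range is an arbitrary assumption.

```python
import random

def add_speckle(pixel, rng):
    """Multiply the true gray level by a random interference factor."""
    factor = 1.0 + rng.uniform(-0.5, 0.5)  # <1: destructive, >1: constructive
    return pixel * factor

def local_mean_filter(values):
    """A naive despeckling step: average out the multiplicative fluctuation."""
    return sum(values) / len(values)

rng = random.Random(42)
noisy = [add_speckle(100.0, rng) for _ in range(1000)]  # true gray level: 100
filtered = local_mean_filter(noisy)
# With many samples the interference factor averages to ~1, so the filtered
# value approaches the original gray level of 100.
```

A plain local mean, of course, also blurs edges; the point of a filter like ISFAD is to suppress the fluctuation while preserving edges, which this sketch does not attempt.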
Abstract:
Walking on irregular surfaces and in the presence of unexpected events is a challenging problem for bipedal machines. To date, their ability to cope with gait disturbances has been far less successful than humans': neither trajectory-controlled robots nor dynamic walking machines (Limit Cycle Walkers) are able to handle them satisfactorily. By contrast, humans reject gait perturbations naturally and efficiently, relying on their sensory organs which, if needed, elicit a recovery action. A similar approach may be envisioned for bipedal robots and exoskeletons: an algorithm continuously observes the state of the walker and, if an unexpected event happens, triggers an adequate reaction. This paper presents a monitoring algorithm that provides immediate detection of any type of perturbation based solely on a phase representation of the robot's normal walking. The proposed method was evaluated in a Limit Cycle Walker prototype subjected to push and trip perturbations at different moments of the gait cycle, providing 100% successful detection for the current experimental apparatus and adequately tuned parameters, with no false positives when the robot walks unperturbed.
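The monitoring idea can be sketched in a few lines: index the nominal (unperturbed) gait by phase, compare each new sensor reading against the nominal value for the current phase, and flag a perturbation when the deviation exceeds a tuned threshold. This is a hedged illustration of the general scheme, not the paper's implementation; the sensor profile, phase discretization, and threshold are all made-up.

```python
# Phase-based perturbation monitor (illustrative sketch).
def detect_perturbation(phase, reading, nominal, threshold=0.2):
    """Return True if the reading deviates too far from the nominal gait."""
    expected = nominal[phase]  # nominal sensor value at this gait phase
    return abs(reading - expected) > threshold

# Hypothetical nominal hip-angle profile over a 4-sample gait cycle (radians).
nominal = {0: 0.10, 1: 0.35, 2: 0.10, 3: -0.15}

normal_step = detect_perturbation(1, 0.30, nominal)  # small deviation
tripped = detect_perturbation(2, 0.60, nominal)      # large deviation
```

Because the comparison is keyed on phase rather than time, the monitor remains valid when the walker speeds up or slows down, which is the appeal of a phase representation; the threshold must still be tuned to avoid false positives during normal walking.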
Abstract:
Background: There are no available statistical data on sudden cardiac death in Brazil. Therefore, this study was conducted to evaluate the incidence of sudden cardiac death in our population and its implications. Methods: The research methodology was based on Thurstone's Law of Comparative Judgment, whose premise is that the more a stimulus A differs from a stimulus B, the greater the number of people who will perceive this difference. This technique allows an estimation of actual occurrences from subjective perceptions when compared to official statistics. Data were collected through telephone interviews conducted with primary and secondary care physicians of the Public Health Service in the Metropolitan Area of Sao Paulo (MASP). Results: In the period from October 19, 2009, to October 28, 2009, 196 interviews were conducted. An incidence of 21,270 cases of sudden cardiac death per year was estimated by linear regression analysis of the physicians' responses and data from the Mortality Information System of the Brazilian Ministry of Health, with the following correlation and determination coefficients: r = 0.98 and r² = 0.95 (95% confidence interval 0.8 to 1.0, P < 0.05). The lack of a waiting list for specialized care and socioadministrative problems were considered the main barriers to tertiary care access. Conclusions: The incidence of sudden cardiac death in the MASP is high, and it was estimated as being higher than all other causes of death; the extrapolation technique based on the physicians' perceptions was validated; and the most important bureaucratic barriers to patient referral to tertiary care have been identified. (PACE 2012; 35:1326-1331)
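The extrapolation step described above amounts to fitting a line between subjective perception scores and official counts, then reading the estimate off that line. The following minimal sketch shows the mechanics with entirely made-up numbers; it is not the study's data or regression.

```python
# Ordinary least squares for y = a + b*x, used to calibrate perception
# scores against official statistics (illustrative only).
def fit_line(xs, ys):
    """Return intercept a and slope b of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

perception = [1.0, 2.0, 3.0]            # hypothetical perception scores
official = [5000.0, 9000.0, 13000.0]    # hypothetical official yearly counts
a, b = fit_line(perception, official)
estimate = a + b * 4.0  # extrapolate to a stimulus official data undercounts
```

The strength of the calibration is then summarized exactly as in the abstract: by the correlation between fitted and official values (r) and the coefficient of determination (r²).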
Abstract:
The President of Brazil established an Interministerial Work Group to “evaluate the model of classification and valuation of disabilities used in Brazil and to define the elaboration and adoption of a unique model for all the country”. Eight Ministries and/or Secretariats participated in the discussion over a period of 10 months, concluding that the proposed model should be based on the United Nations Convention on the Rights of Persons with Disabilities, the International Classification of Functioning, Disability and Health, and ‘support theory’, and organizing a list of recommendations and actions necessary for a Classification, Evaluation and Certification Network with national coverage.
Abstract:
Abstract Background Accurate malaria diagnosis is mandatory for the treatment and management of severe cases. Moreover, individuals with asymptomatic malaria are not usually screened by health care facilities, which further complicates disease control efforts. The present study compared the performances of a malaria rapid diagnostic test (RDT), the thick blood smear method and nested PCR for the diagnosis of symptomatic malaria in the Brazilian Amazon. In addition, an innovative computational approach was tested for the diagnosis of asymptomatic malaria. Methods The study was divided into two parts. For the first part, passive case detection was performed in 311 individuals with malaria-related symptoms from a recently urbanized community in the Brazilian Amazon. A cross-sectional investigation compared the diagnostic performance of the RDT Optimal-IT, nested PCR and light microscopy. The second part of the study involved active case detection of asymptomatic malaria in 380 individuals from riverine communities in Rondônia, Brazil. The performances of microscopy, nested PCR and an expert computational system based on artificial neural networks (MalDANN) using epidemiological data were compared. Results Nested PCR was shown to be the gold standard for the diagnosis of both symptomatic and asymptomatic malaria because it detected the greatest number of cases and presented the highest specificity. Surprisingly, the RDT was superior to microscopy in the diagnosis of cases with low parasitaemia. Nevertheless, the RDT could not discriminate the Plasmodium species in 12 cases of mixed infections (Plasmodium vivax + Plasmodium falciparum). Moreover, microscopy presented low performance in the detection of asymptomatic cases (61.25% of correct diagnoses). The MalDANN system using epidemiological data performed worse than light microscopy (56% of correct diagnoses).
However, when information regarding plasma levels of interleukin-10 and interferon-gamma was input, the MalDANN performance increased appreciably (80% correct diagnoses). Conclusions An RDT for malaria diagnosis may find promising use in the Brazilian Amazon as part of a rational diagnostic approach. Despite the low performance of the MalDANN test using solely epidemiological data, an approach based on neural networks may be feasible in cases where simpler methods are available for discriminating individuals below and above threshold cytokine levels.
Abstract:
Abstract Background Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) process correctness, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy end-user interfaces.
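The core idea of a common repository of process plans that can be specified once and then instantiated, executed, and monitored can be illustrated with a toy example. This sketch only mimics sequential composition with an execution log; the actual CEGH system rests on a relational database and ACP process algebra, and every name below (`ProcessRepository`, the "dna-test" plan and its steps) is hypothetical.

```python
# Toy process repository: plans are stored once and reused per instantiation.
class ProcessRepository:
    def __init__(self):
        self.plans = {}

    def define(self, name, steps):
        """Store a reusable process plan: an ordered list of named steps."""
        self.plans[name] = steps

    def run(self, name, sample):
        """Instantiate and execute a plan, logging each completed step."""
        log = []
        for step_name, step in self.plans[name]:
            sample = step(sample)  # each step transforms the sample state
            log.append(step_name)  # the log supports monitoring/auditing
        return sample, log

repo = ProcessRepository()
repo.define("dna-test", [
    ("extract", lambda s: s + ["dna"]),
    ("amplify", lambda s: s + ["pcr"]),
    ("report",  lambda s: s + ["result"]),
])
result, log = repo.run("dna-test", ["blood"])
```

In the real system, the control flow is not a fixed Python list but a specification validated with process algebra, which is what guarantees correctness when the laboratory's procedures change.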
Abstract:
Rare variants are becoming the new candidates in the search for genetic variants that predispose individuals to a phenotype of interest. Their low prevalence in a population requires the development of dedicated detection and analytical methods. A family-based approach could greatly enhance their detection and interpretation because rare variants are nearly family-specific. In this report, we test several distinct approaches for analyzing the information provided by rare and common variants and examine how they can be effectively used to pinpoint putative candidate genes for follow-up studies. The analyses were performed on the mini-exome data set provided by Genetic Analysis Workshop 17. Eight approaches were tested, four using the trait’s heritability estimates and four using QTDT models. These methods had their sensitivity, specificity, and positive and negative predictive values compared in light of the simulation parameters. Our results highlight important limitations of current methods for dealing with rare and common variants: all methods presented reduced specificity and were consequently prone to false-positive associations. Methods analyzing common-variant information showed enhanced sensitivity compared to rare-variant methods. Furthermore, our limited knowledge of the use of biological databases for gene annotations, possibly for use as covariates in regression models, imposes a barrier to further research.
Abstract:
Abstract Background Catching an object is a complex movement that involves not only programming but also effective motor coordination. Such behavior is related to the activation and recruitment of cortical regions that participate in the sensorimotor integration process. This study aimed to elucidate the cortical mechanisms involved in anticipatory actions when performing the task of catching an object in free fall. Methods Quantitative electroencephalography (qEEG) was recorded using a 20-channel EEG system in 20 healthy right-handed participants who performed the ball-catching task. We used EEG coherence analysis to investigate subdivisions of the alpha (8-12 Hz) and beta (12-30 Hz) bands, which are related to cognitive processing and sensorimotor integration. Results We found main effects for the factor block: for alpha-1, coherence decreased from the first to the sixth block, whereas the opposite occurred for alpha-2 and beta-2, with coherence increasing across blocks. Conclusion We conclude that, to perform our task successfully, which involved anticipatory processes (i.e. feedback mechanisms), subjects exhibited strong involvement of sensorimotor and associative areas, possibly due to the organization of information needed to process visuospatial parameters and catch the falling object.
Abstract:
Discussions on the future of cataloging have received increased attention in the last ten years, mainly due to the impact of the rapid development of information and communication technologies over the same period, which has provided access to the Web anytime, anywhere. These discussions revolve around the need for a new bibliographic framework to meet the demands of this new reality in the digital environment, i.e., how can libraries process, store, deliver, share, and integrate their collections (physical, digital, or scanned) in the current post-PC era? Faced with this question, Open Access, Open Source, and Open Standards are three concepts that need to receive greater attention in the field of Library and Information Science, as they are believed to be fundamental elements for a paradigm change in descriptive representation, which is currently based conceptually on the physical item rather than the intellectual work. This paper aims to raise and discuss such issues and to urge information professionals, especially librarians, to think about, discuss, and propose initiatives for these problems, contributing and sharing ideas and possible solutions in multidisciplinary teams. Finally, the creation of multidisciplinary and inter-institutional study groups on the future of cataloging and its impact on national collections is suggested, in order to contribute to the area of descriptive representation at the national and international levels.