897 results for Decision tree method


Relevance: 30.00%

Abstract:

Every fall, millions of blackbirds come down the Mississippi Flyway to return to their winter roosts in Arkansas, Louisiana, and East Texas. When these roosts are located in urban areas, public pressure makes the more common chemical means of control impractical. A less destructive and more permanent method of control was sought. At Rice University, in Houston, Texas, there has been a blackbird roost of varying size and duration since 1956. For the past two years we have had the opportunity both to study roosting blackbird biology and to experiment with habitat alteration as a control method. This report concentrates on the results and interpretation of the tree-trimming program initiated in August 1974. The birds involved are primarily Brown-headed Cowbirds (Molothrus ater), along with Starlings (Sturnus vulgaris), Common and Great-tailed Grackles (Quiscalus quiscula and Cassidix mexicanus), Red-winged Blackbirds (Agelaius phoeniceus), and Robins (Turdus migratorius). The campus comprises 121 ha and was planted with live oaks (Quercus virginiana) in 1912. These trees retain their foliage throughout the winter and now form a closed canopy over some 5-6 ha. In the 1960s and early 1970s most of the birds that came to Houston for the winter roosted in a 64-ha woodlot 10 km north of campus. In January 1970, the U.S. Fish and Wildlife Roosting Survey reported one million birds at this site, which we call the North Loop. Fifteen thousand birds were estimated at Rice.

Relevance: 30.00%

Abstract:

Fungi are disease-causing agents in plants and affect crops of economic importance. One control method is to induce resistance in the host through biological control with hypovirulent strains of phytopathogenic fungi. Here, we report the detection of a mycovirus in a strain of Colletotrichum gloeosporioides causing anthracnose of the cashew tree. The strain C. gloeosporioides URM 4903 was isolated from a cashew tree (Anacardium occidentale) in Igarassu, PE, Brazil. After nucleic acid extraction and electrophoresis, the band corresponding to a possible double-stranded RNA (dsRNA) was purified by cellulose column chromatography. Nine extrachromosomal bands were obtained. Enzymatic digestion with DNase I and Nuclease S1 had no effect on these bands, indicating their dsRNA nature. Transmission electron microscopy of extracts from this strain showed the presence of isometric particles (30-35 nm in diameter). These data strongly suggest infection of this C. gloeosporioides strain by a dsRNA mycovirus. Once the hypovirulence of this strain is confirmed, the strain may be used for the biological control of cashew anthracnose.

Relevance: 30.00%

Abstract:

Background: Tuberculosis (TB) remains a public health issue worldwide. The lack of clinical symptoms specific to TB makes the decision to admit patients to respiratory isolation a difficult task for the clinician. Isolation of patients without the disease is common and increases health costs. Decision models for the diagnosis of TB in patients attending hospitals can increase the quality of care and decrease costs, without increasing the risk of hospital transmission. We present a model for predicting pulmonary TB in hospitalized patients in a high-prevalence area, in order to contribute to a more rational use of isolation rooms without increasing the risk of transmission. Methods: Cross-sectional study of patients admitted to CFFH from March 2003 to December 2004. A classification and regression tree (CART) model was generated and validated. The area under the ROC curve (AUC), sensitivity, specificity, and positive and negative predictive values were used to evaluate the performance of the model. Validation of the model was performed with a different sample of patients admitted to the same hospital from January to December 2005. Results: We studied 290 patients admitted with clinical suspicion of TB. Diagnosis was confirmed in 26.5% of them. Pulmonary TB was present in 83.7% of the patients with TB (62.3% with positive sputum smear), and HIV/AIDS was present in 56.9% of patients. The validated CART model showed sensitivity, specificity, positive predictive value, and negative predictive value of 60.00%, 76.16%, 33.33%, and 90.55%, respectively. The AUC was 79.70%. Conclusions: The CART model developed for these hospitalized patients with clinical suspicion of TB had fair to good predictive performance for pulmonary TB. The most important variable for predicting TB diagnosis was the chest radiograph result. Prospective validation is still necessary, but our model offers an alternative for deciding whether to isolate patients with clinical suspicion of TB in tertiary health facilities in countries with limited resources.
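
For illustration, the following minimal sketch fits a CART classifier and computes the metrics reported in the abstract (sensitivity, specificity, PPV, NPV, AUC). The data, features, and threshold are invented for the example; this is not the study's patient data or model.

```python
# Hedged sketch: CART classification with the abstract's evaluation metrics,
# on synthetic data (features and labels are invented for illustration).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)
n = 290
# Hypothetical predictors, e.g. a chest-radiograph score plus two others
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0.8).astype(int)  # synthetic "TB" label

cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
prob = cart.predict_proba(X)[:, 1]
pred = (prob >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
auc = roc_auc_score(y, prob)
print(f"Se={sensitivity:.2f} Sp={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} AUC={auc:.2f}")
```

In practice the model would be fitted on one period's admissions and validated on a later sample, as the study does.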

Relevance: 30.00%

Abstract:

Background: We aimed to establish values and parameters using multislice reconstruction in axial computerized tomography (CT) in order to quantify the erosion of the glenoid cavity in cases of shoulder instability. Methods: We studied two groups using CT: Group I comprised normal subjects and Group II patients with shoulder instability. We measured the vertical segment, the superior horizontal, medial, and inferior segments, and the ratio of the superior horizontal to the inferior segment of the glenoid cavity in both normal subjects and those with shoulder instability. These variables were also recorded during arthroscopy for cases with shoulder instability. Results: In normal subjects, the mean values were 40.87 mm, 17.86 mm, 26.50 mm, 22.86 mm, and 0.79 for the vertical segment, the superior horizontal, medial, and inferior segments, and the ratio between the superior horizontal and inferior segments, respectively. For subjects with unstable shoulders the mean values were 37.33 mm, 20.83 mm, and 23.07 mm, with a ratio of 0.91. Arthroscopic measurements yielded an inferior segment value of 24.48 mm, with a loss of 2.39 mm (17.57%). The ratio between the superior and inferior segments of the glenoid cavity was 0.79; this value can be used as a normative value for evaluating the degree of erosion of the anterior border of the glenoid cavity. However, values found using CT should not be compared directly with values found during arthroscopy. Conclusions: Computerized tomographic measurements of the glenoid cavity yielded reliable values consistent with those in the literature.

Relevance: 30.00%

Abstract:

Coastal flooding poses serious threats to coastal areas around the world, causing billions of dollars in damage to property and infrastructure and threatening the lives of millions of people. Disaster management and risk assessment therefore aim at detecting vulnerabilities and capacities in order to reduce coastal flood disaster risk. In particular, non-specialized researchers, emergency management personnel, and land use planners require an accurate, inexpensive method to determine and map the risk associated with storm surge events and with the long-term sea level rise driven by climate change. This study contributes to the spatial evaluation and mapping of social, economic, and environmental vulnerability and risk at a sub-national scale through the development of appropriate tools and methods successfully embedded in a Web-GIS Decision Support System (DSS). A new set of raster-based models was studied and developed so as to be easily implemented in the Web-GIS framework, with the purpose of quickly assessing and mapping flood hazard characteristics, damage, and vulnerability in a multi-criteria approach. The Web-GIS DSS is developed using open-source software and programming languages, and its main peculiarity is that it is available and usable by coastal managers and land use planners without requiring a strong scientific background in hydraulic engineering. The effectiveness of the system for coastal risk assessment is evaluated through its application to a real case study.
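
The multi-criteria raster overlay idea can be sketched in a few lines. The layers, weights, and class breaks below are invented placeholders, not the models actually embedded in the paper's Web-GIS DSS:

```python
# Hedged sketch: weighted multi-criteria overlay of normalized raster layers
# into a composite flood-risk map (synthetic rasters, assumed weights).
import numpy as np

rng = np.random.default_rng(1)
shape = (100, 100)                  # grid cells covering a coastal area
hazard = rng.random(shape)          # e.g. normalized surge depth
exposure = rng.random(shape)        # e.g. normalized asset density
vulnerability = rng.random(shape)   # e.g. normalized social vulnerability index

weights = {"hazard": 0.5, "exposure": 0.3, "vulnerability": 0.2}  # assumed
risk = (weights["hazard"] * hazard
        + weights["exposure"] * exposure
        + weights["vulnerability"] * vulnerability)

# Classify into bands for display as a Web-GIS layer (breaks are illustrative)
bands = np.digitize(risk, bins=[0.25, 0.5, 0.75])  # 0=low .. 3=very high
print(risk.shape, int(bands.max()))
```

A real implementation would read georeferenced rasters (e.g. via GDAL) and serve the classified result as a map layer; the overlay logic itself is as simple as above.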

Relevance: 30.00%

Abstract:

A liquid chromatography tandem mass spectrometry (LC-MS/MS) confirmatory method for the simultaneous determination of nine corticosteroids in liver, including the four MRL compounds listed in Council Regulation 37/2010, was developed. After an enzymatic deconjugation and a solvent extraction of the liver tissue, the resulting solution was cleaned up on an Oasis HLB SPE cartridge. The analytes were then detected by liquid chromatography-negative-ion electrospray tandem mass spectrometry, using deuterium-labelled internal standards. The procedure was validated as a quantitative confirmatory method according to the Commission Decision 2002/657/EC criteria. The results showed that the method was suitable for statutory residue testing with regard to the following performance characteristics: instrumental linearity, specificity, precision (repeatability and intra-laboratory reproducibility), recovery, decision limit (CCα), detection capability (CCβ), and ruggedness. All the corticosteroids can be detected at a concentration around 1 μg kg⁻¹; the recoveries were above 62% for all the analytes. Repeatability and within-laboratory reproducibility for all the analytes were below 7.65% and 15.5%, respectively.
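
For reference, for substances with an established MRL, Commission Decision 2002/657/EC defines the decision limit CCα and detection capability CCβ essentially as offsets from the MRL in units of the within-laboratory reproducibility. The numbers in this sketch are illustrative, not values from the study:

```python
# Hedged sketch of the 2002/657/EC decision-limit formulas for an MRL substance:
#   CCalpha = MRL + 1.64 * s_R     (alpha = 5%)
#   CCbeta  = CCalpha + 1.64 * s_R (beta = 5%)
# where s_R is the within-laboratory reproducibility standard deviation
# measured at and above the permitted limit.

def decision_limits(mrl_ug_kg: float, s_r: float) -> tuple:
    """Return (CCalpha, CCbeta) in the same units as the MRL."""
    cc_alpha = mrl_ug_kg + 1.64 * s_r
    cc_beta = cc_alpha + 1.64 * s_r
    return cc_alpha, cc_beta

# Hypothetical example: MRL of 10 ug/kg, s_R of 1.0 ug/kg
cc_alpha, cc_beta = decision_limits(10.0, 1.0)
print(f"CCalpha = {cc_alpha:.2f} ug/kg, CCbeta = {cc_beta:.2f} ug/kg")
```

For banned substances (no MRL) the Decision prescribes different constants (2.33 for CCα), so the formulas above apply only to the MRL case described in the abstract.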

Relevance: 30.00%

Abstract:

Shared Decision Making (SDM) is widely accepted as the preferred method for reaching treatment decisions in the oncology setting, including those about clinical trial participation; however, there is some disagreement among researchers over the components of SDM. Specific standardized coding systems are needed to help overcome this difficulty.

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the virological outcome of patients with undetectable human immunodeficiency virus (HIV) viremia switched to tenofovir (TDF)-containing nucleoside-only (NUKE-only) treatments, and to investigate the factors influencing the physicians' decision to apply a non-established therapy. METHOD: Patients' characteristics and history were taken from the cohort database. To study the decision-making process, questionnaires were sent to all treating physicians. RESULTS: 49 patients were changed to a TDF-containing NUKE-only treatment and 46 had a follow-up measurement of HIV viremia. Virological failure occurred in 16 (35%) patients. Virological failure was associated with previous mono or dual therapy and with regimens including didanosine or abacavir. No failure occurred in the 15 patients without these predisposing factors. The main reasons for the change to TDF-containing NUKE-only treatment were side effects and a presumed favorable toxicity profile. The rationale behind this decision was mainly analogy to zidovudine/lamivudine/abacavir maintenance therapy. CONCLUSION: TDF-containing NUKE-only treatment is associated with high early failure rates in patients with previous nucleoside reverse transcriptase inhibitor mono or dual therapy and in drug combinations containing didanosine or abacavir, but not in patients without these predisposing factors. In HIV medicine, treatment strategies that are not evidence-based are followed by a minority of experienced physicians and are driven by patients' needs, mainly to minimize treatment side effects.

Relevance: 30.00%

Abstract:

Multi-input multi-output (MIMO) technology is an emerging solution for high-data-rate wireless communications. We develop soft-decision-based equalization techniques for frequency-selective MIMO channels in the quest for low-complexity equalizers with BER performance competitive with that of ML sequence detection. We first propose soft decision equalization (SDE), and demonstrate that decision feedback equalization (DFE) based on soft decisions, expressed via the posterior probabilities associated with feedback symbols, is able to outperform hard-decision DFE, at a low computational cost that is polynomial in the number of symbols to be recovered and linear in the signal constellation size. Building upon the probabilistic data association (PDA) multiuser detector, we present two new MIMO equalization solutions to handle the distinctive channel memory. With their low complexity, simple implementations, and the impressive near-optimum performance offered by iterative soft-decision processing, the proposed SDE methods are attractive candidates to deliver efficient reception solutions to practical high-capacity MIMO systems. Motivated by the need for low-complexity receiver processing, we further present an alternative low-complexity soft-decision equalization approach for frequency-selective MIMO communication systems. With the help of iterative processing, two detection and estimation schemes based on second-order statistics are harmoniously put together to yield a two-part receiver structure: local multiuser detection (MUD) using soft-decision probabilistic data association (PDA) detection, and dynamic noise-interference tracking using Kalman filtering. The proposed Kalman-PDA detector performs local MUD within a sub-block of the received data instead of over the entire data set, to reduce the computational load. At the same time, all the interference affecting the local sub-block, including both multiple-access and inter-symbol interference, is properly modeled as the state vector of a linear system and dynamically tracked by Kalman filtering. Two types of Kalman filters are designed, both of which are able to track a finite impulse response (FIR) MIMO channel of any memory length. The overall algorithms enjoy low complexity that is only polynomial in the number of information-bearing bits to be detected, regardless of the data block size. Furthermore, we introduce two optional performance-enhancing techniques: cross-layer automatic repeat request (ARQ) for uncoded systems and a code-aided method for coded systems. We take Kalman-PDA as an example, and show via simulations that both techniques can render error performance that is better than Kalman-PDA alone and competitive with sphere decoding. Finally, we consider the case in which channel state information (CSI) is not perfectly known to the receiver, and present an iterative channel estimation algorithm. Simulations show that the performance of SDE with channel estimation approaches that of SDE with perfect CSI.
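
The core idea of feeding back posterior probabilities instead of hard decisions can be illustrated with a toy BPSK example over a two-tap ISI channel. The channel, noise level, and sequence length below are invented; this is a sketch of soft-decision feedback in general, not the dissertation's full SDE/PDA algorithm:

```python
# Hedged sketch: soft-decision feedback equalization for BPSK over a 2-tap
# ISI channel. The fed-back symbol is its posterior mean tanh(LLR/2) rather
# than a hard sign, which limits error propagation when the LLR is weak.
import numpy as np

rng = np.random.default_rng(2)
h = np.array([1.0, 0.5])         # assumed channel impulse response
sigma2 = 0.2                     # assumed noise variance
s = rng.choice([-1.0, 1.0], size=200)                 # BPSK symbols
y = np.convolve(s, h)[: len(s)] + rng.normal(scale=np.sqrt(sigma2), size=len(s))

soft_prev = 0.0                  # soft estimate of the previous symbol
decisions = np.empty(len(s))
for k in range(len(s)):
    z = y[k] - h[1] * soft_prev            # cancel ISI with the soft estimate
    llr = 2.0 * h[0] * z / sigma2          # LLR of s[k] given the cleaned sample
    soft_prev = np.tanh(llr / 2.0)         # posterior mean = soft decision
    decisions[k] = np.sign(llr)            # final hard decision for BER count
ber = float(np.mean(decisions != s))
print(f"soft-feedback BER: {ber:.3f}")
```

A hard-decision DFE would feed back `np.sign(llr)` instead of `tanh(llr/2)`; the soft version degrades gracefully because an unreliable symbol contributes a near-zero feedback term.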

Relevance: 30.00%

Abstract:

During a project, managers encounter numerous contingencies and are faced with the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and their processes are irreversible. It is therefore critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions to the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) looking at the effects of local decisions on the global project outcomes; (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made; (3) establishing a method to assess and improve the management decision strategies; and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects that have different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules, accompanied by suitable decision strategies, for managing a project at the planning stage. The developed methodology also lays the foundation for an algorithm toward continuously and automatically generating satisfactory schedules and strategies throughout the construction life of a project. Different from studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA has a previously developed emulator that duplicates the construction process, and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
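
The idea of testing decision strategies against random disruptions can be sketched with a tiny Monte-Carlo experiment. The durations, disruption model, and policies below are invented for illustration; this is not ICDMA or the dissertation's testbed:

```python
# Hedged sketch: Monte-Carlo comparison of two decision strategies for a
# three-activity sequential schedule subject to random disruptions.
import random

PLANNED = [5, 8, 6]   # planned activity durations in days (assumed)

def simulate(strategy, trials=2000, seed=3):
    """Mean project duration when `strategy` maps a disruption delay
    to the residual delay actually absorbed by the schedule."""
    rng = random.Random(seed)
    total_sum = 0
    for _ in range(trials):
        total = 0
        for planned in PLANNED:
            dur = planned
            if rng.random() < 0.3:           # a disruption hits this activity
                delay = rng.randint(1, 4)    # raw delay in days
                dur += strategy(delay)       # the decision strategy responds
            total += dur
        total_sum += total
    return total_sum / trials

accept = lambda d: d             # strategy A: absorb the full delay
crash = lambda d: max(0, d - 2)  # strategy B: crash the activity, recover 2 days

print(f"mean duration, accept: {simulate(accept):.1f} days")
print(f"mean duration, crash:  {simulate(crash):.1f} days")
```

The framework described in the abstract operates on the same principle, but over full schedules with activity dependencies and cascading impacts, and iterates to improve both the strategies and the schedule.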

Relevance: 30.00%

Abstract:

Disturbances in power systems may lead to electromagnetic transient oscillations due to a mismatch between mechanical input power and electrical output power. Out-of-step conditions are common after disturbances in which the oscillations do not damp out and the system becomes unstable. Existing out-of-step detection methods are system-specific, as extensive off-line studies are required to set the relays, and most existing algorithms also require network reduction techniques to be applied in multi-machine power systems. To overcome these issues, this research applies Phasor Measurement Unit (PMU) data and Zubov's approximation stability boundary method, a modification of Lyapunov's direct method, to develop a novel out-of-step detection algorithm. The proposed algorithm is tested on a Single Machine Infinite Bus system and on the IEEE 3-machine 9-bus and IEEE 10-machine 39-bus systems. Simulation results show that the proposed algorithm is capable of detecting out-of-step conditions in multi-machine power systems without using network reduction techniques, and a comparative study with an existing blinder method demonstrates that its decision times are faster. The simulation case studies also demonstrate that the proposed algorithm does not depend on power system parameters, so it avoids the need for the extensive off-line system studies required by other algorithms.
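
As background, an out-of-step condition can be reproduced with a toy single-machine infinite-bus swing-equation simulation: a fault cleared quickly lets the rotor angle swing back, while a late clearing lets it run away. All parameters below are invented, and the crude angle-runaway check stands in for the paper's PMU/Zubov detection algorithm:

```python
# Hedged sketch: Euler integration of the SMIB swing equation
#   M * d(omega)/dt = Pm - Pe - D * omega,  d(delta)/dt = omega
# with Pe = 0 during the fault and Pe = Pmax * sin(delta) afterwards.
import math

M, D = 0.1, 0.05        # inertia and damping constants (p.u., assumed)
Pm, Pmax = 0.9, 1.2     # mechanical input and max electrical power (p.u.)

def swing(fault_clear_t, t_end=5.0, dt=1e-3):
    """Return (out_of_step, time) for a fault cleared at fault_clear_t."""
    delta = math.asin(Pm / Pmax)   # pre-fault equilibrium angle
    omega, t = 0.0, 0.0
    while t < t_end:
        pe = 0.0 if t < fault_clear_t else Pmax * math.sin(delta)
        omega += (Pm - pe - D * omega) / M * dt
        delta += omega * dt
        if abs(delta) > 2 * math.pi:   # crude out-of-step criterion
            return True, t
        t += dt
    return False, t_end

print(swing(0.10))   # fault cleared quickly: rotor swings back (stable)
print(swing(0.40))   # fault cleared late: angle runs away (out of step)
```

The equal-area criterion explains the two outcomes: the kinetic energy gained during the fault either can or cannot be absorbed by the decelerating area after clearing.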

Relevance: 30.00%

Abstract:

VHB-JOURQUAL represents the official journal ranking of the German Academic Association for Business Research. Since its introduction in 2003, the ranking has become the most influential journal evaluation approach in German-speaking countries, impacting several key managerial decisions of German, Austrian, and Swiss business schools. This article reports the methodological approach of the ranking’s second edition. It also presents the main results and additional analyses on the validity of the rating and the underlying decision processes of the respondents. Selected implications for researchers and higher-education institutions are discussed.

Relevance: 30.00%

Abstract:

Identifying and comparing different steady states is an important task for clinical decision making. Data from disparate sources, comprising diverse patient status information, have to be interpreted, and in order to compare results an expressive representation is key. In this contribution we suggest a criterion to calculate a context-sensitive value based on variance analysis, and discuss its advantages and limitations with reference to a clinical data example obtained during anesthesia. Different plasma target levels of the anesthetic propofol were preset in order to reach and maintain clinically desirable steady-state conditions with target-controlled infusion (TCI). At the same time, systolic blood pressure was monitored, depth of anesthesia was recorded using the bispectral index (BIS), and propofol plasma concentrations were determined in venous blood samples. The presented analysis of variance (ANOVA) is used to quantify how accurately steady states can be monitored and compared using the three methods of measurement.
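
A minimal sketch of the kind of one-way ANOVA comparison described above, using synthetic readings rather than the clinical data (group means, spreads, and sample sizes are invented):

```python
# Hedged sketch: one-way ANOVA testing whether mean systolic blood pressure
# differs across three hypothetical propofol target-level steady states.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(4)
# Synthetic measurements collected during three TCI steady states
state_a = rng.normal(120, 5, size=30)
state_b = rng.normal(112, 5, size=30)
state_c = rng.normal(105, 5, size=30)

f_stat, p_value = f_oneway(state_a, state_b, state_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
```

The same test can be run per measurement method (blood pressure, BIS, plasma concentration) to compare how sharply each one separates the steady states, which is the comparison the abstract describes.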

Relevance: 30.00%

Abstract:

Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands to gain particularly from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does reproducible results and access to relevant methodological advances in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between making a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) and making no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for a statistical, empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analyzing one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced to varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate some of the wide variety of calculations that can be made using our model. Some of our results call sharply into question the utility of excluding ‘trivial’ variation, such as orthographic and spelling changes, from stemmatic analysis.