905 results for Web Mining, Data Mining, User Topic Model, Web User Profiles
Abstract:
In Enterobacteriaceae, the transcriptional regulator AmpR, a member of the LysR family, regulates the expression of the chromosomal β-lactamase AmpC. The regulatory repertoire of AmpR is broader in Pseudomonas aeruginosa, an opportunistic pathogen responsible for numerous acute and chronic infections, including those in cystic fibrosis patients. Previous studies showed that, in addition to regulating ampC, P. aeruginosa AmpR regulates the sigma factor AlgT/U and the production of some quorum sensing (QS)-regulated virulence factors. To better understand the ampR regulon, transcriptional profiles of the prototypic P. aeruginosa strain PAO1 and its isogenic ampR deletion mutant, PAO∆ampR, generated using DNA microarrays and RNA-Seq, were analyzed. Transcriptome analysis demonstrates that the AmpR regulon is much more extensive than previously thought, influencing the differential expression of over 500 genes. In addition to regulating resistance to β-lactam antibiotics via AmpC, AmpR also regulates non-β-lactam antibiotic resistance by modulating the MexEF-OprN efflux pump. Virulence mechanisms including biofilm formation and QS-regulated acute virulence, as well as diverse physiological processes such as the oxidative stress response, heat-shock response and iron uptake, are AmpR-regulated. Real-time PCR and phenotypic assays confirmed the transcriptome data. Further, a Caenorhabditis elegans infection model demonstrates that a functional AmpR is required for full pathogenicity of P. aeruginosa. AmpR, a member of the core genome, also regulates genes in the regions of genome plasticity that are acquired by horizontal gene transfer. The AmpR regulon includes other transcriptional regulators and sigma factors, which accounts for its breadth. Gene expression studies demonstrate AmpR-dependent expression of the QS master regulator LasR, which controls the expression of many virulence factors. Using a chromosomally tagged AmpR, ChIP-Seq studies show direct AmpR binding to the lasR promoter. Together, the data demonstrate that AmpR functions as a global regulator in P. aeruginosa, acting as a positive regulator of acute virulence while negatively regulating chronic infection phenotypes. In summary, my dissertation sheds light on the complex regulatory circuitry of P. aeruginosa, providing a better understanding of the bacterial response to antibiotics and of how the organism coordinately regulates a myriad of virulence factors.
Abstract:
In clinical documents, medical terms are often expressed as multi-word phrases. Traditional topic modelling approaches relying on the “bag-of-words” assumption are not effective at extracting topic themes from clinical documents. This paper proposes to first extract medical phrases using an off-the-shelf tool for medical concept mention extraction, and then train a topic model that takes a hierarchy of Pitman-Yor processes as its prior for modelling the generation of phrases of arbitrary length. Experimental results on patients’ discharge summaries show that the proposed approach outperforms the state-of-the-art topical phrase extraction model on both perplexity and topic coherence measures and finds more interpretable topics.
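The phrase-first pipeline described above can be approximated with off-the-shelf tools. Below is a minimal sketch in which a statistical phrase detector stands in for the medical concept extractor and a standard LDA model stands in for the hierarchical Pitman-Yor phrase topic model; the toy corpus, parameter values, and variable names are illustrative assumptions, not the paper's actual setup.

```python
# Sketch: phrase extraction + topic modelling + coherence evaluation.
# LDA is used here as a simpler stand-in for the Pitman-Yor phrase topic model.
from gensim.corpora import Dictionary
from gensim.models import LdaModel
from gensim.models.phrases import Phrases, Phraser
from gensim.models.coherencemodel import CoherenceModel

# Toy "discharge summaries", already tokenised (illustrative data only).
docs = [
    ["chest", "pain", "acute", "myocardial", "infarction", "aspirin"],
    ["acute", "myocardial", "infarction", "troponin", "elevated", "aspirin"],
    ["shortness", "of", "breath", "congestive", "heart", "failure", "diuretic"],
    ["congestive", "heart", "failure", "edema", "diuretic", "furosemide"],
]

# Detect frequent multi-word phrases (stand-in for a medical concept extractor).
phrase_model = Phraser(Phrases(docs, min_count=2, threshold=1.0))
phrased_docs = [phrase_model[d] for d in docs]

# Build the dictionary and a bag-of-phrases corpus.
dictionary = Dictionary(phrased_docs)
corpus = [dictionary.doc2bow(d) for d in phrased_docs]

# Train a topic model over the phrase-augmented corpus.
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
               passes=20, random_state=0)

# Evaluate with a perplexity-style bound and topic coherence.
print("log perplexity bound:", lda.log_perplexity(corpus))
coherence = CoherenceModel(model=lda, texts=phrased_docs,
                           dictionary=dictionary, coherence="c_v")
print("c_v coherence:", coherence.get_coherence())
for topic_id, words in lda.show_topics(num_topics=2, num_words=5, formatted=False):
    print(topic_id, [w for w, _ in words])
```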
Abstract:
Purpose: Custom cranio-orbital implants have been shown to achieve better performance than their hand-shaped counterparts by restoring skull anatomy more accurately and by reducing surgery time. Designing a custom implant involves reconstructing a model of the patient's skull from their computed tomography (CT) scan. The healthy side of the skull model, contralateral to the damaged region, can then be used to design an implant plan. Designing implants for areas of thin bone, such as the orbits, is challenging due to the poor CT resolution of such bone structures, which makes preoperative design time-intensive because thin bone structures in the CT data must be segmented manually. The objective of this thesis was to research methods to accurately and efficiently design cranio-orbital implant plans, with a focus on the orbits, and to develop software that integrates these methods. Methods: The software consists of modules that use image and surface restoration approaches to enhance both the quality of the CT data and the reconstructed model. It enables users to input CT data and to use tools to output a skull model with restored anatomy, which can then be used to design the implant plan. The software was built on 3D Slicer, an open-source medical visualization platform, and was tested on CT data from thirteen patients. Results: The average time to create a skull model with restored anatomy using our software was 0.33 ± 0.04 hours (SD). In comparison, the manual segmentation method took between 3 and 6 hours. To assess the structural accuracy of the reconstructed models, CT data from the thirteen patients were used to compare the models created with our software against those created manually. When the skull models were registered together, the difference between each pair of skulls was 0.4 ± 0.16 mm (SD). Conclusions: We have developed software to design custom cranio-orbital implant plans, with a focus on thin bone structures. The method described decreases design time and achieves accuracy similar to the manual method.
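As an illustration of the kind of surface comparison reported in the Results (registering two skull models and measuring their deviation), the following sketch mirrors a point cloud across the mid-sagittal plane and computes nearest-neighbour distances; the arrays, the assumption that the models are already aligned with the mid-sagittal plane at x = 0, and the function name are illustrative, not taken from the thesis software.

```python
# Sketch: mirror a skull point cloud across the mid-sagittal plane and
# measure its deviation from a reference model (e.g. a manual segmentation).
import numpy as np
from scipy.spatial import cKDTree

def mirror_and_compare(model_points: np.ndarray, reference_points: np.ndarray):
    """Mirror `model_points` across the plane x = 0 and return the mean and SD of
    point-to-nearest-point distances to `reference_points` (both N x 3 arrays,
    assumed to be in millimetres and already rigidly registered)."""
    mirrored = model_points.copy()
    mirrored[:, 0] *= -1.0                      # reflect across the sagittal plane

    tree = cKDTree(reference_points)            # nearest-neighbour search structure
    distances, _ = tree.query(mirrored)         # closest reference point per vertex
    return distances.mean(), distances.std()

# Toy example with random points standing in for skull surface vertices.
rng = np.random.default_rng(0)
healthy_side = rng.normal(size=(1000, 3)) * 40.0
reference = healthy_side.copy()
reference[:, 0] *= -1.0                                    # perfect mirror as reference
reference += rng.normal(scale=0.3, size=reference.shape)   # simulated segmentation noise

mean_d, sd_d = mirror_and_compare(healthy_side, reference)
print(f"surface deviation: {mean_d:.2f} ± {sd_d:.2f} mm")
```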
Abstract:
Fully articulated hand tracking promises to enable fundamentally new interactions with virtual and augmented worlds, but the limited accuracy and efficiency of current systems has prevented widespread adoption. Today's dominant paradigm uses machine learning for initialization and recovery followed by iterative model-fitting optimization to achieve a detailed pose fit. We follow this paradigm, but make several changes to the model-fitting, namely using: (1) a more discriminative objective function; (2) a smooth-surface model that provides gradients for non-linear optimization; and (3) joint optimization over both the model pose and the correspondences between observed data points and the model surface. While each of these changes may actually increase the cost per fitting iteration, we find a compensating decrease in the number of iterations. Further, the wide basin of convergence means that fewer starting points are needed for successful model fitting. Our system runs in real-time on CPU only, which frees up the commonly over-burdened GPU for experience designers. The hand tracker is efficient enough to run on low-power devices such as tablets. We can track up to several meters from the camera to provide a large working volume for interaction, even using the noisy data from current-generation depth cameras. Quantitative assessments on standard datasets show that the new approach exceeds the state of the art in accuracy. Qualitative results take the form of live recordings of a range of interactive experiences enabled by this new approach.
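To make the third change listed above concrete, here is a toy sketch of jointly optimizing a model pose and the data-to-surface correspondences against a smooth parametric surface (a sphere stands in for the articulated hand model, and its centre stands in for the pose); the scipy-based solver, names, and data are illustrative assumptions, not the paper's implementation.

```python
# Sketch: joint optimization over model "pose" (sphere centre) and
# per-point surface correspondences (u_i, v_i), using a smooth surface
# so that gradients are available to the non-linear solver.
import numpy as np
from scipy.optimize import least_squares

RADIUS = 1.0

def surface_point(u, v):
    """Smooth sphere parameterisation; stands in for a smooth hand surface."""
    return np.stack([np.cos(u) * np.cos(v),
                     np.cos(u) * np.sin(v),
                     np.sin(u)], axis=-1) * RADIUS

def residuals(params, data):
    """Residuals between observed points and their corresponding surface points."""
    n = data.shape[0]
    centre = params[:3]                       # "pose": translation of the model
    u = params[3:3 + n]                       # correspondence parameters, one pair
    v = params[3 + n:]                        # per observed data point
    model_pts = centre + surface_point(u, v)
    return (model_pts - data).ravel()

# Synthetic depth-like observations: noisy points on a translated sphere.
rng = np.random.default_rng(1)
true_centre = np.array([0.3, -0.2, 0.5])
u_true = rng.uniform(-1.2, 1.2, 200)
v_true = rng.uniform(-np.pi, np.pi, 200)
data = true_centre + surface_point(u_true, v_true) + rng.normal(scale=0.01, size=(200, 3))

# One joint non-linear solve over pose and correspondences (instead of
# alternating closest-point steps), starting from a rough initialisation.
x0 = np.concatenate([np.zeros(3), np.zeros(200), np.zeros(200)])
fit = least_squares(residuals, x0, args=(data,))
print("estimated centre:", fit.x[:3])
```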
Abstract:
Fishing trials with monofilament gill nets and longlines using small hooks were carried out in Algarve waters (southern Portugal) over a one-year period. Four hook sizes of "Mustad" brand, round bent, flatted sea hooks (Quality 2316 DT, numbers 15, 13, 12 and 11) and four mesh sizes of 25, 30, 35 and 40 mm (bar length) monofilament gill nets were used. Commercially valuable sea breams dominated the longline catches, while small pelagics were relatively more important in the gill nets. Significant differences in the catch size frequency distributions of the two gears were found for all the most important species caught by both gears (Boops boops, Diplodus bellottii, Diplodus vulgaris, Pagellus acarne, Pagellus erythrinus, Spondyliosoma cantharus, Scomber japonicus and Scorpaena notata), with longlines catching larger fish and a wider size range than nets. Whereas the longline catch size frequency distributions for the different hook sizes generally overlapped strongly for most species, suggesting little or no difference in size selectivity, the gill net catch size frequency distributions clearly showed size selection. A variety of models were fitted to the gill net and hook data using the SELECT method, while the parameters of the logistic model were estimated by maximum likelihood for the longline data. The bi-normal model gave the best fits for most of the species caught with gill nets, while the logistic model adequately described hook selectivity. The results of this study show that the two static gears compete for many of the same species and have different impacts in terms of catch composition and size selectivity. This information will be useful for the improved management of these small-scale fisheries, in which many different gears compete for scarce resources.
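As a hedged illustration of the maximum-likelihood fitting mentioned for the longline data, the sketch below fits a logistic selection curve to hypothetical length-frequency counts; the length classes, counts, and parameterisation are invented for the example and do not reproduce the study's data or the SELECT implementation.

```python
# Sketch: maximum-likelihood fit of a logistic hook-selectivity curve
# p(L) = 1 / (1 + exp(-(a + b * L))) to catch counts by length class.
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: mid-points of length classes (cm), numbers of fish
# assumed available to the gear, and numbers retained by the hook.
lengths   = np.array([10, 12, 14, 16, 18, 20, 22, 24], dtype=float)
available = np.array([80, 90, 95, 100, 90, 70, 50, 30], dtype=float)
retained  = np.array([ 4, 12, 30,  55, 65, 58, 44, 27], dtype=float)

def neg_log_likelihood(params):
    a, b = params
    p = 1.0 / (1.0 + np.exp(-(a + b * lengths)))      # retention probability
    p = np.clip(p, 1e-9, 1 - 1e-9)                    # numerical safety
    # Binomial log-likelihood (constant binomial coefficients omitted).
    ll = retained * np.log(p) + (available - retained) * np.log(1.0 - p)
    return -ll.sum()

fit = minimize(neg_log_likelihood, x0=np.array([-5.0, 0.3]), method="Nelder-Mead")
a_hat, b_hat = fit.x
l50 = -a_hat / b_hat        # length at 50% retention, a common selectivity summary
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}, L50 = {l50:.1f} cm")
```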
Abstract:
Introduction: Brazil is one of the main agricultural producers in the world, ranking 1st in the production of sugarcane, coffee and oranges. It is also the 2nd largest producer of soybeans and a leader in the harvested yields of many other crops. The annual consumption of mineral fertilizers exceeds 20 million mt, 30% of which corresponds to potash fertilizers (ANDA, 2006). From this statistic it may be supposed that fertilizer application in Brazil is rather high compared with many other countries. However, even if it is assumed that only one fourth of this enormous 8.5 million km2 territory is used for agriculture, average levels of fertilizer application per hectare of arable land are not high enough for sustainable production. One of the major constraints is the relatively low natural fertility of the soils, which contain excessive Fe and Al oxides. Agriculture is also often practised on sandy soils, so heavy rainfall causes large losses of nutrients through leaching. In general, nutrient removal by crops such as sugarcane and tropical fruits is much greater than the average nutrient application via fertilization, especially in regions with a long history of agricultural production. In the recently developed areas, especially in the Cerrado (Brazilian savanna) where agriculture has expanded since 1980, soils are even poorer than in the "old" agricultural regions, and the high cost of mineral fertilizers has become a significant input factor in determining soybean, maize and cotton planting. The consumption of mineral fertilizers throughout Brazil is very uneven. According to the 1995/96 Agricultural Census, in only eight of the 26 Brazilian states were 50 per cent or more of the farms treated "systematically" with mineral fertilizers; in many states it was less than 25 per cent, and in five states even less than 12 per cent (Brazilian Institute for Geography and Statistics; Censo Agropecuário 1995/96, Instituto Brasileiro de Geografia e Estatística; IBGE, www.ibge.gov.br). The geographical distribution pattern of mineral fertilizer application may be considered an important field of research. Understanding geographical disparities in fertilization level requires a complex approach. This includes evaluation of the availability of nutrients in the soil (and related soil properties, e.g. CEC and texture), the input of nutrients through fertilizer application, and the removal of nutrients in harvested yields. When all these data are compiled, it is possible to evaluate the balance of particular nutrients for certain areas and to draw conclusions as to where agricultural practices should be optimized. This kind of research is somewhat complicated, because it relies on very different, often incomparable data sources, e.g. soil characteristics attributed to soil-type areas, in contrast to yields reported by administrative regions or farms. A key tool in this case is the Geographical Information System (GIS), which enables the attribution of data from different fields to the same territorial units and makes it possible to integrate these data in an "input-output" model, where "input" is the natural availability of a nutrient in the soil plus fertilization, and "output" is the export of the same nutrient with the removed harvested yield.
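The "input-output" balance described at the end of the passage reduces to a small data-joining computation once all layers are attributed to the same territorial units; the sketch below does this with pandas, using invented column names, nutrient values, and region identifiers purely for illustration.

```python
# Sketch: per-region potassium (K) balance = soil supply + fertilizer input
#         - removal with the harvested yield. All figures are illustrative.
import pandas as pd

# Soil K supply attributed to territorial units (kg K/ha, hypothetical).
soil = pd.DataFrame({"region": ["A", "B", "C"], "soil_k": [40.0, 15.0, 25.0]})

# Fertilizer K applied per region (kg K/ha, hypothetical census-style data).
fertilizer = pd.DataFrame({"region": ["A", "B", "C"], "fert_k": [60.0, 10.0, 30.0]})

# Crop yield (t/ha) and K removal coefficient (kg K per t of product).
harvest = pd.DataFrame({"region": ["A", "B", "C"],
                        "yield_t_ha": [70.0, 2.5, 3.2],      # e.g. sugarcane vs grain
                        "k_removal_per_t": [1.5, 20.0, 18.0]})

# Join the layers on the common territorial unit, as a GIS attribute join would.
balance = soil.merge(fertilizer, on="region").merge(harvest, on="region")
balance["removal_k"] = balance["yield_t_ha"] * balance["k_removal_per_t"]
balance["k_balance"] = balance["soil_k"] + balance["fert_k"] - balance["removal_k"]

# Negative balances flag areas where fertilization practice should be revisited.
print(balance[["region", "k_balance"]])
```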
Abstract:
A clear-sky solar spectral model describing the irradiation flux has been tested experimentally in Heredia, Costa Rica. A description of the model and comparisons with radiation data are presented. The model computes spectral fluxes of direct, diffuse and global solar irradiation incident on a horizontal surface. Necessary inputs include latitude, altitude, and surface albedo as characteristics of the location, as well as the atmospheric characteristics: turbidity, precipitable water vapor, and total ozone content. The results show satisfactory agreement.
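For orientation only, the sketch below computes a simplified broadband clear-sky direct irradiance from a Beer-Lambert attenuation with a standard relative air-mass formula; it is a generic stand-in, not the spectral model evaluated in the paper, and the broadband optical depth value is an assumption.

```python
# Sketch: simplified broadband clear-sky direct irradiance on a horizontal
# surface (generic Beer-Lambert attenuation; NOT the paper's spectral model).
import math

SOLAR_CONSTANT = 1361.0   # W/m^2, extraterrestrial broadband irradiance

def relative_air_mass(zenith_deg: float) -> float:
    """Kasten & Young (1989) relative optical air mass."""
    z = zenith_deg
    return 1.0 / (math.cos(math.radians(z)) + 0.50572 * (96.07995 - z) ** -1.6364)

def direct_horizontal_irradiance(zenith_deg: float, optical_depth: float = 0.30) -> float:
    """Direct component on a horizontal plane for a given solar zenith angle.
    `optical_depth` lumps turbidity, water vapor and ozone into one broadband
    value (an illustrative assumption)."""
    if zenith_deg >= 90.0:
        return 0.0                                         # sun below the horizon
    m = relative_air_mass(zenith_deg)
    dni = SOLAR_CONSTANT * math.exp(-optical_depth * m)    # Beer-Lambert attenuation
    return dni * math.cos(math.radians(zenith_deg))        # project onto horizontal

for zenith in (0.0, 30.0, 60.0, 80.0):
    print(f"zenith {zenith:4.0f} deg -> {direct_horizontal_irradiance(zenith):7.1f} W/m^2")
```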
Abstract:
A High-Performance Computing (HPC) job dispatcher is a critical software component that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time required to compute them cannot exceed a threshold without affecting normal system functioning. In addition, a job dispatcher must deal with considerable uncertainty in submission times, the number of requested resources, and job durations. Heuristic-based techniques have been widely used in HPC systems; they produce solutions quickly, at the cost of (sub-)optimality. Moreover, their scheduling and resource allocation components are separate, which yields decoupled decisions that may cause performance loss. Optimization-based techniques are less used for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are used for modern applications, such as big data analytics and predictive model building, that in general employ many short jobs. However, this information is unknown at dispatching time, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions within a short time and integrating current and past information about the system. For these reasons, we propose CP-based dispatchers that are better suited to HPC systems running modern applications: they generate on-line dispatching decisions within an acceptable time and make effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
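To make the CP formulation concrete, the following sketch models a tiny dispatching instance with Google OR-Tools CP-SAT, using interval variables and a cumulative resource constraint, and solves it under a short time limit as an on-line setting would require; the job data, capacity, and objective (minimizing total completion time) are illustrative assumptions, not the dispatchers proposed in the thesis.

```python
# Sketch: a tiny CP model for dispatching jobs onto a shared pool of cores,
# solved with OR-Tools CP-SAT under a short time limit. Data are illustrative.
from ortools.sat.python import cp_model

# (duration, requested cores) for jobs waiting in the queue; system capacity.
jobs = [(3, 4), (2, 2), (5, 6), (1, 3), (2, 4)]
capacity = 8
horizon = sum(d for d, _ in jobs)            # trivial upper bound on the makespan

model = cp_model.CpModel()
starts, ends, intervals, demands = [], [], [], []
for i, (duration, cores) in enumerate(jobs):
    s = model.NewIntVar(0, horizon, f"start_{i}")
    e = model.NewIntVar(0, horizon, f"end_{i}")
    iv = model.NewIntervalVar(s, duration, e, f"job_{i}")
    starts.append(s); ends.append(e); intervals.append(iv); demands.append(cores)

# Jobs running at the same time may not exceed the number of available cores.
model.AddCumulative(intervals, demands, capacity)

# A QoS-oriented objective: minimize the sum of completion times (favours short jobs).
model.Minimize(sum(ends))

solver = cp_model.CpSolver()
solver.parameters.max_time_in_seconds = 1.0   # on-line setting: answer quickly
status = solver.Solve(model)

if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for i in range(len(jobs)):
        print(f"job {i}: start={solver.Value(starts[i])}, end={solver.Value(ends[i])}")
```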
Abstract:
The aims of this work were to distinguish the components of NMR signals from hydrated materials and to monitor their evolution during the first two days of hydration after the addition of water to the powders, and to implement the 3 Tau Model in a MATLAB script, called 3TM, provided with a Graphical User Interface (GUI), so that the 3 Tau Model can be applied easily to NMRD profiles. The 3 Tau Model, developed a few years ago, is used to interpret the dispersion (NMRD profiles, i.e. the dependence on the Larmor frequency) of the longitudinal relaxation times of liquids confined in porous media. The model describes the molecular dynamics of confined molecules by introducing three characteristic correlation times and additional outputs.
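As a rough illustration of fitting a relaxation dispersion curve to an NMRD profile, the sketch below fits a generic BPP-type two-Lorentzian model for R1(ω) with scipy; the model form, data, and parameters are placeholders and do not reproduce the 3 Tau Model implemented in the 3TM script.

```python
# Sketch: fit a generic Lorentzian dispersion model
#   R1(w) = A * [ tau/(1 + (w*tau)^2) + 4*tau/(1 + (2*w*tau)^2) ] + offset
# to a longitudinal relaxation-rate profile versus Larmor frequency.
# This is a BPP-style placeholder, not the 3 Tau Model itself.
import numpy as np
from scipy.optimize import curve_fit

def r1_dispersion(freq_hz, amplitude, tau_s, offset):
    w = 2.0 * np.pi * freq_hz                       # angular Larmor frequency
    return amplitude * (tau_s / (1.0 + (w * tau_s) ** 2)
                        + 4.0 * tau_s / (1.0 + (2.0 * w * tau_s) ** 2)) + offset

# Synthetic NMRD profile (frequencies in Hz, R1 in 1/s) with noise.
freqs = np.logspace(4, 7, 30)                       # 10 kHz .. 10 MHz
true = r1_dispersion(freqs, amplitude=5e6, tau_s=1e-6, offset=0.5)
rng = np.random.default_rng(0)
observed = true + rng.normal(scale=0.02 * true)

params, _ = curve_fit(r1_dispersion, freqs, observed,
                      p0=(1e6, 5e-7, 0.1), maxfev=20000)
print("amplitude, tau (s), offset:", params)
```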
Abstract:
The great challenges of today put great pressure on the food chain to provide safe and nutritious food that meets regulations and consumer health standards. In this context, Risk Analysis is used to produce an estimate of the risks to human health and to identify and implement effective risk-control measures. The aims of this work were 1) to describe how quantitative risk assessment (QRA) is used to evaluate the risk to consumer health, 2) to address the methodology for obtaining models to apply in quantitative microbial risk assessment (QMRA), and 3) to evaluate solutions to mitigate the risk. The application of a QCRA to the Italian milk industry enabled the assessment of Aflatoxin M1 exposure and its impact on different population categories, and the comparison of risk-mitigation strategies. The results highlighted the most sensitive population categories and showed how more stringent sampling plans reduced risk. The application of a QMRA to Spanish fresh cheeses showed how contamination of this product with Listeria monocytogenes may generate a risk for consumers. Two risk-mitigation actions were evaluated, i.e. reducing the shelf life and the domestic refrigerator temperature, both of which proved effective in reducing the risk of listeriosis. A description of the most commonly applied protocols for generating data for predictive model development was provided, to increase transparency and reproducibility and to provide the means for better QMRA. The development of a linear regression model describing the fate of Salmonella spp. in Italian salami during the production process and HPP was described. Alkaline electrolyzed water was evaluated for its potential to reduce microbial loads on working surfaces, and the results showed its effectiveness. This work showed the relevance of QRA, predictive microbiology, and new technologies for ensuring food safety in a more integrated way. Filling data gaps, developing better models and including new risk-mitigation strategies may lead to improvements in the presented QRAs.
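To show the general shape of a quantitative exposure assessment like the Aflatoxin M1 example, here is a minimal Monte Carlo sketch in which contamination, consumption, and body weight distributions are sampled and compared against a reference dose; every distribution, parameter value, and threshold is a made-up placeholder, not data from the study.

```python
# Sketch: Monte Carlo exposure assessment for a contaminant in milk.
# exposure (ng/kg bw/day) = concentration (ng/L) * consumption (L/day) / body weight (kg)
# All distributions and the reference threshold are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

concentration = rng.lognormal(mean=np.log(20.0), sigma=0.8, size=n)    # ng/L
consumption   = rng.lognormal(mean=np.log(0.25), sigma=0.5, size=n)    # L/day
body_weight   = rng.normal(loc=70.0, scale=12.0, size=n).clip(min=30)  # kg, adults

exposure = concentration * consumption / body_weight                   # ng/kg bw/day

reference_dose = 0.2   # hypothetical reference dose, ng/kg bw/day
fraction_exceeding = (exposure > reference_dose).mean()

print(f"median exposure: {np.median(exposure):.3f} ng/kg bw/day")
print(f"95th percentile: {np.percentile(exposure, 95):.3f} ng/kg bw/day")
print(f"fraction of simulations above the reference dose: {fraction_exceeding:.1%}")
```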
Abstract:
Making digital 3D models and replicas of cultural heritage assets currently plays an important role in preservation and provides a highly detailed source for future research and intervention. This dissertation assesses different methods for digitally surveying and making 3D replicas of cultural heritage assets at different scales of size. The methodologies vary in devices, software, workflow, and the amount of skill required. The three phases of the 3D modelling process are data acquisition, modelling, and model presentation. Each of these phases is divided into sub-sections, and there are several approaches, methods, devices, and software that may be employed; the selection should be based on the goal of the operation, the available facilities, the scale and properties of the object or structure to be modelled, and the operators' expertise and experience. The key point to remember is that the 3D modelling operation should be suitably accurate, precise, and reliable; accordingly, there are many instructions and pieces of advice on how to perform 3D modelling effectively. The various approaches within each phase are compared and evaluated in order to explain and demonstrate their differences, benefits, and drawbacks, so as to serve as a simple guide for new and/or inexperienced users.
Abstract:
Background: MicroRNAs (miR) are a class of small RNAs that regulate gene expression by inhibiting translation of protein-encoding transcripts. To evaluate the role of miR in skeletal muscle of swine, global microRNA abundance was measured at specific developmental stages including proliferating satellite cells, three stages of fetal growth, the day-old neonate, and the adult. Results: Twelve potential novel miR were detected that did not match previously reported sequences. In addition, a number of miR previously reported to be expressed in mammalian muscle were detected, showing a variety of abundance patterns through muscle development. Muscle-specific miR-206 was nearly absent in proliferating satellite cells in culture, but was the most abundant miR at the other time points evaluated. In addition, miR-1 was moderately abundant throughout the developmental stages, with the highest abundance in the adult. In contrast, miR-133 was moderately abundant in adult muscle and either not detectable or present at low abundance throughout fetal and neonatal development. Changes in the abundance of ubiquitously expressed miR were also observed. MiR-432 abundance was highest at the earliest stage of fetal development tested (60 day-old fetus) and decreased throughout development to the adult. Conversely, miR-24 and miR-27 exhibited their greatest abundance in proliferating satellite cells and the adult, while the abundance of miR-368, miR-376, and miR-423-5p was greatest in the neonate. Conclusion: These data present a complete set of transcriptome profiles with which to evaluate miR abundance at specific stages of skeletal muscle growth in swine. Identification of these miR provides an initial group of miR that may play a vital role in muscle development and growth.
Abstract:
This paper presents a technological viability study of the treatment of wastewater from an automobile industry in an anaerobic sequencing batch biofilm reactor containing immobilized biomass (AnSBBR) with a draft tube. The reactor was operated in 8-h cycles, with agitation at 400 rpm, at 30 °C, treating 2.0 L of wastewater per cycle. Initially, the efficiency and stability of the reactor were studied when supplied with nutrients and alkalinity. A removal efficiency of 88% was obtained at a volumetric loading rate (VLR) of 3.09 mg COD/L day. When the VLR was increased to 6.19 mg COD/L day the system remained stable, with efficiency reduced to 71%. In a second stage the AnSBBR was operated treating the wastewater in natura, i.e., without nutrient supplementation and only with alkalinity, while the feed strategy was varied. The first strategy consisted in feeding 2.0 L batchwise (10 min); the second in feeding 1.0 L of influent batchwise (10 min) plus an additional 1.0 L fed-batchwise (4 h), in both cases discharging 2.0 L of effluent in 10 min. The third strategy maintained 1.0 L of treated effluent in the reactor, without discharging it, while 1.0 L of influent was fed fed-batchwise (4 h) and 1.0 L of effluent was discharged in 10 min. For all implemented strategies (VLR of 1.40, 2.57 and 2.61 mg COD/L day) the system remained stable with a removal efficiency of approximately 80%. These results show that the AnSBBR offers operational flexibility, as the influent can be fed according to industry availability. In industrial processes this is a considerable advantage, as the influent may be prone to variations. Moreover, for all the investigated conditions the kinetic parameters were obtained by fitting a first-order model to the profiles of organic matter, total volatile acids and methane concentrations. Analysis of the kinetic parameters showed that the best strategy is feeding 1.0 L of influent batchwise (10 min) and 1.0 L fed-batchwise (4 h) in an 8-h cycle. (c) 2007 Elsevier B.V. All rights reserved.
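The first-order fitting mentioned at the end of the abstract can be illustrated with a short curve-fitting routine; the concentration profile, residual concentration, and rate constant below are invented numbers used only to show the procedure.

```python
# Sketch: fit a first-order decay model
#   C(t) = C_res + (C_0 - C_res) * exp(-k * t)
# to an organic-matter (COD) concentration profile measured along a batch cycle.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, c_res, k):
    return c_res + (c0 - c_res) * np.exp(-k * t)

# Hypothetical profile: time in hours, COD in mg/L, sampled over an 8-h cycle.
t_h = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
cod = np.array([620., 470., 370., 250., 190., 160., 130., 120.])

params, _ = curve_fit(first_order, t_h, cod, p0=(600.0, 100.0, 0.5))
c0_hat, c_res_hat, k_hat = params
print(f"C0 = {c0_hat:.0f} mg/L, C_res = {c_res_hat:.0f} mg/L, k = {k_hat:.2f} 1/h")
```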
Abstract:
Objective: Allergic reactions to antibiotics occur in up to 30% of patients with cystic fibrosis (CF). Repeated antibiotic exposure and immune hyper-responsiveness increase the risk of allergic reactions and may limit antibiotic choice. Desensitization may allow the successful administration of an antibiotic despite previous allergy. We aimed to determine the success of antibiotic desensitization in patients with CF in an adult CF unit over a 7-year period. Methodology: A retrospective medical record review was performed on the 19 patients who had undergone antibiotic desensitization procedures. Data collected included drug allergy and intolerance profiles, the nature of allergies, and the outcome of desensitization procedures. Desensitization procedures were performed in a ward setting according to published methods. Results: Nineteen patients (13 females) reported 62 drug allergies, with a mean of 3.3 per patient. Of the 71 desensitization procedures undergone by this group, 54 (76%) were successful. Fifteen of the 19 patients were allergic to two or more beta-lactam antibiotics. Over half of the patients were desensitized to more than one antibiotic. Nine different antibiotics were used in 31 different patient/drug combinations. A successful outcome was achieved in 18/31 (58%) combinations, with three requiring treatment for mild allergic reactions. Allergic reactions caused drug cessation in a total of 19 patient/drug combinations (three after initial successful desensitization and full courses of antibiotics). Over 50% of these reactions occurred on day 1. Desensitization failures were more common in patients with well-documented allergic reactions to a specific drug. Conclusion: This study demonstrates that multiple antibiotic allergies are common in adults with CF. Cross-reactivity between beta-lactam antibiotics may limit antibiotic choice for the treatment of pulmonary exacerbations. Antibiotic desensitization allows safe and successful treatment in the ward setting of many patients with previous allergies to an antibiotic. In many patients symptoms of allergy still occur and result in cessation of the antibiotics. Use of corticosteroids and antihistamines may improve the success rate of desensitization procedures.
Abstract:
As technology is increasingly seen as a facilitator of learning, open remote laboratories are increasingly available and in widespread use around the world. They provide some advantages over traditional hands-on labs or simulations. This paper presents the results of integrating the open remote laboratory VISIR into several courses, in various contexts and using various methodologies. These integrations, all related to higher education engineering, were designed by teachers with different perspectives to achieve a range of learning outcomes. The degree to which these VISIR-related outcomes were accomplished is discussed. The results reflect the levels of student engagement and learning and of teacher involvement. The analysis traced a connection between these two aspects, although only in relation to user profiles. VISIR is shown to always benefit the more motivated students, but this benefit can be maximized under particular conditions and characteristics.