978 results for Identification problem


Relevance: 30.00%

Abstract:

Background: Pain is defined as both a sensory and an emotional experience. Acute postoperative tooth extraction pain is assessed and treated as a physiological (sensory) pain, while chronic pain is a biopsychosocial problem. The purpose of this study was to assess whether psychological and social changes occur in the acute pain state. Methods: A biopsychosocial pain questionnaire was completed by 438 subjects (165 males, 273 females) with acute postoperative pain at 24 hours following the surgical extraction of teeth and compared with 273 subjects (78 males, 195 females) with chronic orofacial pain. Statistical analysis used k-means cluster analysis. Results: Three clusters were identified in the acute pain group: 'unaffected', 'disabled' and 'depressed, anxious and disabled'. Psychosocial effects showed 24.8 per cent feeling 'distress/suffering' and 15.1 per cent 'sad and depressed'. Females reported higher pain intensity and more distress, depression and inadequate medication for pain relief (p
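As a hedged illustration of the k-means cluster analysis described above, the sketch below groups synthetic questionnaire scores into three clusters; the feature columns and data are hypothetical stand-ins, and scikit-learn is assumed.

```python
# Minimal sketch of a k-means cluster analysis over questionnaire scores.
# The four feature columns (pain intensity, distress, depression, disability)
# and the random data are hypothetical, not the study's actual questionnaire.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
scores = rng.uniform(0, 10, size=(438, 4))   # one row per acute-pain subject

# Standardise so that no single scale dominates the Euclidean distance.
X = StandardScaler().fit_transform(scores)

# Three clusters, matching the 'unaffected', 'disabled' and
# 'depressed, anxious and disabled' groups reported in the study.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

for cluster in range(3):
    print(f"cluster {cluster}: {np.sum(labels == cluster)} subjects")
```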

Relevance: 30.00%

Abstract:

This thesis deals with the problem of Information Systems design for Corporate Management. It shows that the results of applying current approaches to Management Information Systems and Corporate Modelling fully justify a fresh look at the problem. The thesis develops an approach to design based on Cybernetic principles and theories. It looks at Management as an informational process and discusses the relevance of regulation theory to its practice. The work proceeds around the concept of change and its effects on the organization's stability and survival. The idea of looking at organizations as viable systems is discussed and a design to enhance survival capacity is developed. It takes Ashby's theory of adaptation and developments on ultra-stability as a theoretical framework and, considering conditions for learning and foresight, deduces that a design should include three basic components: a dynamic model of the organization-environment relationships; a method to spot significant changes in the value of the essential variables and in a certain set of parameters; and a controller able to conceive and change the other two elements and to make choices among alternative policies. Further consideration of the conditions for rapid adaptation in organisms composed of many parts, and of the law of Requisite Variety, determines that successful adaptive behaviour requires a certain functional organization. Beer's model of viable organizations is put in relation to Ashby's theory of adaptation and regulation. The use of the ultra-stable system as an abstract unit of analysis permits the development of a rigorous taxonomy of change; it starts by distinguishing between change within behaviour and change of behaviour, completing the classification with organizational change. It relates these changes to the logical categories of learning, connecting the topic of Information Systems design with that of organizational learning.

Relevance: 30.00%

Abstract:

The problem of separating structured information representing phenomena of differing natures is considered. A structure is assumed to be independent of the others if it can be represented in a complementary subspace. When the concomitant subspaces are well separated the problem is readily solvable by a linear technique. Otherwise, the linear approach fails to correctly discriminate the required information. Hence, a non-extensive approach is proposed. The resulting nonlinear technique is shown to be suitable for dealing with cases that cannot be tackled by the linear one.
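As a rough sketch of the linear case mentioned above (well-separated complementary subspaces), the snippet below recovers one structure from a superposition with an oblique projection; the bases A and B are illustrative only, NumPy is assumed, and the paper's non-extensive nonlinear technique is not reproduced.

```python
# Minimal sketch of the linear step: separating two components that live in
# complementary subspaces via an oblique projection. Matrices A and B below
# are illustrative bases, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
n = 20
A = rng.standard_normal((n, 3))   # basis of the structure of interest
B = rng.standard_normal((n, 4))   # basis of the complementary structure

x_a = A @ rng.standard_normal(3)  # component to recover
x_b = B @ rng.standard_normal(4)  # interfering component
x = x_a + x_b                     # observed superposition

# Oblique projector onto range(A) along range(B):
# P = A (A^T Q A)^{-1} A^T Q, where Q projects onto the complement of range(B).
Q = np.eye(n) - B @ np.linalg.pinv(B)
P = A @ np.linalg.inv(A.T @ Q @ A) @ A.T @ Q

print(np.allclose(P @ x, x_a))    # True when the subspaces are well separated
```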

Relevance: 30.00%

Abstract:

The airway epithelium is the first point of contact in the lung for inhaled material, including infectious pathogens and particulate matter, and protects against toxicity from these substances by trapping and clearance via the mucociliary escalator, the presence of a protective barrier with tight junctions and initiation of a local inflammatory response. The inflammatory response involves recruitment of phagocytic cells to neutralise and remove any invading materials and is often modelled using rodents. However, development of valid in vitro airway epithelial models is of great importance due to the restrictions on animal studies for cosmetic compound testing implicit in the 7th amendment to the European Union Cosmetics Directive. Further, rodent innate immune responses differ fundamentally from those of humans. Pulmonary endothelial cells and leukocytes are also involved in the innate response initiated during pulmonary inflammation. Co-culture models of the airways, in particular where epithelial cells are cultured at air-liquid interface with the presence of tight junctions and differentiated mucociliary cells, offer a solution to this problem. Ideally, validated models will allow for detection of early biomarkers of response to exposure and investigation into the inflammatory response during exposure. This thesis describes the approaches taken towards developing an in vitro epithelial/endothelial cell model of the human airways and the identification of biomarkers of response to exposure to xenobiotics. The model comprised normal human primary microvascular endothelial cells and the bronchial epithelial cell line BEAS-2B or normal human bronchial epithelial cells. BEAS-2B cells were chosen because, although their characterisation at air-liquid interface is limited, they are robust in culture and were therefore predicted to provide a more reliable test system. Proteomics analysis was undertaken on challenged cells to investigate biomarkers of exposure. BEAS-2B morphology was characterised at air-liquid interface and compared with normal human bronchial epithelial cells. The results indicate that BEAS-2B cells at an air-liquid interface form tight junctions, as shown by expression of the tight junction protein zonula occludens-1. To this author’s knowledge this is the first time this result has been reported. The inflammatory response of BEAS-2B (measured as secretion of the inflammatory mediators interleukin-8 and -6) air-liquid interface mono-cultures to Escherichia coli lipopolysaccharide or particulate matter (fine and ultrafine titanium dioxide) was comparable to published data for epithelial cells. Cells were also exposed to polymers of “commercial interest” which were in the nanoparticle range (and are referred to as particles hereafter). BEAS-2B mono-cultures showed an increased secretion of inflammatory mediators after challenge. Inclusion of microvascular endothelial cells resulted in protection against LPS- and particle-induced epithelial toxicity, measured as cell viability and inflammatory response, indicating the importance of co-cultures for investigations into toxicity. Two-dimensional proteomic analysis of lysates from particle-challenged cells failed to identify biomarkers of toxicity due to assay interference and experimental variability.
Separately, decreased plasma concentrations of serine protease inhibitors and of the negative acute phase proteins transthyretin, histidine-rich glycoprotein and alpha2-HS glycoprotein were identified as potential biomarkers of methyl methacrylate/ethyl methacrylate/butyl acrylate treatment in rats.

Relevance: 30.00%

Abstract:

DNA-binding proteins are crucial for various cellular processes, such as recognition of specific nucleotides, regulation of transcription, and regulation of gene expression. Developing an effective model for identifying DNA-binding proteins is an urgent research problem. Up to now, many methods have been proposed, but most of them focus on only one classifier and cannot make full use of the large number of negative samples to improve prediction performance. This study proposed a predictor called enDNA-Prot for DNA-binding protein identification by employing the ensemble learning technique. Experimental results showed that enDNA-Prot was comparable with DNA-Prot and outperformed DNAbinder and iDNA-Prot, with performance improvement in the range of 3.97-9.52% in ACC and 0.08-0.19 in MCC. Furthermore, when the benchmark dataset was expanded with negative samples, enDNA-Prot outperformed the three existing methods by 2.83-16.63% in terms of ACC and 0.02-0.16 in terms of MCC. This indicates that enDNA-Prot is an effective method for DNA-binding protein identification and that expanding the training dataset with negative samples can improve its performance. For the convenience of experimental scientists, we developed a user-friendly web server for enDNA-Prot which is freely accessible to the public. © 2014 Ruifeng Xu et al.
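A hedged sketch of the kind of ensemble classification and ACC/MCC evaluation described is given below; the synthetic features, data and random-forest learner are stand-ins and do not reproduce enDNA-Prot's actual feature encoding or ensemble construction (scikit-learn assumed).

```python
# Illustrative sketch of an ensemble predictor evaluated with ACC and MCC,
# in the spirit of enDNA-Prot; data here are synthetic stand-ins for protein
# feature vectors, with an enlarged set of negative samples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, matthews_corrcoef
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=50, weights=[0.8, 0.2],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Random forest is one common ensemble learner; enDNA-Prot's exact ensemble
# may differ.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
y_pred = clf.predict(X_te)

print(f"ACC = {accuracy_score(y_te, y_pred):.3f}")
print(f"MCC = {matthews_corrcoef(y_te, y_pred):.3f}")
```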

Relevance: 30.00%

Abstract:

ACM Computing Classification System (1998): I.2.8, I.2.10, I.5.1, J.2.

Relevance: 30.00%

Abstract:

Spamming has been a widespread problem for social networks. In recent years there has been increasing interest in the analysis of anti-spamming for microblogs, such as Twitter. In this paper we present a systematic study of spamming on the Sina Weibo platform, which is currently a dominant microblogging service provider in China. Our research objectives are to understand the specific spamming behaviors in Sina Weibo and to find approaches to identify and block spammers in Sina Weibo based on spamming behavior classifiers. To begin the analysis of spamming behaviors, we devised several effective methods to collect a large set of spammer samples, including the use of proactive honeypots and crawlers, keyword-based searching, and buying spammer samples directly from online merchants. We processed the database associated with these spammer samples and found three representative spamming behaviors: aggressive advertising, repeated duplicate reposting and aggressive following. We extract various features and compare the behaviors of spammers and legitimate users with regard to these features. It is found that spamming behaviors and normal behaviors have distinct characteristics. Based on these findings we design an automatic online spammer identification system. Tests with real data demonstrate that the system can effectively detect spamming behaviors and identify spammers in Sina Weibo.
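The sketch below illustrates, under stated assumptions, how behavioural features of the kind reported (aggressive advertising, duplicate reposting, aggressive following) could feed a spammer classifier; the feature definitions and synthetic data are hypothetical and not taken from the Sina Weibo study (scikit-learn assumed).

```python
# Hedged sketch of a spammer classifier built on behavioural features.
# Feature distributions and labels below are fabricated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 1000

# Hypothetical per-account features: fraction of posts containing ads,
# fraction of duplicate reposts, and new follows per day.
legit = np.column_stack([rng.beta(1, 20, n), rng.beta(1, 30, n), rng.gamma(2, 1, n)])
spam  = np.column_stack([rng.beta(8, 4, n), rng.beta(6, 4, n), rng.gamma(2, 20, n)])

X = np.vstack([legit, spam])
y = np.concatenate([np.zeros(n), np.ones(n)])   # 1 = spammer

clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```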

Relevance: 30.00%

Abstract:

In this paper the identification of the time-dependent blood perfusion coefficient is formulated as an inverse problem. The bio-heat conduction problem is transformed into the classical heat conduction problem. The transformed inverse problem is then solved using the method of fundamental solutions together with Tikhonov regularization. Numerical results are presented to demonstrate the accuracy and stability of the proposed meshless numerical algorithm.
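As a minimal, hedged illustration of the regularization step, the snippet below solves an ill-conditioned linear system with Tikhonov regularization; the matrix is a generic stand-in, not the actual method-of-fundamental-solutions collocation matrix for the bio-heat problem.

```python
# Minimal sketch of Tikhonov-regularized least squares on an ill-conditioned
# system, of the kind that arises when fitting fundamental-solution coefficients.
import numpy as np

rng = np.random.default_rng(0)
m, k = 100, 40
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((k, k)))
s = np.logspace(0, -10, k)              # rapidly decaying singular values
A = U[:, :k] * s @ V.T                  # ill-conditioned stand-in matrix

c_true = rng.standard_normal(k)
b = A @ c_true + 1e-6 * rng.standard_normal(m)   # noisy data

# Tikhonov: minimise ||A c - b||^2 + lam * ||c||^2
lam = 1e-8
c_reg = np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ b)

print("naive lstsq error:", np.linalg.norm(np.linalg.lstsq(A, b, rcond=None)[0] - c_true))
print("Tikhonov error:   ", np.linalg.norm(c_reg - c_true))
```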

Relevance: 30.00%

Abstract:

This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing WIP and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of the aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied with success to services. Bottlenecks in services must be defined as those processes for which the process rates and amount of work remaining are such that completing the process will not be possible without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, rather than the operational local optimum approach of turning all aircraft quickly, results in significant savings to the company. Simulated savings to the airline's annual operating costs equalled 30% of current expenses for misconnecting passengers, with a modest increase in worker utilization achieved through a more efficient heuristic of deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach to managing service factory operations similar to airline turnaround operations using the management philosophy of the Theory of Constraints.
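A small sketch of the constraint check implied by this bottleneck definition is shown below; the exact bottleneck-ratio formula used here (work remaining divided by process rate times time available) is an assumption made for illustration, as is the task data.

```python
# Illustrative sketch of a constraint check in the spirit of the dissertation's
# bottleneck definition: a turnaround process is a constraint when the work
# remaining cannot be completed at the current rate within the time available.
# The specific bottleneck-ratio formula below is an assumption for illustration.
from dataclasses import dataclass

@dataclass
class TurnProcess:
    name: str
    work_remaining: float    # e.g. bags left to load
    process_rate: float      # units of work per minute
    time_available: float    # minutes until scheduled departure

    def bottleneck_ratio(self) -> float:
        # Ratio > 1 means the process cannot finish without a higher rate,
        # i.e. it is currently a constraint on the turn.
        return self.work_remaining / (self.process_rate * self.time_available)

processes = [
    TurnProcess("baggage loading", work_remaining=120, process_rate=4.0, time_available=25),
    TurnProcess("cabin cleaning",  work_remaining=30,  process_rate=2.0, time_available=25),
    TurnProcess("fuelling",        work_remaining=15,  process_rate=1.0, time_available=25),
]

# Deploy scarce ramp workers to the process with the highest ratio first.
for p in sorted(processes, key=lambda p: p.bottleneck_ratio(), reverse=True):
    print(f"{p.name}: bottleneck ratio = {p.bottleneck_ratio():.2f}")
```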

Relevance: 30.00%

Abstract:

Physical therapy students must apply the relevant information learned in their academic and clinical experience to problem solve in treating patients. I compared the clinical cognitive competence in patient care of second-year master's students enrolled in two different curricular programs: modified problem-based (M P-B; n = 27) and subject-centered (S-C; n = 41). Main features of S-C learning include lecture and demonstration as the major teaching strategies and no exposure to patients or problem-solving learning until the sciences (knowledge) have been taught. Comparatively, main features of M P-B learning include case study in small student groups as the main teaching strategy, early and frequent exposure to patients, and knowledge and problem-solving skills learned together for each specific case. Basic and clinical orthopedic knowledge was measured with a written test with open-ended items. Problem-solving skills were measured with a written case study patient problem test yielding three subscores: assessment, problem identification, and treatment planning. Results indicated that, among the demographic and educational characteristics analyzed, there was a significant difference between groups on ethnicity, bachelor degree type, admission GPA, and current GPA, but no significant difference on gender, age, possession of a physical therapy assistant license, or GRE score. In addition, the M P-B group achieved a significantly higher adjusted mean score on the orthopedic knowledge test after controlling for GRE scores. The S-C group achieved a significantly higher adjusted mean total score and treatment management subscore on the case study test after controlling for orthopedic knowledge test scores. These findings did not support their respective research hypotheses. There was no significant difference between groups on the assessment and problem identification subscores of the case study test. The integrated M P-B approach promoted superior retention of basic and clinical science knowledge. The results on problem-solving skills were mixed. The S-C approach facilitated superior treatment planning skills, but equivalent patient assessment and problem identification skills, by emphasizing all equally and exposing the students to more patients with a wider variety of orthopedic physical therapy needs than in the M P-B approach.

Relevance: 30.00%

Abstract:

This paper presents a solution to part of the problem of making robotic or semi-robotic digging equipment less dependent on human supervision. A method is described for identifying rocks of a certain size that may affect digging efficiency or require special handling. The process involves three main steps. First, by using range and intensity data from a time-of-flight (TOF) camera, a feature descriptor is used to rank points and separate regions surrounding high-scoring points. This allows a wide range of rocks to be recognized because features can represent a whole or just part of a rock. Second, these points are filtered to extract only points thought to belong to the large object. Finally, a check is carried out to verify that the resultant point cloud actually represents a rock. Results are presented from field testing on piles of fragmented rock. Note to Practitioners: This paper presents an algorithm to identify large boulders in a pile of broken rock as a step towards an autonomous mining dig planner. In mining, piles of broken rock can contain large fragments that may need to be specially handled. To assess rock piles for excavation, we make use of a TOF camera that does not rely on external lighting to generate a point cloud of the rock pile. We then segment large boulders from its surface by using a novel feature descriptor and distinguish between real and false boulder candidates. Preliminary field experiments show promising results, with the algorithm performing nearly as well as human test subjects.
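A hedged sketch of the overall pipeline shape (score points, keep regions around high-scoring points, check candidate extent) follows; the descriptor used here, local height above the neighbourhood, is a placeholder and not the paper's novel feature descriptor (NumPy and SciPy assumed).

```python
# Hedged sketch: rank points of a TOF point cloud with a placeholder descriptor,
# keep regions around high-scoring points, and check that candidates span a
# boulder-sized extent. Thresholds and the descriptor are illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def segment_boulder_candidates(points, radius=0.3, score_threshold=0.15, min_extent=0.5):
    """points: (N, 3) array of x, y, z coordinates from the TOF camera."""
    tree = cKDTree(points[:, :2])
    # Placeholder descriptor: how far each point sticks out above its neighbourhood.
    scores = np.empty(len(points))
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p[:2], radius)
        scores[i] = p[2] - np.median(points[idx, 2])

    candidates = []
    for i in np.argsort(scores)[::-1]:          # highest-scoring points first
        if scores[i] < score_threshold:
            break
        # Keep the region surrounding the high-scoring point.
        region = points[tree.query_ball_point(points[i, :2], radius)]
        # Final check: does the candidate span a boulder-sized extent?
        extent = region[:, :2].max(axis=0) - region[:, :2].min(axis=0)
        if extent.max() >= min_extent:
            candidates.append(region)   # overlapping candidates are not merged here
    return candidates

# Toy usage: a flat pile with one raised bump standing in for a boulder.
rng = np.random.default_rng(0)
pile = np.column_stack([rng.uniform(0, 5, 5000), rng.uniform(0, 5, 5000),
                        rng.normal(0, 0.02, 5000)])
near_centre = np.linalg.norm(pile[:, :2] - 2.5, axis=1) < 0.4
pile[near_centre, 2] += 0.3
print(len(segment_boulder_candidates(pile)), "candidate regions found")
```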

Relevance: 30.00%

Abstract:

Abstract: Adverse drug reactions (ADRs) are undesirable effects caused after administration of a single dose or prolonged administration of a drug, or resulting from the combination of two or more drugs. An idiosyncratic drug reaction (IDR) is an adverse reaction that does not occur in most patients treated with a drug and does not involve the therapeutic effect of the drug. IDRs are unpredictable and often life-threatening. An idiosyncratic reaction is dependent on drug chemical characteristics or individual immunological response. IDRs are a major problem for drug development because they are usually not detected during clinical trials. In this study we focused on IDRs of Nevirapine (NVP), which is a non-nucleoside reverse transcriptase inhibitor used for the treatment of Human Immunodeficiency Virus (HIV) infections. The use of NVP is limited by a relatively high incidence of skin rash. NVP also causes a rash in female Brown Norway (BN) rats, which we use as the animal model for this study. Our hypothesis is that idiosyncratic skin reactions associated with NVP treatment are due to post-translational modifications of proteins (e.g., glutathionylation) detectable by MS. The main objective of this study was to identify the proteins that are targeted by a reactive metabolite of Nevirapine in the skin. The specific objectives derived from the general objective were as follows: 1) To implement the click chemistry approach to detect proteins modified by a reactive NVP-Alkyne (NVP-ALK) metabolite; the purpose of using NVP-ALK was to couple it with biotin using a cycloaddition click chemistry reaction. 2) To detect protein modification using Western blotting and mass spectrometry techniques, which is important for understanding the mechanism of NVP-induced toxicity. 3) To identify the proteins using the MASCOT search engine, by comparing the spectrum obtained from mass spectrometry with theoretical spectra to find a matching peptide sequence. 4) To test whether the drug or drug metabolites can cause harmful effects, such as the induction of oxidative stress in cells (via protein glutathionylation). Oxidative stress causes cell damage that mediates signals, which likely induces the immune response. The results showed that Nevirapine is metabolized to a reactive metabolite, which causes protein modification. The protein extracted from the treated BN rats matched 10% of the keratin sequence, which implies that keratin was the protein targeted by NVP-ALK.

Relevance: 30.00%

Abstract:

Terrestrial remote sensing imagery involves the acquisition of information from the Earth's surface without physical contact with the area under study. Among the remote sensing modalities, hyperspectral imaging has recently emerged as a powerful passive technology. This technology has been widely used in the fields of urban and regional planning, water resource management, environmental monitoring, food safety, counterfeit drug detection, oil spill and other types of chemical contamination detection, biological hazard prevention, and target detection for military and security purposes [2-9]. Hyperspectral sensors sample the reflected solar radiation from the Earth's surface in the portion of the spectrum extending from the visible region through the near-infrared and mid-infrared (wavelengths between 0.3 and 2.5 µm) in hundreds of narrow (of the order of 10 nm) contiguous bands [10]. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics [6]. However, this huge spectral resolution yields large amounts of data to be processed. For example, the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) [11] collects a 512 (along track) X 614 (across track) X 224 (bands) X 12 (bits) data cube in 5 s, corresponding to about 140 MB. Similar data collection rates are achieved by other spectrometers [12]. Such huge data volumes put stringent requirements on communications, storage, and processing. The problem of signal subspace identification of hyperspectral data represents a crucial first step in many hyperspectral processing algorithms such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction (DR), yielding gains in data storage and retrieval and in computational time and complexity. Additionally, DR may also improve algorithm performance, since it reduces data dimensionality without losses in the useful signal components. The computation of statistical estimates is a relevant example of the advantages of DR, since the number of samples required to obtain accurate estimates increases drastically with the dimensionality of the data (Hughes phenomenon) [13].
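As a hedged illustration of signal subspace identification and the resulting dimensionality reduction, the sketch below eigen-decomposes the band correlation matrix of a synthetic cube; real subspace estimators (e.g. HySime) also model the noise statistics, which are omitted here, and the data and energy threshold are illustrative.

```python
# Minimal sketch of signal-subspace identification by eigen-analysis of the
# band correlation matrix of a synthetic hyperspectral cube; the endmembers,
# noise level and 0.9999 energy threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
rows, cols, bands, k_true = 64, 64, 224, 8

# Synthetic cube: k_true spectral endmembers mixed at every pixel plus noise.
endmembers = rng.uniform(0, 1, size=(k_true, bands))
abundances = rng.dirichlet(np.ones(k_true), size=rows * cols)
cube = abundances @ endmembers + 1e-3 * rng.standard_normal((rows * cols, bands))

# Eigen-decomposition of the correlation matrix of the (pixels x bands) matrix.
R = cube.T @ cube / cube.shape[0]
eigvals, eigvecs = np.linalg.eigh(R)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Estimate the subspace dimension from the energy captured, then project.
k_est = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.9999) + 1)
reduced = cube @ eigvecs[:, :k_est]     # dimensionality-reduced representation

print(f"estimated subspace dimension: {k_est} (cube built from {k_true} endmembers)")
print(f"storage: {bands} bands -> {k_est} components per pixel")
```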

Relevance: 30.00%

Abstract:

Telomeres are DNA-protein complexes which cap the ends of eukaryotic linear chromosomes. In normal somatic cells telomeres shorten and become dysfunctional during ageing due to the DNA end replication problem. This leads to activation of signalling pathways that drive cellular senescence and apoptosis. However, cancer cells typically bypass this barrier to immortalisation in order to proliferate indefinitely. Therefore, enhancing our understanding of telomere dysfunction and of the pathways involved in regulating the process is essential. However, the pathways involved are highly complex and involve interaction between a wide range of biological processes. Therefore, understanding how telomere dysfunction is regulated is a challenging task and requires a systems biology approach. In this study I have developed a novel methodology for visualisation and analysis of gene lists, focusing on the network level rather than on individual or small lists of genes. Application of this methodology to an expression data set and a gene methylation data set allowed me to enhance my understanding of the biology underlying a senescence-inducing drug and the process of immortalisation, respectively. I then used the methodology to compare the effect of genetic background on induction of telomere uncapping. Telomere uncapping was induced in HCT116 WT, p21-/- and p53-/- cells using a viral vector expressing a mutant variant of hTR, the telomerase RNA template. p21-/- cells showed enhanced sensitivity to telomere uncapping. Analysis of a candidate pathway, Mismatch Repair, revealed a role for the process in the response to telomere uncapping and showed that induction of the pathway was p21-dependent. The methodology was then applied to analysis of the telomerase inhibitor GRN163L and the synergistic effects of hypoglycaemia with this drug. HCT116 cells were resistant to GRN163L treatment. However, under hypoglycaemic conditions the dose required for ablation of telomerase activity was reduced significantly and telomere shortening was enhanced. Overall, this new methodology has allowed our group and collaborators to identify new biology and improve our understanding of processes regulating telomere dysfunction.

Relevance: 30.00%

Abstract:

Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement’s structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The nondestructive FWD test drops weights on the pavement to simulate traffic loads and measures the created pavement deflection basins. Backcalculation of pavement geometry and layer properties using FWD deflections is a difficult inverse problem, and the solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as the use of numerical analysis techniques to properly simulate the geomechanical system. A widely used layered pavement analysis program, ILLI-PAVE, was employed in the analyses of flexible pavements of various types, including full-depth asphalt and conventional flexible pavements built on either lime stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, Soft Computing Based System Identifier or SOFTSYS, was developed. In SOFTSYS, ANNs were used as surrogate models to provide faster solutions than the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop SOFTSYS models. The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of some layer properties lowered SOFTSYS model performance. Still, SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results. The thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from SOFTSYS models. The differences observed in the HMA and lime stabilized soil layer thicknesses were attributed to deflection data variability from FWD tests.
The backcalculated asphalt concrete layer thickness results matched better for full-depth asphalt flexible pavements built on lime stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
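A hedged sketch of the SOFTSYS idea, an ANN surrogate of the forward pavement model inverted by a genetic-style search, follows; the toy forward function stands in for ILLI-PAVE, the two-parameter layer description is hypothetical, and scikit-learn is assumed.

```python
# Hedged sketch of the SOFTSYS idea: an ANN surrogate replaces the forward
# pavement model, and a simple genetic-style search inverts it to backcalculate
# layer parameters from measured FWD deflections. The forward function below is
# a synthetic stand-in for ILLI-PAVE, not the real mechanistic model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def forward_model(params):
    """Stand-in forward model: maps [thickness, modulus] to 6 deflections."""
    h, E = params[..., 0:1], params[..., 1:2]
    offsets = np.arange(1, 7)                  # sensor offsets (arbitrary units)
    return 1000.0 / (E * (h + offsets))        # monotone, smooth toy response

# Train the ANN surrogate on simulated (parameters -> deflection basin) pairs.
P_train = np.column_stack([rng.uniform(5, 20, 2000), rng.uniform(1, 10, 2000)])
D_train = forward_model(P_train)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(P_train, D_train)

# "Measured" deflection basin for unknown parameters we want to recover.
true_params = np.array([12.0, 4.0])
measured = forward_model(true_params)

# Genetic-style search: keep the parameter sets whose surrogate-predicted
# basins best match the measured basin, and resample around them.
pop = np.column_stack([rng.uniform(5, 20, 200), rng.uniform(1, 10, 200)])
for _ in range(30):
    err = np.linalg.norm(surrogate.predict(pop) - measured, axis=1)
    elite = pop[np.argsort(err)[:20]]
    pop = np.repeat(elite, 10, axis=0) + rng.normal(0, 0.1, (200, 2))

best = pop[np.argmin(np.linalg.norm(surrogate.predict(pop) - measured, axis=1))]
print("backcalculated [thickness, modulus]:", best, "reference:", true_params)
```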