858 results for Physics - Computer-assisted instruction
Abstract:
Borrelia burgdorferi is the etiological agent of Lyme disease, the most common tick-borne disease in the United States. Although the most frequently reported symptom is arthritis, patients can also experience severe cardiac, neurologic, and dermatologic abnormalities. The identification of virulence determinants in infectious B. burgdorferi strains has been limited by their slow growth rate, poor transformability, and general lack of genetic tools. The present study demonstrates the use of transposon mutagenesis for the identification of infectivity-related factors in infectious B. burgdorferi, examines the potential role for chemotaxis in mammalian infection, and describes the development of a novel method for the analysis of recombination events at the vls antigenic variation locus. A pool of Himar1 mutants was isolated using an infectious B. burgdorferi clone and the transposon vector pMarGent. Clones exhibiting reduced infectivity in mice possessed insertions in virulence determinants putatively involved in host survival and dissemination. These results demonstrated the feasibility of extensive transposon mutagenesis studies for the identification of additional infectivity-related factors. mcp-5 mutants were chosen for further study to determine the role of chemotaxis during infection. Animal studies indicated that mcp-5 mutants exhibited a reduced infectivity potential and suggested a role for mcp-5 during the early stages of infection. An in vitro phenotype for an mcp-5 mutant was not detected. Genetic complementation of an mcp-5 mutant restored Mcp-5 expression in the complemented clone, as demonstrated by western blotting, but the organisms were not infectious in mice. We believe this result is a consequence of differences in expression between genes located on the linear chromosome and genes present on the circular plasmid used for trans-complementation.
Overall, this work implicates mcp-5 as an important determinant of mammalian infectivity. Finally, the development of a computer-assisted method for the analysis of recombination events occurring at the B. burgdorferi vls antigenic variation locus has proven highly valuable for the detailed examination of vls gene conversion. The studies described here provide evidence for the importance of chemotaxis during infection in mice and demonstrate advances in both genetic and computational approaches for the further characterization of the Lyme disease spirochete.
Abstract:
Background: Hypertension and diabetes are public health and economic concerns in the United States. The utilization of medical home concepts increases the receipt of preventive services, but does it also increase adherence to treatments? This study examined the effect of patient-centered medical home technologies, such as the electronic health record, clinical decision support systems, and web-based care management, in improving health outcomes related to hypertension and diabetes. Methods: A systematic review of the literature used a best-evidence synthesis approach to address the general question "Do patient-centered medical home technologies have an effect on diabetes and hypertension treatment?" This was followed by an evaluation of specific examples of the technologies utilized, such as computer-assisted recommendations and web-based care management provided through the patient's electronic health record. EBSCOhost, Ovid, and Google Scholar were the databases used to conduct the literature search. Results: The initial search identified over 25 studies, based on content and quality, that implemented technology interventions to improve communication between provider and patient. After further assessing the articles for risk of bias and study design, 13 randomized controlled studies were chosen. All of the studies were conducted in various primary care settings, in both private practices and hospitals, between the years 2000 and 2007. The sample sizes of the studies ranged from 42 to 2,924 participants. The mean age across the studies ranged from 56 to 71 years. The percentage of women in the studies ranged from 1 to 78 percent. Over one-third of the studies did not report the racial composition of the participants. For the seven studies that did provide information about ethnic composition, 64% of the intervention participants were White. All of the studies utilized some type of web-based or computer-based communication to manage hypertension or diabetes care.
Findings on outcomes were mixed: nine of the 13 studies showed no significant effect on the outcomes examined, and four showed a significant, positive impact on health outcomes related to hypertension or diabetes. Conclusion: Although the technologies improved patient and provider satisfaction, results for outcome measures such as blood pressure control and glucose control were inconclusive. Further research is needed with ethnically and socioeconomically diverse populations to investigate the role of patient-centered technologies in hypertension and diabetes control. Further research is also needed to investigate the effects of innovative medical home technologies that can be used by both patients and providers to increase the quality of communication concerning adherence to treatments.
Abstract:
The association between increases in cerebral glucose metabolism and the development of acidosis is largely inferential, based on reports linking hyperglycemia with poor neurological outcome, lactate accumulation, and the severity of acidosis. We measured local cerebral metabolic rate for glucose (lCMRglc) and an index of brain pH, the acid-base index (ABI), concurrently and characterized their interaction in a model of focal cerebral ischemia in rats in a double-label autoradiographic study using [14C]2-deoxyglucose and [14C]dimethyloxazolidinedione. Computer-assisted digitization and analysis permitted the simultaneous quantification of the two variables on a pixel-by-pixel basis in the same brain slices. Hemispheres ipsilateral to tamponade-induced middle cerebral artery occlusion showed areas of normal, depressed, and elevated glucose metabolic rate (as defined by an interhemispheric asymmetry index) after two hours of ischemia. Regions of normal glucose metabolic rate showed normal ABI (pH ± SD = 6.97 ± 0.09), regions of depressed lCMRglc showed severe acidosis (6.69 ± 0.14), and regions of elevated lCMRglc showed moderate acidosis (6.88 ± 0.10), all significantly different at the 0.00125 level by analysis of variance. Moderate acidosis in regions of increased lCMRglc suggests that anaerobic glycolysis causes excess protons to be generated by the uncoupling of ATP synthesis and hydrolysis.
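The pixel-by-pixel classification described above can be sketched as follows; the grids, metabolic values, and the ±20% asymmetry cutoff are hypothetical illustrations, not the study's actual data or thresholds:

```python
import numpy as np

# Hypothetical registered pixel grids of lCMRglc (umol/100 g/min) for the
# ischemic (ipsilateral) and control (contralateral) hemispheres.
ipsi = np.array([[55.0, 30.0], [80.0, 60.0]])
contra = np.array([[60.0, 62.0], [58.0, 61.0]])

# Interhemispheric asymmetry index per pixel:
# AI = (ipsi - contra) / ((ipsi + contra) / 2)
ai = (ipsi - contra) / ((ipsi + contra) / 2.0)

# Classify each pixel as depressed / normal / elevated using an
# illustrative +/-20% cutoff (an assumption, not the paper's value).
category = np.where(ai < -0.2, "depressed",
                    np.where(ai > 0.2, "elevated", "normal"))
```

The same per-pixel map could then be overlaid with the ABI image to pair each metabolic category with its local pH.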
Abstract:
Gaining valid answers to so-called sensitive questions is an age-old problem in survey research. Various techniques have been developed to guarantee anonymity and minimize the respondent's feelings of jeopardy. Two such techniques are the randomized response technique (RRT) and the unmatched count technique (UCT). In this study we evaluate the effectiveness of different implementations of the RRT (using a forced-response design) in a computer-assisted setting and also compare the use of the RRT to that of the UCT. The techniques are evaluated according to various quality criteria, such as the prevalence estimates they provide, the ease of their use, and respondent trust in the techniques. Our results indicate that the RRTs are problematic with respect to several domains, such as the limited trust they inspire and non-response, and that the RRT estimates are unreliable due to a strong false "no" bias, especially for the more sensitive questions. The UCT, however, performed well compared to the RRTs on all the evaluated measures. The UCT estimates also had more face validity than the RRT estimates. We conclude that the UCT is a promising alternative to RRT in self-administered surveys and that future research should be directed towards evaluating and improving the technique.
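For readers unfamiliar with the two techniques, the point estimators can be sketched as below; all survey counts and design probabilities are hypothetical, not the study's data:

```python
def uct_estimate(mean_long, mean_short):
    """Unmatched count technique: prevalence of the sensitive item is the
    difference between the mean item count of the group whose list INCLUDED
    the sensitive item and the group whose list did not."""
    return mean_long - mean_short

def rrt_forced_response_estimate(p_yes_observed, p_truth, p_forced_yes):
    """Forced-response RRT: a randomizer instructs respondents to answer
    truthfully with probability p_truth, to give a forced 'yes' with
    probability p_forced_yes, and a forced 'no' otherwise. Solving
    p_yes_observed = p_truth * prevalence + p_forced_yes for prevalence:"""
    return (p_yes_observed - p_forced_yes) / p_truth

uct = uct_estimate(2.35, 2.10)                    # hypothetical list means
rrt = rrt_forced_response_estimate(0.30, 2 / 3, 1 / 6)
```

A systematic false "no" bias of the kind reported would depress `p_yes_observed` and hence the RRT estimate, which is one way the unreliability shows up.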
Abstract:
Purpose: Surgical simulators are currently essential within any laparoscopic training program because they provide a low-stakes, reproducible and reliable environment in which to acquire basic skills. The purpose of this study is to determine the training learning curve based on different metrics corresponding to five tasks included in the SINERGIA laparoscopic virtual reality simulator. Methods: Thirty medical students without surgical experience participated in the study. Five tasks of SINERGIA were included: Coordination, Navigation, Navigation and touch, Accurate grasping and Coordinated pulling. Each participant was trained in SINERGIA. This training consisted of eight sessions (R1–R8) of the five mentioned tasks and was carried out on two consecutive days with four sessions per day. A statistical analysis was performed, and the results of R1, R4 and R8 were pair-wise compared with the Wilcoxon signed-rank test. Significance was set at P < 0.005. Results: In total, 84.38% of the metrics provided by SINERGIA and included in this study show significant differences when comparing R1 and R8. Metrics improve mostly in the first half of training (75.00% of metrics when R1 and R4 are compared vs. 37.50% when R4 and R8 are compared). In the tasks Coordination and Navigation and touch, all metrics improve. On the other hand, Navigation improves in only 60% of the analyzed metrics. Most learning curves show an improvement, with better results in the fulfillment of the different tasks. Conclusions: Learning curves of metrics that assess the basic psychomotor laparoscopic skills acquired in the SINERGIA virtual reality simulator show a faster learning rate during the first part of the training. Nevertheless, eight repetitions of the tasks are not enough to acquire all the psychomotor skills that can be trained in SINERGIA. Therefore, based on these results together with previous work, SINERGIA could be used as a training tool within a properly designed training program.
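The R1-versus-R8 comparison can be illustrated with a small exact Wilcoxon signed-rank computation in pure Python; the paired task-completion times below are invented for illustration and are not SINERGIA data:

```python
from itertools import product

# Hypothetical completion times (s) for ten participants in the first (R1)
# and last (R8) repetition of one task.
r1 = [95, 88, 102, 110, 99, 91, 105, 97, 93, 108]
r8 = [61, 66, 70, 64, 72, 60, 75, 68, 65, 71]

diffs = [a - b for a, b in zip(r1, r8)]

# Rank the absolute differences (no ties in this toy data).
order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
ranks = [0] * len(diffs)
for rank, i in enumerate(order, start=1):
    ranks[i] = rank

w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
w_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
stat = min(w_plus, w_minus)

# Exact two-sided p-value by enumerating all 2^n sign patterns.
n = len(diffs)
p = sum(min(sum(r for r, s in zip(ranks, bits) if s),
            sum(r for r, s in zip(ranks, bits) if not s)) <= stat
        for bits in product((0, 1), repeat=n)) / 2 ** n

significant = p < 0.005   # the study's threshold
```

With every participant faster in R8 than in R1, the statistic is 0 and the exact p-value is 2/1024 ≈ 0.002, below the study's 0.005 threshold.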
Abstract:
To propose an automated patient-specific algorithm for the creation of accurate and smooth meshes of the aortic anatomy, to be used for evaluating rupture risk factors of abdominal aortic aneurysms (AAA). Finite element (FE) analyses and simulations require meshes to be smooth and anatomically accurate, capturing both the artery wall and the intraluminal thrombus (ILT). The two main difficulties are the modeling of the arterial bifurcations and of the ILT, which has an arbitrary shape conforming to the aortic wall.
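As a rough illustration of the smoothing requirement, here is a minimal Laplacian smoothing sketch; the update rule, toy geometry and parameters are assumptions for illustration, not the paper's actual meshing pipeline:

```python
import numpy as np

def laplacian_smooth(vertices, neighbors, lam=0.5, iterations=50):
    """One common mesh-smoothing scheme: move each vertex a fraction `lam`
    toward the centroid of its neighbors. Vertices whose only listed
    neighbor is themselves stay fixed (boundary constraint)."""
    v = np.asarray(vertices, float).copy()
    for _ in range(iterations):
        centroids = np.array([v[nb].mean(axis=0) for nb in neighbors])
        v += lam * (centroids - v)
    return v

# Toy 2-D "mesh": a jagged polyline; endpoints are pinned, interior nodes
# relax toward the straight line between them.
verts = np.array([[0, 0], [1, 2], [2, -1], [3, 1], [4, 0]], float)
nbrs = [[0], [0, 2], [1, 3], [2, 4], [4]]
smoothed = laplacian_smooth(verts, nbrs)
```

Real surface meshes would use the vertex adjacency of the triangulation for `neighbors`, and typically a variant (e.g. Taubin smoothing) that limits shrinkage.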
Abstract:
The purpose of this work is twofold: first, to develop a process to automatically create parametric models of the aorta that can adapt to any possible intraoperative deformation of the vessel; and second, to provide the tools needed to perform this deformation in real time, by means of a non-rigid registration method. This dynamically deformable model will later be used in a VR-based surgery guidance system for aortic catheterization procedures, showing the vessel changes in real time.
Abstract:
Enhanced learning environments are achieving great success in cognitive skills training in minimally invasive surgery (MIS) because they provide multiple benefits, avoiding time, spatial and cost constraints. TELMA [1,2] is a new technology-enhanced learning platform that promotes collaborative and ubiquitous training of surgeons. This platform is based on four main modules: an authoring tool, a learning content and knowledge management system, an evaluation module and a professional network. TELMA has been designed and developed with a focus on the user; therefore, a user validation is necessary as the final stage of development. For this purpose, e-MIS validity [3] has been defined. This validation covers usability, contents and functionality validities for both the development and production stages of any e-Learning web platform. Using e-MIS validity, an e-Learning platform is fully validated, since the approach includes both subjective and objective metrics. The purpose of this study is to specify and apply a set of objective and subjective metrics using e-MIS validity to test the usability, contents and functionality of the TELMA environment during the development stage.
Abstract:
Laparoscopic instrument tracking systems are an essential component in image-guided interventions and offer new possibilities for improving and automating objective assessment of surgical skills. In this study we present a system design that applies a third-generation optical pose tracker (MicronTracker®) to laparoscopic practice. A technical evaluation of this design is performed in order to analyze its accuracy in computing the laparoscopic instrument tip position. Results show a stable fluctuation error over the entire analyzed workspace. The relative position errors are 1.776±1.675 mm, 1.817±1.762 mm, 1.854±1.740 mm, 2.455±2.164 mm, 2.545±2.496 mm, 2.764±2.342 mm, and 2.512±2.493 mm for distances of 50, 100, 150, 200, 250, 300, and 350 mm, respectively. The accumulated distance error increases with the measured distance. The range of instrument inclinations covered by the system is wide, from 90 down to 7.5 degrees. Overall, the system reports a low positional accuracy for the instrument tip.
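A minimal sketch of how one relative position error could be computed from tracked tip positions; the coordinates and reference distance below are hypothetical, not measurements from this evaluation:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# The tip is moved a known reference distance; the tracked displacement is
# compared against it (all numbers invented for illustration).
p0 = (0.0, 0.0, 0.0)
p1 = (49.1, 1.3, -0.8)        # tracked position after a nominal 50 mm move
reference_mm = 50.0

measured_mm = euclidean(p0, p1)
distance_error = abs(measured_mm - reference_mm)
```

Repeating this over many poses and distances, and reporting mean ± standard deviation per reference distance, yields error tables of the form quoted in the abstract.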
Abstract:
Image segmentation can be formulated as the problem of minimizing a discrete energy. This poses two problems: first, defining an energy whose minimum provides the desired segmentation and, second, once the energy is defined, finding its global minimum. The first part of this thesis addresses the second problem, and the second part, in a more applied context, the first. Minimization techniques based on graph cuts find the minimum of a discrete energy in polynomial time via min-cut/max-flow algorithms. Nevertheless, these techniques can only be applied to graph-representable energies. An important challenge is to study which energies are graph-representable and to construct graphs that represent them, which amounts to finding a gadget function with additional variables. The first part of this work studies properties of gadget functions that allow the number of additional variables to be bounded from above. Moreover, the graph-representable energies with four variables are characterized, and gadgets with two additional variables are defined for them. The second part addresses the segmentation of medical images, often the basis for computer-assisted diagnosis and for monitoring therapy. Multi-atlas segmentation is a powerful automatic segmentation technique for medical images, with three key aspects: the registration between the atlases and the target image, the atlas selection, and the label fusion method. The label fusion step can be formulated as an energy minimization problem, and in this respect two new graph-representable energies are introduced. The first, of order two, is used for the segmentation of the liver and background in abdominal computed tomography (CT) images. The second, of higher order, is used for the segmentation of the hippocampi and background in magnetic resonance (MR) brain images.
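To make the energy-minimization formulation concrete, the sketch below defines a tiny submodular (hence graph-representable) pairwise energy over four pixels and finds its global minimum by exhaustive search; a graph-cut solver would return the same labeling via min-cut/max-flow in polynomial time. All costs are illustrative assumptions:

```python
from itertools import product

# Binary pairwise energy over 4 pixels arranged in a cycle: unary data
# terms plus a Potts-style smoothness penalty on unequal neighbors.
unary = [(0.0, 2.0), (1.5, 0.5), (2.0, 0.0), (0.2, 1.8)]  # (cost of label 0, label 1)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
beta = 0.4   # Potts terms are submodular, hence graph-representable

def energy(labels):
    e = sum(unary[i][l] for i, l in enumerate(labels))
    e += sum(beta for i, j in edges if labels[i] != labels[j])
    return e

# Exhaustive search over the 2^4 labelings; feasible only for toy sizes.
best = min(product((0, 1), repeat=4), key=energy)
```

For real images with millions of pixels the exhaustive search is hopeless, which is exactly why the graph-representability question studied in the thesis matters.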
Abstract:
Background: One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. Results: We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as “which particular data was input to a particular workflow to test a particular hypothesis?”, and “which particular conclusions were drawn from a particular workflow?”. Conclusions: Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment, allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well.
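The kind of provenance query described can be sketched with a toy aggregation structure; the predicate names and resources below are assumptions for illustration only, not the RO model's actual vocabulary:

```python
# Minimal sketch of a workflow-centric Research Object: a resource that
# aggregates other resources, with annotations linking them.
research_object = {
    "aggregates": ["workflow_1", "dataset_A", "conclusions.txt"],
    "annotations": [
        {"subject": "dataset_A", "predicate": "wasInputTo",
         "object": "workflow_1"},
        {"subject": "workflow_1", "predicate": "testsHypothesis",
         "object": "metabolite variation hypothesis"},
        {"subject": "conclusions.txt", "predicate": "wasDerivedFrom",
         "object": "workflow_1"},
    ],
}

def query(ro, predicate, obj):
    """Which aggregated resources stand in the given relation to obj?"""
    return [a["subject"] for a in ro["annotations"]
            if a["predicate"] == predicate and a["object"] == obj]

# "Which particular data was input to a particular workflow?"
inputs = query(research_object, "wasInputTo", "workflow_1")
```

In practice the annotations would be RDF triples and the queries SPARQL, but the aggregate-then-query pattern is the same.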
Abstract:
Assessment of diastolic chamber properties of the right ventricle by global fitting of pressure-volume data and conformational analysis of 3D + T echocardiographic sequences
Abstract:
The frequency of explosions on buildings, whether accidental or intentional, is low, but their effects can be catastrophic. It is desirable to be able to predict, with sufficient accuracy, the consequences of these dynamic actions on civil buildings, among which frame-type reinforced concrete structures are a common typology. This doctoral thesis explores different practical options for the modeling and computer-assisted numerical analysis of reinforced concrete structures subjected to explosions. Numerical finite element models with explicit time integration are employed, demonstrating their capacity to simulate the fast-dynamic, highly nonlinear physical and structural phenomena that occur, and to predict the damage caused both by the explosion itself and by the possible progressive collapse of the structure. The work has been carried out with the commercial finite element code LS-DYNA (Hallquist, 2006), in which several types of models were developed, classified into two main types: 1) models based on continuum finite elements, in which the continuous medium is discretized directly by means of nodal displacement degrees of freedom; and 2) models based on structural finite elements (beams and shells), which include kinematic hypotheses for linear or surface elements.
These models are developed and discussed at three levels: 1) material behavior, 2) the response of structural elements such as columns, beams and slabs, and 3) the response of complete buildings or significant parts of them. Very detailed 3D continuum finite element models are developed in which mass concrete and reinforcing steel are modeled separately. Concrete is represented with the CSCM constitutive model (Murray et al., 2007), which exhibits inelastic behavior with different responses in tension and compression, hardening, cracking and compression damage, and failure. Steel is represented with a bilinear elastic-plastic model with failure. The actual geometry of the concrete is modeled with 3D continuum finite elements and each reinforcing bar with beam-type finite elements, placed at its exact position within the concrete mass. The mesh is built by superimposing the concrete continuum elements and the beam-type elements of the segregated reinforcement, which are constrained to follow the deformation of the solid at each point by means of a penalty algorithm, thus reproducing the behavior of reinforced concrete. For brevity, these are referred to as continuum FE models. With these continuum FE models, the structural response of construction elements (columns, slabs and frames) under explosive actions is analyzed. The results have also been compared with experiments on beams and slabs with various explosive charges, showing acceptable agreement and allowing the calculation parameters to be calibrated. However, such detailed models are not advisable for analyzing complete buildings, since the large number of finite elements required raises their computational cost beyond the reach of current computing resources.
In addition, structural finite element models (beams and shells) are developed which, at a much lower computational cost, reproduce the global behavior of the structure with similar accuracy. Mass concrete and reinforcing steel are again modeled separately. Concrete is represented with the EC2 concrete constitutive model (Hallquist et al., 2013), which also exhibits inelastic behavior with different responses in tension and compression, hardening, cracking and compression damage, and failure, and is used in shell-type finite elements. Steel is again represented with a bilinear elastic-plastic model with failure, using beam-type finite elements. An equivalent geometry of the concrete and reinforcement is modeled, taking into account the relative position of the steel within the concrete mass. The two meshes are joined at common nodes, producing a joint response. These are referred to as structural FE models. With the structural FE models, the same construction elements are simulated as with the continuum FE models, and by comparing their structural responses under explosion the former are calibrated, so that similar structural behavior is obtained at a reduced computational cost. Both the continuum FE models and the structural FE models are shown to be accurate also for the analysis of progressive collapse, and they can be used for the simultaneous study of blast damage and subsequent collapse. To this end, formulations are included that account for self-weight, live loads, and contact between parts of the structure. Both models are validated against a full-scale test in which a two-story, six-column module collapses after the removal of one of its columns. The computational cost of the continuum FE model for this simulation is far higher than that of the structural FE model, making it unfeasible for complete buildings, whereas the structural FE model gives a sufficiently accurate global response at an affordable cost. Finally, the structural FE models are used to analyze explosions on multi-story buildings, and two explosive-charge scenarios are simulated for a complete building at moderate computational cost.
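The explicit time integration underlying such codes can be illustrated on a single degree of freedom; the central-difference update below is the standard explicit scheme, but the mass, stiffness, and blast-pulse values are hypothetical, and a real FE model would apply the same update to every nodal degree of freedom:

```python
# Explicit central-difference integration of m*u'' + k*u = F(t) for a
# single-DOF elastic system under a short rectangular, blast-like pulse.
m, k = 100.0, 4.0e5          # mass (kg), stiffness (N/m) -- illustrative
dt = 1.0e-4                  # time step; stable since dt < 2/sqrt(k/m)
steps = 2000                 # 0.2 s of response

def load(t):
    """2 ms rectangular pulse of 50 kN (a crude blast surrogate)."""
    return 5.0e4 if t < 0.002 else 0.0

u_prev, u = 0.0, 0.0         # displacement at t - dt and t
history = []
for n in range(steps):
    t = n * dt
    a = (load(t) - k * u) / m                 # acceleration from equilibrium
    u_next = 2.0 * u - u_prev + a * dt * dt   # central-difference update
    u_prev, u = u, u_next
    history.append(u)

peak = max(abs(x) for x in history)           # ~ I/(m*omega) for a short pulse
```

Because the update is fully explicit, no system of equations is solved per step; the price is the conditional stability that forces the small time steps typical of blast simulations.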
Abstract:
Chemotaxis of Escherichia coli toward phosphotransferase system (PTS) carbohydrates requires phosphoenolpyruvate-dependent PTSs as well as the chemotaxis response regulator CheY and its kinase, CheA. Responses initiated by flash photorelease of the PTS substrates d-glucose and its nonmetabolizable analog methyl α-d-glucopyranoside were measured with 33-ms time resolution using computer-assisted motion analysis. This, together with chemotactic mutants, has allowed us to map out and characterize the PTS chemotactic signal pathway. The responses were absent in mutants lacking the general PTS enzymes EI or HPr, elevated in PTS transport mutants, retarded in mutants lacking CheZ, a catalyst of CheY autodephosphorylation, and severely reduced in mutants with impaired methyl-accepting chemotaxis protein (MCP) signaling activity. Response kinetics were comparable to those triggered by MCP attractant ligands over most of the response range, the most rapid being 11.7 ± 3.1 s−1. The response threshold was <10 nM for glucose. Responses to methyl α-d-glucopyranoside had a higher threshold, commensurate with a lower PTS affinity, but were otherwise kinetically indistinguishable. These facts provide evidence for a single pathway in which the PTS chemotactic signal is relayed rapidly to MCP–CheW–CheA signaling complexes that effect subsequent amplification and slower CheY dephosphorylation. The high sensitivity indicates that this signal is generated by transport-induced dephosphorylation of the PTS rather than by phosphoenolpyruvate consumption.
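How a rate constant like the 11.7 s−1 quoted above could be recovered from sampled response data can be sketched with a log-linear fit; the first-order decay model, the synthetic noise-free samples, and the 33-ms sampling grid are assumptions for illustration:

```python
import math

# Synthetic first-order response decay, y(t) = exp(-k*t), sampled at the
# 33 ms video frame interval with k = 11.7 1/s (the fastest rate reported).
k_true = 11.7
dt = 0.033
samples = [(i * dt, math.exp(-k_true * i * dt)) for i in range(8)]

# Ordinary least-squares fit of ln(y) vs t; the slope is -k.
n = len(samples)
sx = sum(t for t, _ in samples)
sy = sum(math.log(y) for _, y in samples)
sxx = sum(t * t for t, _ in samples)
sxy = sum(t * math.log(y) for t, y in samples)
k_fit = -(n * sxy - sx * sy) / (n * sxx - sx * sx)
```

With noisy real tracking data one would fit many cells' responses and report a mean ± SD, as the abstract does.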
Abstract:
Haptokinetic cell migration across surfaces is mediated by adhesion receptors including β1 integrins and CD44, providing adhesion to extracellular matrix (ECM) ligands such as collagen and hyaluronan (HA), respectively. Little is known, however, about how such different receptor systems synergize for cell migration through three-dimensionally (3-D) interconnected ECM ligands. In highly motile human MV3 melanoma cells, both β1 integrins and CD44 are abundantly expressed, support migration across collagen and HA, respectively, and are deposited upon migration, whereas only β1 integrins, but not CD44, redistribute to focal adhesions. In 3-D collagen lattices in the presence or absence of HA and cross-linking chondroitin sulfate, MV3 cell migration and associated functions such as polarization and matrix reorganization were blocked by anti-β1 and anti-α2 integrin mAbs, whereas mAbs blocking CD44, α3, α5, α6, or αv integrins showed no effect. With the use of highly sensitive time-lapse videomicroscopy and computer-assisted cell tracking techniques, promigratory functions of CD44 were excluded: 1) addition of HA did not increase the migratory cell population or its migration velocity, 2) blocking of the HA-binding Hermes-1 epitope did not affect migration, and 3) impaired migration after blocking or activation of β1 integrins was not restored via CD44. Because α2β1-mediated migration was neither synergized nor replaced by CD44–HA interactions, we conclude that the biophysical properties of 3-D multicomponent ECM impose more restricted molecular functions on adhesion receptors, thereby differing from haptokinetic migration across surfaces.
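Typical quantities derived from computer-assisted cell tracking, such as migration velocity and directional persistence, can be sketched as follows; the track coordinates and frame interval are hypothetical, not data from this study:

```python
import math

# Hypothetical centroid track from time-lapse videomicroscopy:
# (x, y) positions in micrometers at fixed frame intervals.
track = [(0.0, 0.0), (3.0, 4.0), (6.0, 8.0), (9.0, 12.0)]
frame_interval_min = 15.0

# Total path length: sum of step-to-step displacements.
path_length = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
elapsed_min = frame_interval_min * (len(track) - 1)
velocity = path_length / elapsed_min            # um/min

# Directional persistence: net displacement over path length
# (1.0 = perfectly straight trajectory).
persistence = math.dist(track[0], track[-1]) / path_length
```

Comparing distributions of such per-cell velocities between HA-supplemented and control lattices is how an effect of HA addition, point 1) above, would be tested.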