924 results for "Limited dependent variable regression"


Relevance: 30.00%

Abstract:

Inflammatory breast cancer (IBC) is an extremely rare but highly aggressive form of breast cancer characterized by the rapid development of therapeutic resistance, leading to particularly poor survival. Our previous work focused on elucidating the factors that mediate therapeutic resistance in IBC and identified increased expression of the anti-apoptotic protein X-linked inhibitor of apoptosis protein (XIAP) as correlating with the development of resistance to chemotherapeutics. Although XIAP is classically thought of as an inhibitor of caspase activation, multiple studies have revealed that XIAP can also function as a signaling intermediate in numerous pathways. Based on preliminary evidence revealing high expression of XIAP in pre-treatment IBC cells, rather than only subsequent to the development of resistance, we hypothesized that XIAP could play an important signaling role in IBC pathobiology beyond its well-documented apoptosis-inhibiting function. Further, based on our finding that XIAP inhibits chemotherapeutic efficacy, we postulated that XIAP overexpression might also play a role in resistance to other forms of therapy, such as immunotherapy. Finally, we posited that targeting specific redox adaptive mechanisms, which are observed to be a significant barrier to successful treatment of IBC, could overcome therapeutic resistance and enhance the efficacy of chemo-, radio- and immunotherapies. To address these hypotheses, our objectives were: 1. to determine a role for XIAP in IBC pathobiology and to elucidate the upstream regulators and downstream effectors of XIAP; 2. to evaluate and describe a role for XIAP in the inhibition of immunotherapy; and 3. to develop and characterize novel redox modulatory strategies that target the identified mechanisms to prevent or reverse therapeutic resistance.

Using various genomic and proteomic approaches, combined with analysis of cellular viability, proliferation and growth parameters both in vitro and in vivo, we demonstrate that XIAP plays a central role in IBC pathobiology in a manner largely independent of its role as a caspase-binding protein. Modulation of XIAP expression in cells derived from patients prior to any therapeutic intervention significantly altered key aspects of IBC biology including, but not limited to: IBC-specific gene signatures; the tumorigenic capacity of tumor cells; and the metastatic phenotype of IBC, all of which are revealed to hinge functionally on XIAP-mediated NFκB activation, a robust molecular determinant of IBC. Identification of the mechanism of XIAP-mediated NFκB activation led to the characterization of a novel peptide-based antagonist, which was further used to show that increased NFκB activation was responsible for the redox adaptation previously observed in therapy-resistant IBC cells. Lastly, we describe the targeting of this XIAP-NFκB-ROS axis using a novel redox modulatory strategy both in vitro and in vivo. Together, the data presented here characterize a novel and crucial role for XIAP in both therapeutic resistance and the pathobiology of IBC; these results confirm our previous work on acquired therapeutic resistance and establish the feasibility of targeting XIAP-NFκB and the redox adaptive phenotype of IBC as a means to enhance patient survival.

Relevance: 30.00%

Abstract:

The high frequency of urinary tract infections (UTIs), some of which appear to be endogenous relapses rather than reinfections by new isolates, points to defects in the host's memory immune response. It has been known for many decades that, whereas kidney infections evoke an antibody response to the infecting bacteria, infections limited to the bladder fail to do so. We have identified the existence of a broadly immunosuppressive transcriptional program associated with the bladder, but not the kidneys, during infection of the urinary tract that is dependent on bladder mast cells. This program involves the localized secretion of IL-10 and results in the suppression of humoral immune responses in the bladder. Mast cell-mediated immune suppression suggests a role for these cells in critically balancing the need to clear infections against the imperative to prevent harmful immune reactions in the host.

Relevance: 30.00%

Abstract:

B cell abnormalities contribute to the development and progression of autoimmune disease. Traditionally, the role of B cells in autoimmune disease was thought to be predominantly limited to the production of autoantibodies. Nevertheless, in addition to autoantibody production, B cells have other functions potentially relevant to autoimmunity. Such functions include antigen presentation to and activation of T cells, expression of costimulatory molecules, and cytokine production. Recently, the ability of B cells to negatively regulate cellular immune responses and inflammation has been described, and the concept of “regulatory B cells” has emerged. A variety of cytokines produced by regulatory B cell subsets have been reported, with interleukin-10 (IL-10) being the most studied. IL-10-producing regulatory B cells predominantly localize within a rare CD1dhiCD5+ B cell subset in mice and the CD24hiCD27+ B cell subset in adult humans. This specific IL-10-producing subset of regulatory B cells has been named “B10 cells” to highlight that the regulatory function of these rare B cells is primarily mediated by IL-10, and to distinguish them from other regulatory B cell subsets that regulate immune responses through different mechanisms. B10 cells have been studied in a variety of animal models of autoimmune disease and in clinical settings of human autoimmunity. Many questions related to B10 cells remain unanswered, including their surface phenotype, their origin and development in vivo, and their role in autoimmunity.

In Chapter 3 of this dissertation, the role of the B cell receptor (BCR) in B10 cell development is highlighted. First, the BCR repertoire of mouse peritoneal cavity B10 cells is examined by single-cell sequencing; peritoneal cavity B10 cells have clonally diverse germline BCRs that are predominantly unmutated. Second, mouse B10 cells are shown to have higher frequencies of λ+ BCRs compared to non-B10 cells, which may indicate the involvement of BCR light chain editing early in the process of B10 cell development in vivo. Third, human peripheral blood B10 cells are examined and are also found to express higher frequencies of λ chains compared to non-B10 cells. Therefore, B10 cell BCRs are clonally diverse and enriched for unmutated germline sequences and λ light chains.

In Chapter 4 of this dissertation, B10 cells are examined in the healthy developing human across the entire age range of infancy, childhood and adolescence, and in a large cohort of children with autoimmunity. The study of B10 cells in the developing human documents a massive transient expansion during middle childhood, when up to 30% of blood B cells were competent to produce IL-10. The surface phenotype of pediatric B10 cells was variable and reflective of overall B cell development. B10 cells down-regulated CD4+ T cell interferon-gamma (IFN-γ) production through IL-10-dependent pathways; in vitro, IFN-γ inhibited whereas interleukin-21 (IL-21) promoted B cell IL-10 competency. Children with autoimmunity had a contracted B10 cell compartment, along with increased IFN-γ and decreased IL-21 serum levels compared to age-matched healthy controls. The decreased B10 cell frequencies and numbers in children with autoimmunity may be partially explained by the differential regulation of B10 cell development by IFN-γ and IL-21 and by alterations in serum cytokine levels. The age-related changes of the B10 cell compartment during normal human development provide new insights into immune tolerance mechanisms involved in inflammation and autoimmunity.

These studies collectively demonstrate that BCR signals are the most important early determinant of B10 cell development in vivo; that human B10 cells are not a surface-phenotype-defined developmental B cell subset but a functionally defined regulatory B cell subset that regulates CD4+ T cell IFN-γ production through IL-10-dependent pathways; and that human B10 cell development can be regulated by soluble factors in vivo, such as the cytokine milieu. The findings of these studies provide new insights into the immune tolerance mechanisms involved in human autoimmunity, and the potent effect of IL-21 on human B cell IL-10 competence in vitro opens new horizons in the development of autologous B10 cell-based therapies as a future approach to treating human autoimmune disease.

Relevance: 30.00%

Abstract:

Fitting statistical models is computationally challenging when the sample size or the dimension of the dataset is huge. An attractive approach for down-scaling the problem size is to first partition the dataset into subsets and then fit using distributed algorithms. The dataset can be partitioned either horizontally (in the sample space) or vertically (in the feature space), and the challenge lies in defining an algorithm with low communication cost, theoretical guarantees and excellent practical performance in general settings. For sample space partitioning, I propose a MEdian Selection Subset AGgregation Estimator ({\em message}) algorithm to address these issues. The algorithm applies feature selection in parallel for each subset using a regularized regression or Bayesian variable selection method, calculates the `median' feature inclusion index, estimates coefficients for the selected features in parallel for each subset, and then averages these estimates. The algorithm is simple, involves minimal communication, scales efficiently in sample size, and has theoretical guarantees. I provide extensive experiments showing excellent performance in feature selection, estimation, prediction and computation time relative to the usual competitors.
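The aggregation steps of {\em message} can be sketched as follows (a minimal illustration, not the author's implementation; the 0/1 inclusion vectors stand in for the output of whatever per-subset selector is used):

```python
from statistics import median

def message_aggregate(inclusion_vectors, coef_vectors):
    """Aggregate per-subset results in the *message* style:
    take the elementwise median of the 0/1 feature-inclusion
    indicators across subsets, then average the per-subset
    coefficient estimates for the features whose median
    indicator is 1."""
    p = len(inclusion_vectors[0])
    m = len(inclusion_vectors)
    # Median inclusion index per feature (a majority vote over subsets).
    selected = [int(median(v[j] for v in inclusion_vectors) >= 0.5)
                for j in range(p)]
    # Average coefficient estimates only for the selected features.
    averaged = [sum(c[j] for c in coef_vectors) / m if selected[j] else 0.0
                for j in range(p)]
    return selected, averaged

# Three workers vote on three features; feature 1 survives the median vote.
inc = [[1, 1, 0], [0, 1, 0], [1, 1, 1]]
coef = [[0.9, 2.1, 0.0], [0.0, 1.9, 0.0], [1.1, 2.0, 0.3]]
sel, avg = message_aggregate(inc, coef)
# sel == [1, 1, 0]; avg[1] == 2.0
```

The median vote is what keeps a feature spuriously selected in a single subset from entering the final model.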

While sample space partitioning is useful in handling datasets with a large sample size, feature space partitioning is more effective when the data dimension is high. Existing methods for partitioning features, however, are either vulnerable to high correlations or inefficient in reducing the model dimension. In this thesis, I propose a new embarrassingly parallel framework named {\em DECO} for distributed variable selection and parameter estimation. In {\em DECO}, variables are first partitioned and allocated to m distributed workers. The decorrelated subset data within each worker are then fitted via any algorithm designed for high-dimensional problems. We show that, by incorporating the decorrelation step, DECO can achieve consistent variable selection and parameter estimation on each subset with (almost) no assumptions. In addition, the convergence rate is nearly minimax optimal for both sparse and weakly sparse models and does not depend on the partition number m. Extensive numerical experiments are provided to illustrate the performance of the new framework.

For datasets with both large sample sizes and high dimensionality, I propose a new "divide-and-conquer" framework, {\em DEME} (DECO-message), leveraging both the {\em DECO} and the {\em message} algorithms. The new framework first partitions the dataset in the sample space into row cubes using {\em message} and then partitions the feature space of the cubes using {\em DECO}. This procedure is equivalent to partitioning the original data matrix into multiple small blocks, each with a feasible size that can be stored and fitted on a computer in parallel. The results are then synthesized via the {\em DECO} and {\em message} algorithms in reverse order to produce the final output. The whole framework is extremely scalable.
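The row-then-column partitioning that DEME performs can be sketched as follows (a schematic illustration of the blocking step only; the real framework runs {\em message} and {\em DECO} on the resulting blocks):

```python
def partition_blocks(matrix, n_row_blocks, n_col_blocks):
    """Split a data matrix into a grid of row x column blocks:
    rows are first divided into 'cubes' (the message step), then
    each cube's columns are divided (the DECO step)."""
    n, p = len(matrix), len(matrix[0])
    row_cut = (n + n_row_blocks - 1) // n_row_blocks   # ceiling division
    col_cut = (p + n_col_blocks - 1) // n_col_blocks
    blocks = []
    for i in range(0, n, row_cut):
        row_cube = matrix[i:i + row_cut]
        # Within each row cube, slice the feature space into column blocks.
        blocks.append([[row[j:j + col_cut] for row in row_cube]
                       for j in range(0, p, col_cut)])
    return blocks

# A 4x4 matrix split into a 2x2 grid of 2x2 blocks.
M = [[1, 2, 3, 4],
     [5, 6, 7, 8],
     [9, 10, 11, 12],
     [13, 14, 15, 16]]
grid = partition_blocks(M, 2, 2)
# grid[0][0] == [[1, 2], [5, 6]]
```

Each block is then small enough to be stored and fitted on a single worker, which is the point of the DEME construction.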

Relevance: 30.00%

Abstract:

The increasing nationwide interest in intelligent transportation systems (ITS) and the need for more efficient transportation have led to the expanding use of variable message sign (VMS) technology. VMS panels are substantially heavier than flat-panel aluminum signs and have a larger depth (the dimension parallel to the direction of traffic). The additional weight and depth can have a significant effect on the aerodynamic forces and inertial loads transmitted to the support structure. The wind-induced drag forces and the response of VMS structures are not well understood. Minimum design requirements for VMS structures are contained in the American Association of State Highway and Transportation Officials Standard Specifications for Structural Supports for Highway Signs, Luminaires, and Traffic Signals (AASHTO Specification). However, the Specification does not take into account the prismatic geometry of VMS or the complex interaction of the applied aerodynamic forces with the support structure. In view of the lack of code guidance and the limited amount of research performed so far, targeted experimentation and large-scale testing were conducted at the Florida International University (FIU) Wall of Wind (WOW) to provide reliable drag coefficients and investigate the aerodynamic instability of VMS. A comprehensive range of VMS geometries was tested in turbulence representative of the high-frequency end of the spectrum in a simulated suburban atmospheric boundary layer. The mean normal, lateral and vertical lift force coefficients, in addition to the twisting moment coefficient and eccentricity ratio, were determined from the measured data for each model. Wind tunnel testing confirmed that drag on a prismatic VMS is smaller than the value of 1.7 suggested in the current AASHTO Specification (2013). An alternative to the AASHTO code value is presented in the form of a design matrix.
Testing and analysis also indicated that vortex shedding oscillations and galloping instability could be significant for VMS with a large depth ratio attached to a structure with a low natural frequency. The effect of corner modification was investigated by testing models with chamfered and rounded corners. Results demonstrated an additional decrease in the drag coefficient but a possible Reynolds number dependency for the rounded-corner configuration.
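The practical impact of a lower drag coefficient can be illustrated with the standard quasi-static drag equation (the panel dimensions, wind speed and the lower coefficient of 1.4 below are hypothetical; the thesis reports its actual values in a design matrix):

```python
def drag_force(cd, velocity, area, rho=1.225):
    """Quasi-static wind drag, F = 0.5 * rho * V^2 * Cd * A, in newtons.
    rho defaults to standard sea-level air density in kg/m^3."""
    return 0.5 * rho * velocity**2 * cd * area

# Illustrative comparison for a hypothetical 3 m x 2 m VMS panel at 40 m/s:
# the AASHTO drag coefficient of 1.7 versus an assumed lower measured
# value of 1.4.
area = 3.0 * 2.0
f_aashto = drag_force(1.7, 40.0, area)    # ~9996 N
f_measured = drag_force(1.4, 40.0, area)  # ~8232 N
# A lower Cd directly reduces the design load on the support structure.
```

Because the drag force scales linearly with Cd, the reduction found in the wind tunnel translates one-for-one into lighter loads on the sign's support structure.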

Relevance: 30.00%

Abstract:

The purpose of this study was to assess the effect of performance feedback on athletic trainers' (ATs) perceived knowledge (PK) and likelihood to pursue continuing education (CE). The investigation was grounded in the theories of “the definition of the situation” (Thomas & Thomas, 1928) and the “illusion of knowing” (Glenberg, Wilkinson, & Epstein, 1982), which suggest that PK drives behavior. This investigation measured the degree to which knowledge gap predicted CE-seeking behavior by providing performance feedback designed to change PK. A pre-test post-test control-group design was used to measure PK and likelihood to pursue CE before and after assessing actual knowledge. ATs (n=103) were randomly sampled and assigned to two groups, with and without performance feedback. Two independent-samples t-tests were used to compare groups on the difference scores of the dependent variables. Likelihood to pursue CE was predicted by three variables using multiple linear regression: perceived knowledge, pre-test likelihood to pursue CE, and knowledge gap. There was a significant 68.4% difference between groups (t101=2.72, p=0.01, ES=0.45) in the change scores for likelihood to pursue CE attributable to the performance feedback (experimental group: 13.7% increase; control group: 4.3% increase). The strongest relationship among the dependent variables was between the pre-test and post-test measures of likelihood to pursue CE (F2,102=56.80, p<0.01, r=0.73, R2=0.53). The pre- to post-test predictive relationship was enhanced when group was included in the model. In this model [YCEpost = 0.76 XCEpre - 0.34 Xgroup + 2.24 + E], group accounted for a significant amount of unique variance in predicting CE while pre-test likelihood to pursue CE was held constant (F3,102=40.28, p<0.01, r=0.74, R2=0.55). Pre-test knowledge gap, regardless of group allocation, was a linear predictor of the likelihood to pursue CE (F1,102=10.90, p=.01, r=.31, R2=.10).
In this investigation, performance feedback significantly increased participants' likelihood to pursue CE. Pre-test knowledge gap was a significant predictor of likelihood to pursue CE regardless of whether performance feedback was provided. ATs may have self-assessed and engaged in internal feedback as a result of their test-taking experience. These findings indicate that feedback, both internal and external, may be necessary to trigger CE-seeking behavior.
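The fitted regression model reported above can be evaluated directly as a point prediction (a sketch; the 0/1 coding of the group indicator is an assumption, since the abstract does not state which group is coded 1):

```python
def predicted_ce_likelihood(ce_pre, group):
    """Evaluate the fitted model reported in the abstract:
        Y_CEpost = 0.76 * X_CEpre - 0.34 * X_group + 2.24
    (the error term E is omitted for a point prediction).
    `group` is assumed here to be a 0/1 indicator."""
    return 0.76 * ce_pre - 0.34 * group + 2.24

# Two respondents with the same pre-test score but different groups:
same_pre = 8.0
diff = predicted_ce_likelihood(same_pre, 0) - predicted_ce_likelihood(same_pre, 1)
# Holding the pre-test score constant, group membership shifts the
# prediction by exactly the group coefficient, 0.34.
```

This makes concrete the claim that group accounted for unique variance while the pre-test score was held constant.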

Relevance: 30.00%

Abstract:

Dissolution of non-aqueous phase liquids (NAPLs) or gases into groundwater is a key process, both for contamination problems originating from organic liquid sources and for dissolution trapping in geological storage of CO2. Dissolution in natural systems will typically involve both high and low NAPL saturations and a wide range of pore water flow velocities within the same source zone. To correctly predict dissolution in such complex systems, and as the NAPL saturations change over time, models must be capable of predicting dissolution under a range of saturations and flow conditions. To provide data to test and validate such models, an experiment was conducted in a two-dimensional sand tank, where the dissolution of a spatially variable, 5 × 5 cm² DNAPL tetrachloroethene source was carefully measured using x-ray attenuation techniques at a resolution of 0.2 × 0.2 cm². By continuously measuring the NAPL saturations, the temporal evolution of DNAPL mass loss by dissolution to groundwater could be measured at each pixel. Next, a general dissolution and solute transport code was written, and several published rate-limited (RL) dissolution models and a local equilibrium (LE) approach were tested against the experimental data. It was found that none of the models could adequately predict the observed dissolution pattern, particularly in the zones of higher NAPL saturation. Combining these models with a model for NAPL pool dissolution produced qualitatively better agreement with the experimental data, but the total matching error was not significantly improved. A sensitivity study of commonly used fitting parameters further showed that several combinations of these parameters could produce equally good fits to the experimental observations.
The results indicate that common empirical model formulations for RL dissolution may be inadequate in complex, variable-saturation NAPL source zones, and that further model development and testing are desirable.
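A generic first-order rate-limited dissolution model of the kind contrasted with the LE approach can be sketched as follows (a schematic single-cell model, not any of the specific published formulations evaluated in the study):

```python
def simulate_dissolution(c0, c_sat, k, dt, steps):
    """Forward-Euler integration of a generic first-order
    rate-limited dissolution model, dC/dt = k * (Cs - C):
    the aqueous concentration relaxes toward the solubility
    limit Cs at a rate set by the mass-transfer coefficient k.
    A local-equilibrium (LE) model corresponds to the limit of
    large k, where C jumps to Cs immediately."""
    c = c0
    history = [c]
    for _ in range(steps):
        c += dt * k * (c_sat - c)
        history.append(c)
    return history

# Slow mass transfer: concentration rises monotonically but stays
# below the solubility limit over the simulated window.
hist = simulate_dissolution(c0=0.0, c_sat=200.0, k=0.05, dt=1.0, steps=50)
```

In the RL formulations tested in the study, k is itself an empirical function of NAPL saturation and flow velocity, which is precisely where the fitted models struggled at high saturations.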

Relevance: 30.00%

Abstract:

The use of the Design by Analysis (DBA) route is a modern trend in international pressure vessel and piping codes in mechanical engineering. However, to apply DBA to structures under variable mechanical and thermal loads, it is necessary to ensure that the plastic collapse modes, alternate plasticity and incremental collapse (with instantaneous plastic collapse as a particular case), are precluded. The tool available to achieve this is the shakedown theory. Unfortunately, practical numerical applications of the shakedown theory result in very large nonlinear optimization problems with nonlinear constraints. Precise, robust and efficient algorithms and finite elements to solve this problem in finite dimension are a more recent achievement. However, to solve real problems at an industrial level, it is also necessary to consider more realistic material properties and to perform 3D analyses. Limited kinematic hardening is a typical property of common steels and should be considered in realistic applications. In this paper, a new finite element with internal thermodynamical variables to model kinematic hardening materials is developed and tested. This element is a mixed ten-node tetrahedron; through an appropriate change of variables, it is possible to embed it in the shakedown analysis software developed by Zouain and co-workers for elastic, ideally plastic materials and then use it to perform 3D shakedown analysis in cases with limited kinematic hardening materials.

Relevance: 30.00%

Abstract:

The use of the Design by Analysis (DBA) concept is a trend in modern pressure vessel and piping calculations. DBA's flexibility allows us to deal with unexpected configurations detected during in-service inspections. It is also important in life-extension calculations, when deviations from the standard hypotheses originally adopted in Design by Formula can occur. To apply DBA to structures under variable mechanical and thermal loads, it is necessary that alternate plasticity and incremental collapse (with instantaneous plastic collapse as a particular case) be precluded. These are two basic failure modes considered by the ASME and European standards in DBA. The shakedown theory is the tool available to achieve this goal, and to apply it, only the range of the variable loads and the material properties are needed. Precise, robust and efficient algorithms to solve the very large nonlinear optimization problems generated in numerical applications of the shakedown theory are a recent achievement. Zouain and co-workers developed one such algorithm for elastic, ideally plastic materials. However, more realistic material properties must be considered in practical applications. This paper presents an enhancement of this algorithm to deal with limited kinematic hardening, a typical property of common steels. This is done using internal thermodynamic variables. A discrete algorithm is obtained using a plane-stress mixed finite element with an internal variable. An example of a beam fixed at one end, under constant axial force and variable moment, is presented to show the importance of considering limited kinematic hardening in a shakedown analysis.

Relevance: 30.00%

Abstract:

In the design or safety assessment of mechanical structures, the use of the Design by Analysis (DBA) route is a modern trend. However, to make it possible to apply DBA to structures under variable loads, two basic failure modes considered by the ASME and European standards must be precluded: alternate plasticity and incremental collapse (with instantaneous plastic collapse as a particular case). Shakedown theory is a tool that permits us to ensure that these kinds of failure are avoided. In practical applications, however, very large nonlinear optimization problems are generated. Because of this, only in recent years has it become possible to obtain algorithms sufficiently accurate, robust and efficient to deal with this class of problems. In this paper, one of these shakedown algorithms, developed for elastic, ideally plastic structures, is enhanced to include limited kinematic hardening, a more realistic material behavior. This is done in the continuous model by using internal thermodynamic variables. A corresponding discrete model is obtained using an axisymmetric mixed finite element with an internal variable. A thick-walled sphere under variable thermal and pressure loads is used as an example to show the importance of considering limited kinematic hardening in shakedown calculations.
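The two failure modes discussed in these abstracts can be illustrated with an elementary screening check (a simplified textbook-style sketch under a uniaxial stress assumption, not the optimization-based shakedown analysis the papers develop):

```python
def shakedown_check(sigma_range, sigma_max, sigma_y, sigma_u):
    """Elementary screening for the two failure modes, for a material
    with limited kinematic hardening (yield stress sigma_y, hardening
    limit sigma_u), under a uniaxial stress assumption:
      - alternate plasticity: the elastic stress range must stay
        within twice the yield stress;
      - incremental collapse: the maximum elastic stress must stay
        within the hardening limit.
    The elastic, ideally plastic case is recovered with sigma_u == sigma_y."""
    alternate_ok = sigma_range <= 2.0 * sigma_y
    incremental_ok = sigma_max <= sigma_u
    return alternate_ok and incremental_ok

# Steel-like values in MPa: the margin sigma_u > sigma_y is what the
# limited-kinematic-hardening model adds over the ideally plastic case.
ok = shakedown_check(sigma_range=400.0, sigma_max=300.0,
                     sigma_y=250.0, sigma_u=400.0)
```

The real analyses solve a large nonlinear optimization problem over all admissible residual stress fields; the check above only conveys why the hardening limit matters.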

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

This thesis studies the cross-validation criterion for selecting small-area models. The study is limited to unit-level small-area models. The basic small-area model was introduced by Battese, Harter and Fuller in 1988. It is a linear mixed regression model with a random intercept, comprising several parameters: the fixed-effect parameter β, the random component, and the variances of the residual error. The Battese et al. model is used in a survey to predict the mean of a variable of interest y in each small area using an administrative auxiliary variable x that is known for the entire population. The estimation method uses a normal distribution to model the residual component of the model. Allowing a general residual dependence, that is, other than the normal distribution, yields a more flexible methodology. This generalization leads to a new class of exchangeable models: it lies in the modeling of the residual dependence, which can be either normal (as in the Battese et al. model) or non-normal. The objective is to determine the small-area parameters as precisely as possible, which hinges on choosing the right residual dependence for the model. The cross-validation criterion is studied to this end.
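The unit-level model can be made concrete by simulating data from it (a minimal sketch; the normal draws for both random components below are exactly the assumption the thesis generalizes away from):

```python
import random

def simulate_unit_level_model(n_domains, n_units, beta, sd_v, sd_e, seed=1):
    """Simulate data from the unit-level small-area model of
    Battese, Harter and Fuller (1988):
        y_ij = beta0 + beta1 * x_ij + v_i + e_ij,
    with a random intercept v_i shared by all units of domain i
    and a residual e_ij per unit. Returns (domain, x, y) records."""
    rng = random.Random(seed)
    beta0, beta1 = beta
    data = []
    for i in range(n_domains):
        v_i = rng.gauss(0.0, sd_v)          # domain-level random intercept
        for _ in range(n_units):
            x = rng.uniform(0.0, 10.0)      # auxiliary variable
            y = beta0 + beta1 * x + v_i + rng.gauss(0.0, sd_e)
            data.append((i, x, y))
    return data

# 5 domains x 20 units = 100 (domain, x, y) records.
data = simulate_unit_level_model(5, 20, beta=(1.0, 2.0), sd_v=0.5, sd_e=1.0)
```

Replacing the two `gauss` draws with another distribution is the kind of non-normal residual dependence whose selection the cross-validation criterion is meant to guide.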

Relevance: 30.00%

Abstract:

Data mining, a widely discussed topic, has been studied in various fields. Its potential for refining the decision-making process, revealing hidden patterns and creating valuable knowledge has won the attention of scholars and practitioners. However, few studies have attempted to combine data mining and libraries, where data generation occurs all the time. This thesis therefore aims to fill that gap. In doing so, the potential opportunities created by data mining are explored to enhance one of the most important elements of libraries: the reference service. In order to thoroughly demonstrate the feasibility and applicability of data mining, the literature is reviewed to establish a critical understanding of data mining in libraries and to ascertain the current status of library reference service. The result of the literature review indicates that free online data resources, other than data generated on social media, are rarely considered in current library data mining efforts; this motivates the present study to utilize free online resources. Furthermore, a natural match between data mining and libraries is established. This natural match is explained by emphasizing the data richness of libraries and by considering data mining as a form of knowledge creation, an easy choice for libraries, and a sensible method for overcoming reference service challenges. The natural match, especially the prospect that data mining could help library reference service, lays the main theoretical foundation for the empirical work in this study. Turku Main Library was selected as the case to answer the research question of whether data mining is feasible and applicable for reference service improvement. In this case, the daily visits to Turku Main Library from 2009 to 2015 are used as the resource for data mining. In addition, the corresponding weather conditions are collected from Weather Underground, which is freely available online.
Before being analyzed, the collected dataset is cleansed and preprocessed in order to ensure the quality of the data mining. Multiple regression analysis is employed to mine the final dataset: hourly visits are the dependent variable, and the weather conditions, the Discomfort Index and the seven days of the week are the independent variables. In the end, four models, one per season, are established to predict visiting patterns in each season. Patterns are identified for the different seasons and implications are drawn from the discovered patterns. In addition, library-climate points are generated by a clustering method, which simplifies the process of librarians using weather data to forecast library visits. The data mining results are then interpreted from the perspective of improving the reference service. After this data mining work, the results of the case study were presented to librarians in order to collect professional opinions on the possibility of employing data mining to improve reference services. The opinions collected were positive, which implies that it is feasible to utilize data mining as a tool to enhance library reference service.
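The Discomfort Index used as a predictor can be illustrated with Thom's formulation (an assumption: the thesis does not specify here which variant it computed from the Weather Underground data):

```python
def discomfort_index(temp_c, rel_humidity_pct):
    """Thom's discomfort index, one common formulation of the
    'Discomfort Index' used as a regression predictor:
        DI = T - 0.55 * (1 - 0.01 * RH) * (T - 14.5),
    with temperature T in deg C and relative humidity RH in percent."""
    return temp_c - 0.55 * (1.0 - 0.01 * rel_humidity_pct) * (temp_c - 14.5)

# At 100% humidity the index equals the air temperature; drier air
# at the same temperature yields a lower (more comfortable) index.
di_humid = discomfort_index(28.0, 100.0)
di_dry = discomfort_index(28.0, 40.0)
```

Collapsing temperature and humidity into one predictor like this keeps the seasonal regression models small, which matters when only a few years of daily visit data are available per season.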