979 results for Piecewise linear techniques
Abstract:
Ligands and receptors of the TNF superfamily are therapeutically relevant targets in a wide range of human diseases. This chapter describes assays based on ELISA, immunoprecipitation, FACS, and reporter cell lines to monitor interactions of tagged receptors and ligands in both soluble and membrane-bound forms using unified detection techniques. A reporter cell assay that is sensitive to ligand oligomerization can identify ligands with high probability of being active on endogenous receptors. Several assays are also suitable to measure the activity of agonist or antagonist antibodies, or to detect interactions with proteoglycans. Finally, self-interaction of membrane-bound receptors can be evidenced using a FRET-based assay. This panel of methods provides a large degree of flexibility to address questions related to the specificity, activation, or inhibition of TNF-TNF receptor interactions in independent assay systems, but does not substitute for further tests in physiologically relevant conditions.
Abstract:
Objective: To evaluate the safety of the performance of the traditional and protected techniques for collection of tracheal aspirate and to identify qualitative and quantitative agreement between the results of the microbiological cultures obtained with the two techniques. Method: Clinical, prospective, comparative, single-blind research. The sample was composed of 54 patients >18 years of age, undergoing invasive mechanical ventilation for a period of ≥48 hours and with suspected Ventilator-Associated Pneumonia. The two techniques were performed in the same patient, one immediately after the other, in random order, according to randomization by specialized software. Results: No significant events of oxygen desaturation, hemodynamic instability, or tracheobronchial hemorrhage occurred (p<0.05) and, although there were differences in some strains, there was qualitative and quantitative agreement between the techniques (p<0.001). Conclusion: Utilization of the protected technique provided no advantage over the traditional technique, and the execution of both techniques was safe for the patient.
Abstract:
Histoire discursive du « cinéma-vérité ». Techniques, controverses, historiographie (1960-1970) traces the history of the rise and fall of the "cinéma vérité" label in France which, between 1960, when Edgar Morin published his programmatic essay « Pour un nouveau "cinéma vérité" » in France Observateur, and 1964-65, when the notion began to lose popularity, served as the banner of a film movement that was supposed to renew the relationship between cinema and reality. Some twenty films, such as Chronique d'un été by Jean Rouch and Edgar Morin, Primary by Richard Leacock and Robert Drew, Les Inconnus de la terre and Regard sur la folie by Mario Ruspoli, Hitler, connais pas by Bertrand Blier, Le Chemin de la mauvaise route by Jean Herman, Le Joli Mai by Chris Marker, La Punition by Jean Rouch, and Pour la Suite du monde by Michel Brault and Pierre Perrault, claimed this label or were associated with it by the French press, which devoted hundreds of articles to it. Indeed, the theatrical release of these "truth films" sparked virulent controversies in France, questioning the ethics of projects in which the people filmed were expected to reveal an intimate truth in front of the camera, the artistic status of these works, and the absence of any marked political commitment by the "cinéastes-vérité" toward the issues raised by their protagonists (for example the Algerian War, French youth, international politics). The hypothesis underlying this research is that the film production claiming the "cinéma-vérité" label is characterized by a close correlation between the films and the discourse about them. On the one hand, the first half of the decade was marked by numerous encounters between the "cinéastes vérité", critics, and the makers of lightweight cameras and synchronous tape recorders, encounters that helped to accentuate and publicize the dissensions within the movement. On the other hand, a distinctive feature of many projects was the inclusion in the film of meta-discursive sequences in which participants, directors, or experts debate the success of the shoot. This work shows that the movement's success between 1960 and 1964-65 did not occur in spite of strong polemics; on the contrary, many of these features incorporated the controversy within themselves, interrogating, on a symbolic level, the abolition of the filter between the film and its spectator. If the films belonging to the "cinéma vérité" current give such a large place to confrontation, it is because "truth" is conceived as a dialectical process that emerges from a dynamic of exchange (between the directors of this current, between the protagonists, between the film and its audience). The internal and public quarrels that punctuated these few years are part of the "cinéma-vérité" apparatus and justify writing the history of this film movement through the discourses it generated within French cinephile culture.
Abstract:
Fungal symbionts commonly occur in plants, influencing host growth, physiology, and ecology (Carlile et al., 2001). However, while whole-plant growth responses to biotrophic fungi are readily demonstrated, it has been much more difficult to identify and detect the physiological mechanisms responsible. Previous work on the clonal grass Glyceria striata has revealed that the systemic fungal endophyte Epichloë glyceriae has a positive effect on clonal growth of its host (Pan & Clay, 2002; 2003). The latest study from these authors, in this issue (pp. 467-475), now suggests that increased carbon movement in hosts infected by E. glyceriae may function as one mechanism by which endophytic fungi could increase plant growth. Given the widespread distribution of both clonal plants and symbiotic fungi, this research will have implications for our understanding of the ecology and evolution of fungus-plant associations in natural communities.
Abstract:
Operational Research has proved to be a valuable management tool in today's increasingly competitive market. Through Linear Programming, a problem of maximizing results or minimizing production costs can be represented mathematically in order to assist managers in decision making. Linear Programming is a mathematical method in which the objective function and the constraints are linear, with numerous applications in management control, typically involving problems of allocating available resources subject to limitations imposed by the production process or by the market. The general objective of this work is to propose a Linear Programming model for production scheduling and the allocation of the necessary resources: optimizing a physical quantity, the objective function, subject to a set of constraints endogenous to the activities under management. The central aim is to provide a decision-support model, thereby contributing to the efficient allocation of the scarce resources available to the economic unit. The work carried out makes clear the importance of the quantitative approach as an indispensable resource in supporting the decision process.
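As an illustration of the kind of product-mix model this abstract describes, the sketch below solves a small, entirely hypothetical Linear Programming problem with scipy.optimize.linprog; the products, profits, resource coefficients, and capacities are invented for the example and are not taken from the work.

# Hypothetical product-mix LP: maximize profit subject to machine-hour
# and raw-material limits. All numbers are illustrative assumptions.
from scipy.optimize import linprog

profit = [40, 30]              # profit per unit of products A and B (assumed)
c = [-p for p in profit]       # linprog minimizes, so negate to maximize

A_ub = [[2, 1],                # machine hours consumed per unit of A and B
        [1, 3]]                # raw material consumed per unit of A and B
b_ub = [100, 90]               # available machine hours and raw material

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("optimal production plan:", res.x)
print("maximum profit:", -res.fun)

The same structure (a linear objective plus linear resource constraints) scales directly to larger scheduling and resource-allocation models of the type the abstract proposes.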
Abstract:
Due to their relatively small size and central location within the thorax, improvement in signal-to-noise ratio (SNR) is of paramount importance for in vivo coronary vessel wall imaging. Thus, with higher field strengths, coronary vessel wall imaging is likely to benefit from the expected "near linear" proportional gain in SNR. In this study, we demonstrate the feasibility of in vivo human high field (3 T) coronary vessel wall imaging using a free-breathing black blood fast gradient echo technique with respiratory navigator gating and real-time motion correction. With the broader availability of more SNR-efficient fast spin echo and spiral techniques, further improvements can be expected.
Abstract:
The mathematical representation of Brunswik's lens model has been used extensively to study human judgment and provides a unique opportunity to conduct a meta-analysis of studies that covers roughly five decades. Specifically, we analyze statistics of the lens model equation (Tucker, 1964) associated with 259 different task environments obtained from 78 papers. In short, we find on average fairly high levels of judgmental achievement and note that people can achieve similar levels of cognitive performance in both noisy and predictable environments. Although overall performance varies little between laboratory and field studies, both differ in terms of components of performance and types of environments (numbers of cues and redundancy). An analysis of learning studies reveals that the most effective form of feedback is information about the task. We also analyze empirically when bootstrapping is more likely to occur. We conclude by indicating shortcomings of the kinds of studies conducted to date, limitations in the lens model methodology, and possibilities for future research.
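For reference, the lens model equation (Tucker, 1964) analyzed in this meta-analysis is conventionally written as below; the decomposition and symbol names are the standard ones from the lens model literature rather than quantities reported in the abstract itself.

\[
r_a = G\,R_e\,R_s + C\sqrt{1 - R_e^{2}}\,\sqrt{1 - R_s^{2}}
\]

where $r_a$ is judgmental achievement (the correlation between judgments and criterion values), $R_e$ and $R_s$ are the predictabilities of the environment and of the judge (the multiple correlations of the two fitted linear models), $G$ is the correlation between the predictions of those two models, and $C$ is the correlation between their residuals.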
Abstract:
We consider the application of normal theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables may deviate from normality. The various samples to be merged can differ in the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equation models, such as LISREL, EQS, and LISCOMP, among others. An illustration with Monte Carlo data is presented.
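As background, a generic errors-in-variables regression model of the kind discussed above can be written as follows; this is only an illustrative sketch of the model class, not the paper's exact parameterization.

\[
y_i = \alpha + B\,\xi_i + \epsilon_i, \qquad x_i = \xi_i + \delta_i, \qquad i = 1, \dots, n,
\]

where the true covariates $\xi_i$ are observed only through the error-contaminated measurements $x_i$, and $\epsilon_i$ and $\delta_i$ are mutually independent error terms; the functional case treats the $\xi_i$ as fixed unknowns, while the structural case treats them as random.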
Abstract:
We introduce simple nonparametric density estimators that generalize the classical histogram and frequency polygon. The new estimators are expressed as linear combinations of density functions that are piecewise polynomials, where the coefficients are optimally chosen in order to minimize the integrated square error of the estimator. We establish the asymptotic behaviour of the proposed estimators, and study their performance in a simulation study.
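As a point of reference for the estimators described above, the classical frequency polygon that they generalize can be sketched in a few lines of Python. This is only the baseline estimator (piecewise-linear interpolation of histogram densities), not the optimally weighted piecewise-polynomial estimator proposed in the paper, and the function and variable names are our own.

import numpy as np

def frequency_polygon(data, bins=20):
    # Classical frequency polygon: linear interpolation between
    # bin-midpoint histogram densities, i.e. a piecewise linear density estimate.
    heights, edges = np.histogram(data, bins=bins, density=True)
    midpoints = 0.5 * (edges[:-1] + edges[1:])
    def estimate(x):
        # Density is taken as zero outside the range of bin midpoints.
        return np.interp(x, midpoints, heights, left=0.0, right=0.0)
    return estimate

# Minimal usage example with simulated data.
rng = np.random.default_rng(0)
sample = rng.normal(size=1000)
f_hat = frequency_polygon(sample, bins=30)
grid = np.linspace(-4, 4, 9)
print(np.round(f_hat(grid), 3))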
Abstract:
The choice network revenue management model incorporates customer purchase behavior as a function of the offered products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The optimization problem is a stochastic dynamic program and is intractable. A certainty-equivalence relaxation of the dynamic program, called the choice deterministic linear program (CDLP), is usually used to generate dynamic controls. Recently, a compact linear programming formulation of this linear program was given for the multi-segment multinomial-logit (MNL) model of customer choice with non-overlapping consideration sets. Our objective is to obtain a tighter bound than this formulation while retaining the appealing properties of a compact linear programming representation. To this end, it is natural to consider the affine relaxation of the dynamic program. We first show that the affine relaxation is NP-complete even for a single-segment MNL model. Nevertheless, by analyzing the affine relaxation we derive a new compact linear program that approximates the dynamic programming value function better than CDLP, provably between the CDLP value and the affine relaxation, and often coming close to the latter in our numerical experiments. When the segment consideration sets overlap, we show that some strong equalities called product cuts developed for the CDLP remain valid for our new formulation. Finally we perform extensive numerical comparisons on the various bounds to evaluate their performance.
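For orientation, the choice deterministic linear program (CDLP) referred to above is commonly stated as the following linear program; the notation (offer sets $S$ of the product set $N$, revenue rates $R(S)$, capacity-consumption rates $Q(S)$, capacity vector $c$, horizon length $T$) follows the standard choice network revenue management literature and is supplied here as background, not taken from the paper.

\[
\max_{t \ge 0} \; \sum_{S \subseteq N} R(S)\, t(S)
\quad \text{s.t.} \quad
\sum_{S \subseteq N} Q(S)\, t(S) \le c, \qquad
\sum_{S \subseteq N} t(S) \le T,
\]

where $t(S)$ is the amount of time offer set $S$ is made available, $R(S)$ the expected revenue per unit time when $S$ is offered, and $Q(S)$ the vector of expected resource-consumption rates. Its optimal value is an upper bound on the dynamic programming value function; the compact formulation derived in the paper is reported to lie between this CDLP bound and the tighter affine-relaxation bound.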
Abstract:
Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviate from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We will show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis, and do not require computing sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables will be presented.