937 results for Functional Capacity Index
Abstract:
The Brazilian poultry industry is economically prominent: 12.3 million tonnes were produced in the country in 2013. Production at this scale generates a considerable volume of by-products, amounting to as much as 35% of the live bird. These residues are traditionally converted into products of low commercial value, such as meals. The pH-shift process is an important alternative for obtaining proteins with better functional and nutritional characteristics. Studying the process variables through scale-up is essential for applying laboratory-developed technologies and for the subsequent definition of industrial processes. The production of protein isolates would be an attractive technology for exploiting chicken-industry by-products, converting them into an excellent protein source and adding value to the resulting product. The aim of this work was to produce protein isolates at different scales using inedible by-products of the chicken industry. The solubility of the raw-material (RM) proteins was studied to define the solubilization and isoelectric-precipitation pHs. The solubility curve indicated an alkaline pH of 11.0 for the solubilization step and a pH of 5.25 for the protein precipitation step. The proteins obtained were characterized for proximate composition, acid value (AV), peroxide value (PV) and thiobarbituric acid reactive substances (TBARS), as well as for the functional properties of solubility, water-holding capacity (WHC) and oil-holding capacity (OHC), and the nutritional property of protein digestibility. Commercial viscera meals were analysed on the same parameters for comparison. A scale-up of the process was carried out and evaluated against the same responses as the laboratory-scale product.
Protein contents of 82% and 85% were obtained at laboratory scale and after scale-up, respectively, together with a 75% reduction in lipids and an 85% reduction in ash relative to the RM. The proximate composition of the meals analysed was 67-72% crude protein, 17-22% lipids and 9-15% ash. The AV was 2.2 and 3.1 meq/g for the isolates and 1.6 to 2.0 meq/g for the meals. The PV ranged from 0.003 to 0.005 meq/g for the isolates and from 0.002 to 0.049 meq/g for the meals. TBARS values were 0.081 and 0.214 mg MA/g for the isolates and 0.041 to 0.128 mg MA/g for the meals. Protein solubility of the isolates was 84% and 81% at pH 3 and 11, respectively, and 5% at pH 5, whereas for the meals it ranged from 22 to 31% across pH 3 to 11. The WHC was 3.1 to 16.5 g water/g protein for the isolates and 3.8 to 10.9 g water/g protein for the meals. The OHC was 4.2 mL oil/g protein for the isolates and 2.6 mL oil/g protein for the meals. The protein isolates showed 92% and 95% protein digestibility, compared with 84% for the commercial meals. Taken together, these indices show that it was possible to scale up the pH-shift process without losing quality in the physicochemical indices or in protein digestibility.
Abstract:
The organization, management and planning of an information unit comprise several stages and involve the processes and techniques of the librarian's field of research. This study sets out to build a proposal for restructuring the Library of the Centro de Estudos Teológicos das Assembléias de Deus na Paraíba (CETAD/PB). Specifically, it aims to: define an organization system for the collection that leads to user autonomy in information search and retrieval; indicate library-management software that meets the needs of the information unit; get to know the target audience, through a user-study instrument, in order to adapt the technological tools to be used; organize a guide to support the restructuring process; and propose measures to regulate the operation of the CETAD/PB library. The methodology follows a qualitative research approach, descriptive and exploratory in character. It adopts field research to describe the research setting, the Centro de Estudos Teológicos das Assembléias de Deus na Paraíba (CETAD/PB), as well as the research subjects, namely the institution's students. The data-collection instrument was a questionnaire. To present the data it draws on the techniques and statistical resources of quantitative research. The data analysis reveals the users' profile, confirms their dissatisfaction with the organization of the collection, and identifies which technological tools suit this profile for improving the processing and dissemination of information media as well as user services. It highlights the information professional as a manager of information units, with a role that goes beyond the traditional procedures and techniques of the profession. Keywords: Specialized Library. Library - Theology. Library Organization.
Abstract:
Pulmonary hypertension (PH) is a rare but serious condition that causes progressive right ventricular (RV) failure and death. PH may be idiopathic, associated with underlying connective-tissue disease or hypoxic lung disease, and is also increasingly being observed in the setting of heart failure with preserved ejection fraction (HFpEF). The management of PH has been revolutionised by the recent development of new disease-targeted therapies which are beneficial in pulmonary arterial hypertension (PAH), but can be potentially harmful in PH due to left heart disease, so accurate diagnosis and classification of patients is essential. These PAH therapies improve exercise capacity and pulmonary haemodynamics, but their overall effect on the right ventricle remains unclear. Current practice in the UK is to assess treatment response with the 6-minute walk test and NYHA functional class, neither of which truly reflects RV function. Cardiac magnetic resonance (CMR) imaging has been established as the gold standard for the evaluation of right ventricular structure and function, but it also allows a non-invasive and accurate study of the left heart. The aims of this thesis were to investigate the use of CMR in the diagnosis of PH, in the assessment of treatment response, and in predicting survival in idiopathic and connective-tissue disease associated PAH. In Chapter 3, a left atrial volume (LAV) threshold of 43 mL/m² measured with CMR was able to distinguish idiopathic PAH from PH due to HFpEF (sensitivity 97%, specificity 100%). In Chapter 4, disease-targeted PAH therapy resulted in significant improvements in RV and left ventricular ejection fraction (p<0.001 and p=0.0007, respectively), RV stroke volume index (p<0.0001), and left ventricular end-diastolic volume index (p=0.0015). These corresponded to observed improvements in functional class and exercise capacity, although correlation coefficients between Δ 6MWD and Δ RVEF or Δ LVEDV were low.
Finally, in Chapter 5, one-year and three-year survival was worse in CTD-PAH (75% and 53%) than in IPAH (83% and 74%), despite similar baseline clinical characteristics, lung function, pulmonary haemodynamics and treatment. Baseline right ventricular stroke volume index was an independent predictor of survival in both conditions. The presence of LV systolic dysfunction was of prognostic significance in CTD-PAH but not IPAH, and a higher LAV was observed in CTD-PAH suggesting a potential contribution from LV diastolic dysfunction in this group.
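The LAV threshold rule described in Chapter 3 amounts to a one-dimensional classifier. A minimal sketch of how such a rule and its sensitivity/specificity could be computed is below; the function names and patient values are invented for illustration and are not the thesis data.

```python
def classify_lav(lav_ml_per_m2, threshold=43.0):
    """Label a patient as PH due to HFpEF when CMR-indexed left atrial
    volume exceeds the threshold, otherwise idiopathic PAH.
    The 43 mL/m2 cut-off is the one reported in the thesis."""
    return "PH-HFpEF" if lav_ml_per_m2 > threshold else "IPAH"

def sensitivity_specificity(volumes, true_labels, threshold=43.0):
    """Sensitivity and specificity of the threshold rule, treating
    PH-HFpEF as the positive class."""
    tp = fn = tn = fp = 0
    for v, label in zip(volumes, true_labels):
        predicted = classify_lav(v, threshold)
        if label == "PH-HFpEF":
            tp += predicted == "PH-HFpEF"
            fn += predicted == "IPAH"
        else:
            tn += predicted == "IPAH"
            fp += predicted == "PH-HFpEF"
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical toy cohort of indexed LAV values (mL/m2) and diagnoses
vols = [30, 38, 41, 55, 60, 47]
labels = ["IPAH", "IPAH", "PH-HFpEF", "PH-HFpEF", "PH-HFpEF", "IPAH"]
sens, spec = sensitivity_specificity(vols, labels)
```

The same pattern generalises to sweeping the threshold to build an ROC curve, which is presumably how the reported 97%/100% operating point was chosen.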
Abstract:
Forested areas within cities host a large number of species, responsible for many ecosystem services in urban areas. The biodiversity in these areas is influenced by human disturbances such as atmospheric pollution and the urban heat island effect. To ameliorate the effects of these factors, an increase in urban green areas is often considered sufficient. However, this approach assumes that all types of green cover have the same importance for species. Our aim was to show that not all forested green areas are equally important for species, and that a multi-taxa and functional diversity approach makes it possible to value green infrastructure in urban environments. After evaluating the diversity of lichens, butterflies and other arthropods, birds and mammals in 31 Mediterranean urban forests in south-west Europe (Almada, Portugal), bird and lichen functional groups responsive to urbanization were found. A community shift (tolerant species replacing sensitive ones) along the urbanization gradient was found, and this must be considered when using these groups as indicators of the effect of urbanization. Bird and lichen functional groups were then analyzed together with the characteristics of the forests and their surroundings. Our results showed that, contrary to previous assumptions, vegetation density and, more importantly, the amount of urban area around the forest (the matrix) are more important for biodiversity than forest quantity alone. This indicated that not all types of forested green areas have the same importance for biodiversity. An index of forest functional diversity was then calculated for all sampled forests of the area. This could help decision-makers improve the management of urban green infrastructure with the goal of increasing functionality and ultimately ecosystem services in urban areas.
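The abstract does not specify how the forest functional diversity index is constructed. As an illustrative stand-in only, a Shannon-type index over functional-group abundances could be computed as follows; the function name and the choice of Shannon entropy are assumptions, not the study's actual method.

```python
import math

def shannon_functional_diversity(group_abundances):
    """Shannon-type diversity over functional-group abundances:
    higher when more groups are present and evenly represented.
    A simple stand-in for a forest functional diversity index."""
    total = sum(group_abundances)
    props = [a / total for a in group_abundances if a > 0]
    return -sum(p * math.log(p) for p in props)

# A forest with four equally abundant functional groups scores higher
# than one dominated by a single group.
even = shannon_functional_diversity([10, 10, 10, 10])
skewed = shannon_functional_diversity([70, 10, 10, 10])
```

Trait-based indices such as functional dispersion or Rao's quadratic entropy would be closer to common practice in this literature, but follow the same pattern of summarising group abundances into a single per-forest score.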
Abstract:
Organizations can be understood as the outcome of the historical needs of their environment and of social systems in their evolutionary process, and their sustainability depends on their capacity to understand their own complexity. This article proposes that providing conditions for innovation, as an expression of organizational culture, is an option that safeguards sustainability and requires a process that is neither uniform nor predictable, that is, a complex process. The reflections suggest that, to guarantee their sustainability, firms must find highly dynamic and transitory quasi-equilibria between the core functional requirements of demand and the structural capacities of supply. As a result of this reflection, it is proposed that decisions implemented on the basis of these capacities can be accepted within a wide range of efficient evolutionary strategies, on a broad spectrum running from positions close to orthodox economic approaches to current approaches to innovation.
Abstract:
In this thesis, we study the causal relationship between the functional distribution of income and economic growth. In particular, we focus on some of the aspects that might alter the effect of the profit share on growth. After a brief introduction and literature review, the empirical contributions are presented in Chapters 3, 4 and 5. Chapter 3 analyses the effect of a contemporaneous decrease in the wage share among countries that are major trade partners. A falling wage share and wage moderation are a global phenomenon that governments rarely oppose, because lower wages are associated with lower export prices and therefore have a positive effect on net exports. There is, however, a fallacy-of-composition problem: not all countries can improve their balance of payments at the same time. Studying the member countries of the North American Free Trade Agreement, we find that the effect on exports of a contemporaneous decrease in the wage share in Mexico, Canada and the United States is negative in all three countries. In other words, the competitive advantage each country gains from a reduction in its wage share (with its associated fall in export prices) is offset by the contemporaneous increase in competitiveness of the other two. Moreover, we find that NAFTA is wage-led overall: the profit share has a negative effect on aggregate demand. Chapter 4 tests whether the effect of the profit share on growth can differ between the long run and the short run. Following Blecker (2014), our hypothesis is that the growth regime is less wage-led in the short run than in the long run. The results of our empirical investigation support this hypothesis, at least for the United States over the period 1950-2014. From the short run to the long run, the effect of wages on consumption increases more than proportionally compared with the effect of profits on consumption.
Moreover, consumer debt seems to have only a short-run effect on consumption, indicating that in the long run, when debt has to be repaid, consumption depends more on the level of income and on how it is distributed. Regarding investment, the effect of capacity utilization is always larger than the effect of the profit share, and the difference between the two effects is greater in the long run than in the short run. This confirms the hypothesis that in the long run, unless demand increases, firms are unlikely to increase investment even in the presence of high profits. In addition, the rentier share of profits, which comprises dividends and interest payments, has a negative long-run effect on investment: in the long run rentiers divert firms' profits away from investment, weakening the effect of profits on investment. Finally, Chapter 5 studies the possibility of structural breaks in the relationship between the functional distribution of income and growth. We argue that, from the 1980s, financialization and the European exchange-rate agreements weakened the positive effect of the profit share on growth in Italy. The growth regime has therefore been becoming less profit-led and more wage-led. Our results confirm this hypothesis and also shed light on the concept of cooperative and conflictual regimes as defined by Bhaduri and Marglin (1990).
Abstract:
Background Many acute stroke trials have given neutral results. Sub-optimal statistical analyses may be failing to detect efficacy. Methods which take account of the ordinal nature of functional outcome data are more efficient. We compare sample size calculations for dichotomous and ordinal outcomes for use in stroke trials. Methods Data from stroke trials studying the effects of interventions known to positively or negatively alter functional outcome (Rankin Scale and Barthel Index) were assessed. Sample size was calculated using comparisons of proportions, means, medians (according to Payne), and ordinal data (according to Whitehead). The sample sizes obtained from each method were compared using Friedman two-way ANOVA. Results Fifty-five comparisons (54,173 patients) of active vs. control treatment were assessed. Estimated sample sizes differed significantly depending on the method of calculation (P<0.00001). The ordering of the methods showed that the ordinal method of Whitehead and the comparison of means produced significantly lower sample sizes than the other methods. The ordinal data method on average reduced sample size by 28% (inter-quartile range 14-53%) compared with the comparison of proportions; however, a 22% increase in sample size was seen with the ordinal method for trials assessing thrombolysis. The comparison-of-medians method of Payne gave the largest sample sizes. Conclusions Choosing an ordinal rather than a binary method of analysis allows most trials to be, on average, smaller by approximately 28% for a given statistical power. Smaller trial sample sizes may help by reducing time to completion, complexity, and financial expense. However, ordinal methods may not be optimal for interventions which both improve functional outcome
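The Whitehead ordinal sample-size calculation referred to above can be sketched alongside a standard two-proportion calculation. The category proportions, odds ratio and dichotomised effect below are illustrative only, so the resulting reduction will not match the 28% reported in the paper.

```python
import math
from statistics import NormalDist

def whitehead_ordinal_n(mean_props, odds_ratio, alpha=0.05, power=0.8):
    """Total sample size for an ordinal outcome under the proportional-odds
    model (Whitehead). mean_props: anticipated proportions in each outcome
    category, averaged over the two (equal-sized) treatment arms."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    denom = (math.log(odds_ratio) ** 2) * (1 - sum(p ** 3 for p in mean_props))
    return math.ceil(6 * (z_a + z_b) ** 2 / denom)

def binary_n(p1, p2, alpha=0.05, power=0.8):
    """Total sample size for comparing two proportions
    (simple normal approximation with pooled variance)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    per_arm = ((z_a + z_b) ** 2 * 2 * p_bar * (1 - p_bar)) / (p1 - p2) ** 2
    return math.ceil(2 * per_arm)

# Illustrative 7-category mRS distribution vs the same trial dichotomised
props = [0.10, 0.15, 0.15, 0.20, 0.15, 0.15, 0.10]
n_ordinal = whitehead_ordinal_n(props, odds_ratio=1.5)
n_binary = binary_n(0.50, 0.40)
```

With these made-up inputs the ordinal calculation needs substantially fewer patients than the dichotomous one, which is the qualitative point of the paper; the exact saving depends on the assumed distribution and effect size.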
Abstract:
Background and Purpose—An early and reliable prognosis for recovery in stroke patients is important for initiation of individual treatment and for informing patients and relatives. We recently developed and validated models for predicting survival and functional independence within 3 months after acute stroke, based on age and the National Institutes of Health Stroke Scale score assessed within 6 hours after stroke. Herein we demonstrate the applicability of our models in an independent sample of patients from controlled clinical trials. Methods—The prognostic models were used to predict survival and functional recovery in 5419 patients from the Virtual International Stroke Trials Archive (VISTA). Furthermore, we tried to improve the accuracy by adapting intercepts and estimating new model parameters. Results—The original models were able to correctly classify 70.4% (survival) and 72.9% (functional recovery) of patients. Because the prediction was slightly pessimistic for patients in the controlled trials, adapting the intercept improved the accuracy to 74.8% (survival) and 74.0% (functional recovery). Novel estimation of parameters, however, yielded no relevant further improvement. Conclusions—For acute ischemic stroke patients included in controlled trials, our easy-to-apply prognostic models based on age and National Institutes of Health Stroke Scale score correctly predicted survival and functional recovery after 3 months. Furthermore, a simple adaptation helps to adjust for a different prognosis and is recommended if a large data set is available. (Stroke. 2008;39:000-000.)
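Intercept adaptation as described (recalibrating a logistic prognostic model so that its average prediction matches the event rate of a new cohort, while the slope coefficients stay fixed) can be sketched as below. The coefficient values and the cohort are hypothetical, not those of the published model.

```python
import math

def predict(intercept, coef_age, coef_nihss, age, nihss):
    """Predicted probability from a logistic prognostic model of the
    age + NIHSS form used in the paper; coefficients here are invented."""
    z = intercept + coef_age * age + coef_nihss * nihss
    return 1.0 / (1.0 + math.exp(-z))

def adapt_intercept(intercept, coef_age, coef_nihss, cohort, observed_rate,
                    lo=-10.0, hi=10.0, tol=1e-8):
    """Shift the intercept (by bisection) until the mean predicted
    probability in the new cohort equals the observed event rate;
    the slope coefficients are left unchanged."""
    def mean_pred(shift):
        return sum(predict(intercept + shift, coef_age, coef_nihss, a, s)
                   for a, s in cohort) / len(cohort)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_pred(mid) < observed_rate:
            lo = mid   # predictions still too pessimistic: shift further up
        else:
            hi = mid
    return intercept + (lo + hi) / 2

# Hypothetical validation cohort as (age, NIHSS) pairs
cohort = [(70, 4), (55, 12), (80, 18), (62, 7), (75, 10)]
b0_adapted = adapt_intercept(-1.0, -0.03, -0.10, cohort, observed_rate=0.30)
```

This is exactly the kind of "simple adaptation" the conclusion recommends: it corrects a systematically pessimistic (or optimistic) model for a new population without re-estimating the slopes, which would require a much larger data set.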
Abstract:
Background: Most large acute stroke trials have been neutral. Functional outcome is usually analysed using a yes or no answer, e.g. death or dependency vs. independence. We assessed which statistical approaches are most efficient in analysing outcomes from stroke trials. Methods: Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included modified Rankin Scale, Barthel Index, and ‘3 questions’. Data were analysed using a variety of approaches which compare two treatment groups. The results for each statistical test for each trial were then compared. Results: Data from 55 datasets were obtained (47 trials, 54,173 patients). The test results differed substantially so that approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t-test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (chi square) (ANOVA p<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions: When analysing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches included ordinal logistic regression, t-test, and robust ranks test.
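The efficiency gain from keeping the ordinal scale can be illustrated on hypothetical mRS counts: on the same data, a rank-based test over the full scale yields a smaller p-value than a chi-square test on the dichotomised outcome. The counts below are invented, not trial data, and the Mann-Whitney normal approximation stands in for the paper's robust ranks test.

```python
import math
from statistics import NormalDist

# Hypothetical modified Rankin Scale (0-6) counts per treatment arm
control = [10, 15, 15, 20, 15, 15, 10]
treated = [15, 18, 17, 18, 13, 11, 8]

def p_dichotomised(c, t, cut=3):
    """Two-sided p-value from a 2x2 chi-square test (no continuity
    correction) after collapsing the scale at mRS < cut ('good outcome')."""
    a, b = sum(c[:cut]), sum(c[cut:])
    d, e = sum(t[:cut]), sum(t[cut:])
    n = a + b + d + e
    chi2 = n * (a * e - b * d) ** 2 / ((a + b) * (d + e) * (a + d) * (b + e))
    return 2 * (1 - NormalDist().cdf(math.sqrt(chi2)))

def p_rank(c, t):
    """Two-sided p-value from a tie-corrected normal approximation to the
    Mann-Whitney test on the full ordinal scale."""
    n1, n2 = sum(c), sum(t)
    n_total = n1 + n2
    u = 0.0        # pairs where the treated patient scores higher; ties count half
    c_below = 0
    for k in range(len(c)):
        u += t[k] * (c_below + c[k] / 2)
        c_below += c[k]
    mu = n1 * n2 / 2
    ties = sum((c[k] + t[k]) ** 3 - (c[k] + t[k]) for k in range(len(c)))
    var = n1 * n2 / 12 * ((n_total + 1) - ties / (n_total * (n_total - 1)))
    z = abs(u - mu) / math.sqrt(var)
    return 2 * (1 - NormalDist().cdf(z))
```

Because the treatment effect here is spread across the whole scale, the rank test, which sees every category shift, detects it more readily than the chi-square test, which only sees the single good/bad boundary.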
Abstract:
Gene therapy is one of the major challenges of post-genomic research. It is based on the transfer of genetic material into a cell, tissue or organ in order to cure a disease or improve the patient's clinical status. In general, gene therapy consists of the insertion of functional genes aiming to substitute, complement or inhibit defective genes. Achieving expression of foreign DNA in a population of cells requires its transfer to the target; a key issue is therefore to create systems (vectors) able to transfer and protect the DNA until it reaches the target. The disadvantages of viral vectors have encouraged efforts to develop emulsions as non-viral vectors: they are easy to produce, suitably stable, and capable of transfection. The aim of this work was to evaluate two different non-viral vectors, cationic liposomes and nanoemulsions, and the possibility of their use in gene therapy. For both systems, cationic lipids and helper lipids were used. Nanoemulsions were prepared by sonication and were composed of Captex® 355; Tween® 80; Span® 80; a cationic lipid, stearylamine (SA) or 1,2-dioleoyl-3-trimethylammoniumpropane (DOTAP); and water (Milli-Q®). These systems were characterized by average droplet size, polydispersity index (PI) and zeta potential. The stability of the systems, their DNA-compaction capacity, their cytotoxicity and that of the isolated components, and their transfection capacity were also evaluated. Liposomes were made by the film hydration method and were composed of DOTAP and 1,2-dioleoyl-sn-glycero-3-phosphoethanolamine (DOPE), containing or not rhodamine-phosphatidylethanolamine (PE-Rhodamine) and the hyaluronic acid-DOPE conjugate (HA-DOPE). These systems were characterized in the same way as the nanoemulsions. The stability of the systems and the influence of time, plasmid size, and the presence or absence of endotoxin on lipoplex formation were also analyzed.
In addition, the ocular biodistribution of PE-Rhodamine-containing liposomes was studied after intravitreal injection. The results obtained show that these systems are promising non-viral vectors for later use in gene therapy, a field that seems set to be very important in clinical practice in this century. However, from possibility to practice there is still a long way to go.
Abstract:
Background and Purpose—Most large acute stroke trials have been neutral. Functional outcome is usually analyzed using a yes or no answer, eg, death or dependency versus independence. We assessed which statistical approaches are most efficient in analyzing outcomes from stroke trials. Methods—Individual patient data from acute, rehabilitation and stroke unit trials studying the effects of interventions which alter functional outcome were assessed. Outcomes included modified Rankin Scale, Barthel Index, and “3 questions”. Data were analyzed using a variety of approaches which compare 2 treatment groups. The results for each statistical test for each trial were then compared. Results—Data from 55 datasets were obtained (47 trials, 54 173 patients). The test results differed substantially: approaches which use the ordered nature of functional outcome data (ordinal logistic regression, t test, robust ranks test, bootstrapping the difference in mean rank) were more efficient statistically than those which collapse the data into 2 groups (χ²; ANOVA, P<0.001). The findings were consistent across different types and sizes of trial and for the different measures of functional outcome. Conclusions—When analyzing functional outcome from stroke trials, statistical tests which use the original ordered data are more efficient and more likely to yield reliable results. Suitable approaches included ordinal logistic regression, t test, and robust ranks test.
Abstract:
Foreseeing functional recovery after stroke plays a crucial role in planning rehabilitation programs. Objectives: To assess differences over time in functional recovery, assessed through the Barthel Index (BI) rate of change (BIRC) between admission and discharge, in stroke patients. Methods: This is a retrospective hospital-based study of consecutive patients with acute stroke admitted to a hospital in Northeast Portugal between 2010 and 2014. BIRC was computed as the difference between the admission and discharge BI scores divided by the time in days between these assessments. General linear model analysis stratified by gender was used to determine whether BIRC increased over the time period under study. Adjusted regression coefficients and respective 95% confidence intervals (95%CI) were obtained. Results: Of the 483 patients included in this analysis, 59% (n = 285) were male. Among women, mean BIRC was 1.8 (± 1.88) units/day in 2010 and reached 3.7 (± 2.80) units/day in 2014. Among men, the mean BIRC in 2010 and in 2014 were similar: 3.2 (± 3.19) and 3.1 (± 3.31) units/day, respectively. After adjustment for age, BI at admission, and type and laterality of stroke, we observed an increase in BIRC over time among women, such that the mean BIRC in 2014 was 0.82 (95%CI: 0.48; 3.69) units higher than that observed in 2010. No such increase in BIRC over time was observed among men. Conclusions: We observed an improvement in functional recovery after stroke, but only among women. Our results suggest differences over time in clinical practice toward the rehabilitation of women after stroke.
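The BIRC described in the Methods is a simple per-day rate. A minimal sketch, with a hypothetical patient, is:

```python
def birc(bi_admission, bi_discharge, days_between):
    """Barthel Index rate of change: change in BI per day between the
    admission and discharge assessments (positive = improvement)."""
    if days_between <= 0:
        raise ValueError("days between assessments must be positive")
    return (bi_discharge - bi_admission) / days_between

# Hypothetical patient: BI 40 at admission, 75 at discharge, 10 days apart
rate = birc(40, 75, 10)
```

Normalising by length of stay is what allows the study to compare recovery across patients whose hospitalisations differ in duration.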
Abstract:
Introduction Cerebral misery perfusion represents a failure of cerebral autoregulation. It is an important differential diagnosis in post-stroke patients presenting with collapses in the presence of haemodynamically significant cerebrovascular stenosis, particularly when cortical or internal watershed infarcts are present. When this condition occurs, further investigation should be undertaken immediately. Case presentation A 50-year-old Caucasian man presented with a stroke secondary to complete occlusion of his left internal carotid artery. He went on to suffer recurrent seizures. Neuroimaging demonstrated numerous new watershed-territory cerebral infarcts. No source of arterial thromboembolism was demonstrable. Hypercapnic blood-oxygenation-level-dependent-contrast functional magnetic resonance imaging was used to measure his cerebrovascular reserve capacity. The findings were suggestive of cerebral misery perfusion. Conclusions Blood-oxygenation-level-dependent-contrast functional magnetic resonance imaging allows the inference of cerebral misery perfusion. This procedure is cheaper and more readily available than positron emission tomography imaging, which is the current gold standard diagnostic test. The most evaluated treatment for cerebral misery perfusion is extracranial-intracranial bypass. Although previous trials of this have been unfavourable, the results of new studies involving extracranial-intracranial bypass in high-risk patients identified during cerebral perfusion imaging are awaited. Cerebral misery perfusion is an important and under-recognized condition in which emerging imaging and treatment modalities present the possibility of practical and evidence-based management in the near future. Physicians should thus be aware of this disorder and of recent developments in diagnostic tests that allow its detection.
Abstract:
Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past 5 years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single cell functional proteomics, focusing on the development of the single cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.
We begin with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells; it is the prototype for subsequent proteomic microchips of more sophisticated design for preclinical cancer research and clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, will follow.
The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).
The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn underpin the robustness of the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can plausibly use fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell lines and a primary tumor model.
The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipating therapy resistance and identifying effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by targeted inhibitors. Strongly coupled protein-protein interactions constitute most signaling cascades. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes, one associated with mTOR signaling and a second associated with ERK/Src signaling, were resolved, which in turn allowed us to anticipate resistance, to design combination therapies that are effective, and to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of clinical translation are presented and our solutions to address them are discussed as well. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.
Abstract:
Communities can be defined as assemblages of species coexisting under particular environments. The relationships between environment and species are regulated both by environmental requirements, which ultimately determine a species' capacity to establish and survive in a particular environment, and by the ecological interactions occurring during assembly processes, which also determine community composition by conditioning species coexistence. In this context, plant functional traits are attributes that represent ecological strategies and determine how plants respond to environmental factors and interact with other species. Therefore, analysing how traits vary through the dynamics of communities, such as along successions, can give insight into how environmental requirements and species interactions may determine the composition and functional structure of these communities. The xerophytic shrub communities inhabiting inland sand dunes in SW Portugal are characterized by successional processes mainly driven by local (edaphic gradients and human disturbance) and regional (climate) factors. They therefore constitute an appropriate system for studying species interactions and environment-community co-variation in functional terms. Using these communities as a model, we evaluate the hypothesis that successional changes in the species composition of xerophytic shrub communities can result in concurrent changes in functional diversity.