10 results for "Model methodology of empirical research in communication"
at Universidade do Minho
Abstract:
This paper studies the full content of the articles published in RPER, the Portuguese Review of Regional Studies, from its launch in 2003 until the first quarter of 2015. RPER is a journal edited by the Portuguese section of the European Regional Science Association, which was established in the first half of the 1980s. The Association (APDR) and the journal are the result of contributions by researchers and practitioners from different scientific fields, mainly Economics, Geography, Sociology, Engineering and Architecture. The main focus of these contributions is the socio-economic life of specific places and the way this life is conditioned by resources and capabilities, historical and cultural heritage, and institutions. Content analysis was undertaken to identify the main subjects chosen over the whole period under analysis, the nature of the articles published (theoretical or empirical) and the main analytical frameworks used. The analysis also covers sub-periods in order to investigate major trends in the subjects chosen and the analytical methods applied, questioning the rationale behind them. The paper concludes with a few notes on the public resonance of the research and an identification of its main limitations. The first part of the article presents a summary review of the genesis and evolution of Regional Science at the international level, which serves as a basis for the empirical approach developed.
Abstract:
Doctoral thesis in Information Systems and Technologies
Abstract:
Risk management is an important component of project management, and the process begins with risk assessment and evaluation. In this research project, a detailed analysis was carried out of the methodologies adopted by Banco da Amazonia S.A. to treat risks in investment projects, considering projects submitted to the FNO (Constitutional Fund for Financing the North) during 2011 and 2012. It was found that the evaluators of this credit institution use multiple risk-assessment indicators that play a central role in decision-making and contribute to the approval or rejection of the submitted projects: the proven ability to pay, the financial records of the project promoters, several financial restrictions, the level of equity, the level of financial indebtedness, evidence of the existence of a consumer market, the proven experience of the partners/owners in the business, environmental aspects, etc. Furthermore, the bank has technological systems to support the risk-assessment process, an internal communication system and a dedicated system for the management of operational risk.
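The combination of indicators described above can be illustrated as a simple rule-based screening function. This is a hypothetical sketch only: the field names and thresholds below are invented for illustration and are not Banco da Amazonia's actual criteria.

```python
# Hypothetical multi-indicator screening of an investment project.
# Thresholds and field names are illustrative assumptions, not the
# bank's real decision rules.
def assess_project(p):
    checks = [
        p["payment_capacity"] >= 1.3,   # proven ability to pay
        p["clean_financial_record"],    # promoters' financial records
        p["equity_ratio"] >= 0.2,       # minimum level of equity
        p["debt_ratio"] <= 0.6,         # cap on financial indebtedness
        p["market_evidence"],           # evidence of a consumer market
        p["years_experience"] >= 2,     # partners' business experience
    ]
    return "approve" if all(checks) else "reject"

project = {
    "payment_capacity": 1.5, "clean_financial_record": True,
    "equity_ratio": 0.3, "debt_ratio": 0.5,
    "market_evidence": True, "years_experience": 5,
}
print(assess_project(project))   # -> approve
```

In practice a credit institution would weight and score such indicators rather than apply hard pass/fail cut-offs; the all-or-nothing rule here only makes the joint role of the indicators explicit.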
Abstract:
The identification of new and druggable targets in bacteria is a critical endeavour in pharmaceutical research for novel antibiotics to fight infectious agents. The rapid emergence of resistant bacteria makes today's antibiotics increasingly ineffective, consequently increasing the need for new pharmacological targets and novel classes of antibacterial drugs. A new model that combines the singular value decomposition technique with biological filters, comprising a set of protein properties associated with bacterial drug targets and similarity to protein-coding essential genes of E. coli, has been developed to predict potential drug targets in the Enterobacteriaceae family [1]. This model identified 99 potential target proteins in the studied bacterial family, exhibiting eight different functions, which suggests that disrupting the activities of these proteins is critical for cells. Of these candidates, one was selected for target confirmation. To find target modulators, receptor-based pharmacophore hypotheses were built and used in the screening of a virtual library of compounds. Post-screening filters, based on physicochemical and topological similarity to known Gram-negative antibiotics, were applied to the retrieved compounds. Screening hits passing all filters were docked into the protein's catalytic groove, and the 15 most promising compounds were purchased from chemical vendors to be tested experimentally in vitro. To the best of our knowledge, this is the first attempt to rationalize the search for compounds to probe the relevance of this candidate as a new pharmacological target.
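The core idea of combining a singular value decomposition with property-based filters can be sketched in a few lines of NumPy. Everything below is an illustrative assumption (synthetic data, an arbitrary similarity threshold, a stand-in "essentiality" filter), not the published model's actual features or cut-offs.

```python
# Sketch: rank candidate proteins by similarity to known drug targets in
# an SVD-reduced property space, then apply a biological filter.
# Data, threshold, and filter are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)

# Rows: proteins; columns: numeric protein properties (synthetic).
properties = rng.random((200, 12))
known_target_idx = [3, 17, 42]          # indices of known drug targets

# Reduced representation via singular value decomposition.
U, s, Vt = np.linalg.svd(properties, full_matrices=False)
k = 5                                    # retained components
reduced = U[:, :k] * s[:k]

# Cosine similarity of each protein to the known-target centroid.
centroid = reduced[known_target_idx].mean(axis=0)
norms = np.linalg.norm(reduced, axis=1) * np.linalg.norm(centroid)
scores = reduced @ centroid / norms

# Filter stand-in: keep proteins whose first property (e.g. an
# essentiality score) exceeds a threshold, with high similarity.
essential = properties[:, 0] > 0.5
candidates = np.where(essential & (scores > 0.9))[0]
print(len(candidates))
```

The real model layers several such filters (druggability-related properties, similarity to E. coli essential genes) on top of the decomposition; the single filter here only shows the shape of the pipeline.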
Abstract:
Integrated master's dissertation in Biomedical Engineering (Biomaterials, Biomechanics and Rehabilitation)
Abstract:
Here we focus on factor analysis from a best-practices point of view, investigating the factor structure of neuropsychological tests and using the results to illustrate how to choose a reasonable solution. The sample (n=1051 individuals) was randomly divided into two groups: one for exploratory factor analysis (EFA) and principal component analysis (PCA), to investigate the number of factors underlying the neurocognitive variables; the second to test the "best fit" model via confirmatory factor analysis (CFA). For the exploratory step, three extraction methods (maximum likelihood, principal axis factoring and principal components) and two rotation methods (orthogonal and oblique) were used. This methodology made it possible to explore how different cognitive/psychological tests correlated with, and discriminated between, dimensions, indicating that to capture latent structures in similar sample sizes and measures, with approximately normal data distributions, reflective models with oblimin rotation might prove the most adequate.
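The split-sample exploratory step can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' pipeline: the number of measures, the variance cut-off for retaining components, and the two-factor generating structure are all assumptions (and scikit-learn's FactorAnalysis does not provide oblimin rotation, which would need a package such as factor_analyzer).

```python
# Sketch: split a sample in two, use PCA on one half to choose the number
# of factors, then fit a reflective (common-factor) model.
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(1)
# Synthetic stand-in for neurocognitive scores: n=1051, 8 measures,
# generated from 2 latent factors plus noise (assumed structure).
latent = rng.normal(size=(1051, 2))
loadings = rng.normal(size=(2, 8))
data = latent @ loadings + 0.5 * rng.normal(size=(1051, 8))

half = len(data) // 2
explore, confirm = data[:half], data[half:]   # EFA half / CFA half

# Exploratory step: retain enough components for 80% of the variance
# (an arbitrary illustrative cut-off).
pca = PCA().fit(explore)
cum = np.cumsum(pca.explained_variance_ratio_)
n_factors = int(np.searchsorted(cum, 0.8) + 1)

# Common-factor model with the chosen dimensionality.
fa = FactorAnalysis(n_components=n_factors).fit(explore)
print(n_factors, fa.components_.shape)
```

The held-out `confirm` half would then go to a CFA in a structural-equation package, which scikit-learn does not cover.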
Abstract:
Curcumin and caffeine (used as lipophilic and hydrophilic model compounds, respectively) were successfully encapsulated in lactoferrin-glycomacropeptide (Lf-GMP) nanohydrogels by thermal gelation, showing high encapsulation efficiencies (>90%). FTIR spectroscopy confirmed the encapsulation of the bioactive compounds in Lf-GMP nanohydrogels and revealed that, depending on the encapsulated compound, different interactions occur with the nanohydrogel matrix. The successful encapsulation was also confirmed by fluorescence measurements and confocal laser scanning microscopy. TEM images showed that the loaded nanohydrogels maintain their spherical shape, with sizes of 112 and 126 nm for curcumin and caffeine, respectively; in both cases a polydispersity of 0.2 was obtained. The release mechanisms of the bioactive compounds through Lf-GMP nanohydrogels were evaluated at pH 2 and pH 7 by fitting the Linear Superimposition Model to the experimental data. The release was found to be pH-dependent: at pH 2, relaxation is the governing phenomenon for both curcumin and caffeine, while at pH 7 Fick's diffusion is the main mechanism of caffeine release and curcumin was not released from the Lf-GMP nanohydrogels.
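Fitting a release model of this kind can be sketched with SciPy. The sketch below uses a simplified single-exponential-term form of the Linear Superimposition Model, in which the released fraction is a weighted sum of a Fickian and a relaxational contribution; all parameter values and the synthetic "data" are illustrative assumptions, not the paper's measurements.

```python
# Simplified Linear Superimposition Model: released fraction as a sum of
# a Fickian term (weight x_f) and a relaxation term (weight 1 - x_f).
# Synthetic data; parameters are illustrative, not experimental values.
import numpy as np
from scipy.optimize import curve_fit

def lsm(t, x_f, k_f, k_r):
    """Fraction released at time t (single-exponential approximation)."""
    return x_f * (1 - np.exp(-k_f * t)) + (1 - x_f) * (1 - np.exp(-k_r * t))

t = np.linspace(0, 24, 25)                  # hours
rng = np.random.default_rng(2)
data = lsm(t, 0.6, 0.8, 0.05) + rng.normal(scale=0.01, size=t.size)

popt, _ = curve_fit(lsm, t, data,
                    p0=[0.5, 0.5, 0.1], bounds=(0, [1, 10, 10]))
x_f, k_f, k_r = popt
print(round(x_f, 2))   # fitted Fickian fraction
```

Comparing the fitted weights at each pH is what identifies the governing mechanism: a dominant relaxation weight at pH 2 versus a dominant Fickian weight for caffeine at pH 7.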
Abstract:
Genome-scale metabolic models are valuable tools in the metabolic engineering process, thanks to their ability to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to be constructed, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of a biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of a core metabolic model; (7) generation of a biomass composition reaction; (8) completion of the draft metabolic model; (9) curation of the metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
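Once reconstructed, such models are typically analyzed by flux balance analysis: a linear program that maximizes a biomass reaction subject to steady-state mass balance. A minimal sketch on an invented three-reaction toy network (not a genome-scale model, and not tied to any step of the chapter's pipeline) looks like this:

```python
# Minimal flux balance analysis: maximize "biomass" flux v3 subject to
# steady-state mass balance S v = 0 and flux bounds. Toy network only.
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B.  Reactions: uptake (-> A), convert (A -> B),
# biomass (B ->).  S[i, j] = stoichiometry of metabolite i in reaction j.
S = np.array([
    [ 1, -1,  0],   # A: produced by uptake, consumed by convert
    [ 0,  1, -1],   # B: produced by convert, consumed by biomass
])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10

# linprog minimizes, so negate the biomass coefficient to maximize it.
c = np.array([0, 0, -1])
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x[2])   # optimal biomass flux: 10.0 (limited by uptake)
```

Genome-scale tools such as COBRApy wrap exactly this formulation, with thousands of reactions and bounds derived from the curated reconstruction.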
Abstract:
This article argues for a cultural perspective to be brought to bear on studies of climate change risk perception. Developing the "circuit of culture" model, the article maintains that the producers and consumers of media texts are jointly engaged in dynamic, meaning-making activities that are context-specific and that change over time. A critical discourse analysis of climate change is presented, based on a database of newspaper reports from three U.K. broadsheet papers over the period 1985-2003. This empirical study identifies three distinct circuits of climate change (1985-1990, 1991-1996, 1997-2003), characterized by different framings of the risks associated with climate change. The article concludes that there is evidence of social learning as actors build on their experiences in relation to climate change science and policy making. Two important factors shaping the U.K. broadsheet newspapers' discourse on "dangerous" climate change emerge: the agency of top political figures and the dominant ideological standpoints of the different newspapers.
Abstract:
PhD in Chemical and Biological Engineering