822 results for Lazard, Gilbert: Actancy. Empirical approaches to language typology
Abstract:
This paper examines the potential benefits and challenges of regionally managed e-government development initiatives. It surveys the current state of e-government in four Caribbean countries – Barbados, Jamaica, Saint Vincent and the Grenadines, and Trinidad and Tobago – in order to establish a broader understanding of the challenges facing e-government initiatives in the region. It also reviews a number of e-government initiatives that have been undertaken through projects managed at a regional level. Based on this analysis, it presents a set of best practices recommended to agencies engaged in coordinating the implementation of regionally based e-government initiatives.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Equisetum giganteum L. (E. giganteum), Equisetaceae, commonly called giant horsetail, is a plant endemic to Central and South America that is used in traditional medicine as a diuretic and hemostatic in urinary disorders and in inflammatory conditions, among other applications. The chemical composition of the 70% EtOH extract of E. giganteum shows a clear presence of phenolic compounds derived from caffeic and ferulic acids and of flavonoid heterosides derived from quercetin and kaempferol, in addition to styrylpyrones. E. giganteum, mainly at the highest concentrations, showed antimicrobial activity against the relevant microorganisms tested: Escherichia coli, Staphylococcus aureus, and Candida albicans. It also demonstrated antiadherent activity on C. albicans biofilms in an experimental model similar to dentures. Moreover, all concentrations tested showed anti-inflammatory activity. The extract did not show cytotoxicity in contact with human cells. These properties may qualify E. giganteum extract as a promising alternative for the topical treatment and prevention of oral candidiasis and denture stomatitis.
Theoretical approaches to forensic entomology: I. Mathematical model of postfeeding larval dispersal
Abstract:
An overall theoretical approach to modelling phenomena of interest to forensic entomology is advanced. Efforts are concentrated on identifying biological attributes at the individual, population, and community levels of the arthropod fauna associated with decomposing human corpses, and then incorporating these attributes into mathematical models. In particular, this paper describes a diffusion model of the dispersal of postfeeding larvae of blowflies, which are the most common insects associated with corpses.
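The abstract names a diffusion model but gives no equations; purely as an illustrative sketch, assuming larval density u(x, t) obeys a one-dimensional diffusion equation u_t = D u_xx, an explicit finite-difference simulation might look like the code below. The diffusion coefficient, domain size, time step, and initial condition are all hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: 1D diffusion of post-feeding larval density.
# All parameter values are hypothetical, not taken from the paper.
D = 0.5                       # diffusion coefficient (cm^2/h), assumed
L = 200.0                     # length of the dispersal domain (cm)
nx = 201                      # number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / (2 * D)    # time step within the explicit stability limit
steps = 500

u = np.zeros(nx)
u[nx // 2] = 1000.0           # larvae start concentrated near the corpse

for _ in range(steps):
    # explicit finite-difference update of u_t = D * u_xx
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0], u[-1] = u[1], u[-2]  # zero-flux boundaries

# u now approximates the larval density profile after steps * dt hours
spread = np.sqrt(np.average((np.arange(nx) * dx - L / 2) ** 2, weights=u))
print(f"peak density: {u.max():.2f}, spread: {spread:.1f} cm")
```

The printed spread grows roughly as sqrt(2 D t), which is the qualitative signature of diffusive dispersal away from the feeding site.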
Abstract:
Lianas can change forest dynamics, slowing down forest regeneration after a perturbation. In these cases, it may be necessary to manage these woody climbers. Our aim was to simulate two management strategies – (1) focusing on abundant liana species and (2) focusing on the largest lianas – and to contrast them with the random removal of lianas. We applied mathematical simulations of liana removal in three different vegetation types in southeastern Brazil: a Rainforest, a Seasonal Tropical Forest, and a Woodland Savanna. Using these samples, we performed simulations based on the two liana removal procedures and compared them with random removal. We also used regression analysis with a quasi-Poisson distribution to test whether larger lianas were aggressive, i.e., whether they climbed into many trees. Cutting the largest lianas was only as effective as cutting lianas at random and proved not to be a good method for liana management. Moreover, most lianas climbed into only one or two trees, i.e., were not aggressive. Cutting the most abundant lianas proved to be more effective than cutting lianas at random. This method could maintain liana richness and should presumably accelerate forest regeneration.
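The abstract names the regression model but not its implementation. Below is a minimal sketch of a quasi-Poisson-style fit on hypothetical data using Python's statsmodels, where quasi-Poisson is commonly emulated as a Poisson GLM whose dispersion is estimated from Pearson's chi-square (scale="X2"); the variable names (diameter, n_trees) and all numbers are assumptions, not data from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: liana stem diameter (cm) and number of trees climbed.
rng = np.random.default_rng(42)
diameter = rng.uniform(1.0, 10.0, size=120)
n_trees = 1 + rng.poisson(lam=0.6, size=120)   # most lianas climb 1-2 trees
df = pd.DataFrame({"diameter": diameter, "n_trees": n_trees})

# Quasi-Poisson-style fit: Poisson GLM with the dispersion parameter
# estimated from Pearson's chi-square (scale="X2").
X = sm.add_constant(df["diameter"])
result = sm.GLM(df["n_trees"], X, family=sm.families.Poisson()).fit(scale="X2")
print(result.summary())

# A diameter coefficient that is not significantly positive would indicate
# that larger lianas do not climb into more trees, i.e. are not "aggressive".
```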
Abstract:
The synthesis of selenium compounds bearing chiral centers is presented. Enantioselective oxidations of these organoselenium compounds were performed using a wide range of biocatalysts, including Baeyer-Villiger monooxygenases, oxidoreductase-containing Aspergillus terreus, and lipase (Cal-B), in the presence of oxidants. Finally, an efficient synthesis of enantiopure organoselenium compounds was achieved using a kinetic resolution approach mediated by Cal-B.
Abstract:
This dissertation deals with the period bridging the era of extreme housing shortages in Stockholm on the eve of industrialisation and the much-admired programmes of housing provision that followed after the Second World War, when the Stockholm district Vällingby became an example for underground-railway-serviced ’new towns’. It is argued that important changes were made in housing and town planning policy in Stockholm in this period that paved the way for the successful ensuing period. Foremost among these changes was the uniquely developed practice of municipal leaseholding with the help of site leasehold rights (Erbbaurecht). The study is informed by recent developments in Foucauldian social research that go under the heading ’governmentality’. Developments within urban planning are understood as different solutions to the problem of urban order. To a large extent, urban and housing policies changed during the period from direct interventions into the lives of inhabitants, connected to a liberal understanding of housing provision, to the building of a disciplinary city and the conduct of ’governmental’ power, building on increased activity on behalf of the local state to provide housing and on the integration and co-operation of large collectives. Municipal leaseholding was a fundamental means for the implementation of this policy. When the new policies were introduced, they were limited to the outer parts of the city and administered by special administrative bodies. This administrative and spatial separation was largely upheld throughout the period and was represented as the parallel building of a ’social’ outer city, while things in the inner ’mercantile’ city proceeded more or less as before. This separation was founded on a radical difference in landholding policy: while sites in the inner city were privatised and sold at market values, land in the outer city was mostly leasehold land, distributed according to administrative – and thus politically decided – priorities. These differences were also understood and acknowledged by the inhabitants. Thorough studies of the local press and the organisational life of the southern parts of the outer city reveal that local identity was tightly bound up with the representations attached to the different landholding systems. Inhabitants in the south-western parts of the city, which in this period were still largely built on private sites, displayed a spatial understanding built on the contradiction between centre and periphery. The inhabitants living on leasehold sites, however, showed a clear understanding of their position as members of model communities, tightly connected to the policy of the municipal administration. The organisations on leasehold sites also displayed deep co-operation with the administration. As the analyses of election results show, these inhabitants also seem to have felt a greater degree of integration with society at large than people living in other parts of the city. The leaseholding system in Stockholm has persisted until today and has been one of the strongest in the world, although local neo-liberal politicians are currently disposing of it.
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes which are strongly correlated with the groups of individuals. Even though the methods for analysing these data are today well developed and close to reaching a standard organization (through the efforts of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method is available. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps involved in an experiment, from the production of the array and quality controls to the preprocessing steps used in the data analyses in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced designs in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed in the 1,000 lists (a minimal sketch of this resampling scheme is given after this abstract). The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two data sets simulated from beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding chronic lymphocytic leukemia, LIMMA finds no significant probes, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with a score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method to address the evaluation of similarities in a three-class problem by means of the Relevance Vector Machine [4]. In fact, when looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role; in some cases similarities can give useful, and sometimes even more important, information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. In fact, the RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue (an illustrative workflow sketch also follows this abstract). This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3 and then to evaluate the third class, G2, as a test set in order to obtain, for each G2 sample, the probability of belonging to class G1 or to class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to that of grade 1 samples. This result had been conjectured in the literature, but no measure of significance had been given before.
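As described above, MultiSAM repeatedly contrasts the less populated class with same-size random subsamples of the more populated class and scores each probe by how often it is called differentially expressed. The code below is a minimal sketch of that resampling loop only; a per-gene Welch t-test stands in for the actual SAM statistic, and the data, threshold, and function name are assumptions.

```python
import numpy as np
from scipy import stats

def multisam_scores(lpc, mpc, n_iter=1000, alpha=0.01, seed=0):
    """Sketch of the MultiSAM resampling scheme (Welch t-test stands in for SAM).

    lpc : (n_genes, n_lpc) expression matrix of the less populated class
    mpc : (n_genes, n_mpc) expression matrix of the more populated class
    Returns, for each gene, the number of iterations (0..n_iter) in which
    it was called differentially expressed.
    """
    rng = np.random.default_rng(seed)
    n_genes, n_lpc = lpc.shape
    scores = np.zeros(n_genes, dtype=int)
    for _ in range(n_iter):
        cols = rng.choice(mpc.shape[1], size=n_lpc, replace=False)
        sub = mpc[:, cols]                     # same-size random subsample of MPC
        _, p = stats.ttest_ind(lpc, sub, axis=1, equal_var=False)
        scores += (p < alpha).astype(int)      # gene recurs in this iteration's list
    return scores

# Hypothetical usage: 5000 probes, 8 LPC samples vs 40 MPC samples.
rng = np.random.default_rng(1)
lpc = rng.normal(size=(5000, 8))
mpc = rng.normal(size=(5000, 40))
scores = multisam_scores(lpc, mpc, n_iter=100)   # fewer iterations for speed
print("probes with score above 30:", int((scores > 30).sum()))
```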
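The Chapter 4 workflow trains a probabilistic classifier on G1 versus G3 and then reads off posterior class probabilities for the held-out G2 samples. scikit-learn does not ship a Relevance Vector Machine, so the sketch below uses logistic regression purely as a probabilistic stand-in to illustrate the workflow; all matrices, sizes, and the choice of classifier are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical expression matrices: rows = samples, columns = genes.
rng = np.random.default_rng(0)
g1 = rng.normal(0.0, 1.0, size=(67, 200))    # grade 1 samples
g3 = rng.normal(0.5, 1.0, size=(54, 200))    # grade 3 samples
g2 = rng.normal(0.2, 1.0, size=(100, 200))   # grade 2 samples (held out)

# Train on G1 vs G3 only; G2 is never seen during training.
X_train = np.vstack([g1, g3])
y_train = np.concatenate([np.zeros(len(g1)), np.ones(len(g3))])

# Logistic regression stands in for the RVM: both yield posterior
# class-membership probabilities, the feature the chapter relies on.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

# Posterior probability that each G2 sample belongs to the G3 class.
p_g3 = clf.predict_proba(g2)[:, 1]
print(f"mean P(G3 | G2 sample) = {p_g3.mean():.2f}")
# A mean well below 0.5 would suggest grade 2 profiles look more like grade 1.
```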
Abstract:
Technology scaling increasingly emphasizes the complexity and non-ideality of the electrical behavior of semiconductor devices and boosts interest in alternatives to the conventional planar MOSFET architecture. TCAD simulation tools are fundamental to the analysis and development of new technology generations. However, the increasing device complexity is reflected in an augmented dimensionality of the problems to be solved. The trade-off between accuracy and computational cost of the simulation is especially influenced by domain discretization: mesh generation is therefore one of the most critical steps, and automatic approaches are sought. Moreover, the problem size is further increased by process variations, calling for a statistical representation of the single device through an ensemble of microscopically different instances. The aim of this thesis is to present multi-disciplinary approaches to handle this increasing problem dimensionality from a numerical simulation perspective. The topic of mesh generation is tackled by presenting a new Wavelet-based Adaptive Method (WAM) for the automatic refinement of 2D and 3D domain discretizations. Multiresolution techniques and efficient signal processing algorithms are exploited to increase grid resolution in the domain regions where relevant physical phenomena take place. Moreover, the grid is dynamically adapted to follow solution changes produced by bias variations, and quality criteria are imposed on the produced meshes. The further dimensionality increase due to variability in extremely scaled devices is considered with reference to two increasingly critical phenomena, namely line-edge roughness (LER) and random dopant fluctuations (RD). The impact of such phenomena on FinFET devices, which represent a promising alternative to planar CMOS technology, is estimated through 2D and 3D TCAD simulations and statistical tools, taking into account the matching performance of single devices as well as of basic circuit blocks such as SRAMs. Several process options are compared, including resist- and spacer-defined fin patterning as well as different doping profile definitions. By combining statistical simulations with experimental data, the potentialities and shortcomings of the FinFET architecture are analyzed and useful design guidelines are provided, boosting the feasibility of this technology for mainstream applications in sub-45 nm generation integrated circuits.
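The abstract does not detail the WAM refinement criterion. Purely as an illustrative sketch of the general idea, that large wavelet detail coefficients flag regions where the solution varies rapidly and the mesh should be refined, the code below computes a refinement mask from 2D Haar detail coefficients using the PyWavelets package; the library choice, threshold, and test field are assumptions and not the thesis' actual method.

```python
import numpy as np
import pywt

def refinement_flags(field, threshold=0.05):
    """Illustrative wavelet-based refinement criterion (not the thesis' WAM).

    field : 2D array sampled on a uniform grid (e.g. an electrostatic potential).
    Returns a boolean array at half the resolution of `field`, marking cells
    whose Haar detail magnitude exceeds `threshold` times the field's peak value,
    i.e. regions of rapid variation where the mesh should be refined.
    """
    _, (cH, cV, cD) = pywt.dwt2(field, "haar")
    detail = np.sqrt(cH**2 + cV**2 + cD**2)     # combined detail magnitude
    return detail > threshold * np.abs(field).max()

# Hypothetical usage: a potential with a sharp junction-like transition.
x = np.linspace(-1.0, 1.0, 128)
X, Y = np.meshgrid(x, x)
potential = np.tanh(20.0 * X)                   # steep gradient near x = 0
flags = refinement_flags(potential)
print("cells flagged for refinement:", int(flags.sum()), "of", flags.size)
```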