Abstract:
This paper examines the potential benefits and challenges of regionally managed e-government development initiatives. It surveys the current state of e-government in four Caribbean countries – Barbados, Jamaica, Saint Vincent and the Grenadines, and Trinidad and Tobago – in order to establish a broader understanding of the challenges that face e-government initiatives in the region. It also reviews a number of e-government initiatives that have been undertaken through projects managed at a regional level. Based on this analysis, it presents a set of best practices recommended to agencies engaged in coordinating the implementation of regionally-based e-government initiatives.
Abstract:
Equisetum giganteum L. (E. giganteum), Equisetaceae, commonly called giant horsetail, is an endemic plant of Central and South America and is used in traditional medicine as a diuretic and hemostatic agent in urinary disorders and in inflammatory conditions, among other applications. The chemical composition of the 70% EtOH extract of E. giganteum has shown a clear presence of phenolic compounds derived from caffeic and ferulic acids and flavonoid heterosides derived from quercetin and kaempferol, in addition to styrylpyrones. E. giganteum, mainly at the highest concentrations, showed antimicrobial activity against the relevant microorganisms tested: Escherichia coli, Staphylococcus aureus, and Candida albicans. It also demonstrated antiadherent activity on C. albicans biofilms in an experimental model similar to dentures. Moreover, all concentrations tested showed anti-inflammatory activity. The extract did not show cytotoxicity in contact with human cells. These properties might qualify E. giganteum extract as a promising alternative for the topical treatment and prevention of oral candidiasis and denture stomatitis.
Theoretical approaches to forensic entomology: I. Mathematical model of postfeeding larval dispersal
Abstract:
An overall theoretical approach to modeling phenomena of interest for forensic entomology is advanced. Efforts are concentrated on identifying biological attributes at the individual, population, and community levels of the arthropod fauna associated with decomposing human corpses, and then incorporating these attributes into mathematical models. In particular, this paper describes a diffusion model of the dispersal of postfeeding larvae for blowflies, which are the most common insects associated with corpses.
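As a minimal illustration of the kind of diffusion model described here, the sketch below integrates the one-dimensional diffusion equation du/dt = D d²u/dx² by explicit finite differences, with all larvae initially concentrated at the corpse position. The domain size, diffusion coefficient, and time step are hypothetical values chosen only for the demonstration, not parameters from the paper.

```python
import numpy as np

def disperse(n_steps=500, nx=201, L=2.0, D=1e-4, dt=0.4):
    """Explicit finite-difference integration of du/dt = D d2u/dx2.
    u holds the (normalized) larval density; larvae start at the
    domain centre, representing the corpse. All values are illustrative."""
    dx = L / (nx - 1)
    assert D * dt / dx ** 2 < 0.5  # stability condition of the explicit scheme
    u = np.zeros(nx)
    u[nx // 2] = 1.0  # all larvae initially at the corpse
    for _ in range(n_steps):
        # second-order central difference in space, forward Euler in time
        u[1:-1] = u[1:-1] + D * dt / dx ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u
```

The resulting profile spreads out symmetrically from the corpse position; in a forensic application one would fit D to observed pupation-site distances.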
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology, which allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization of a tissue of interest to probes on a small glass or plastic slide. These data are characterized by a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes, and are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the grouping of individuals. Even though methods to analyse these data are by now well developed and close to reaching a standard organization (through the efforts of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to come across a clinician's question for which no compelling statistical method is available. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation.
Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for each class; however, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed in the 1,000 lists. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two data sets simulated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe and SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4].
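The resampling-and-scoring scheme behind MultiSAM can be sketched as follows. This is an illustrative reconstruction, not the dissertation's code: a plain Welch t statistic stands in for the SAM statistic, the iteration count is reduced from 1,000 to 200, and the cutoff `t_cut` is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def multisam_scores(lpc, mpc, n_iter=200, t_cut=3.0):
    """Reiterated comparison of the less populated class (lpc, genes x n_small)
    against random same-size subsamples of the more populated class
    (mpc, genes x n_large). Each probe accumulates a score counting how
    often it is flagged as differentially expressed."""
    n_small = lpc.shape[1]
    scores = np.zeros(lpc.shape[0], dtype=int)
    for _ in range(n_iter):
        sub = mpc[:, rng.choice(mpc.shape[1], n_small, replace=False)]
        # per-gene Welch t statistic between LPC and the random MPC subset
        num = lpc.mean(1) - sub.mean(1)
        den = np.sqrt(lpc.var(1, ddof=1) / n_small + sub.var(1, ddof=1) / n_small)
        scores += np.abs(num / den) > t_cut
    return scores  # 0 .. n_iter, analogous to MultiSAM's 0-1000 score
```

A probe that is consistently flagged across resamplings accumulates a high score, which is what makes the procedure robust to the arbitrariness of any single subsample of the MPC.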
Indeed, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only quantities that play a crucial role: in some cases similarities can give useful, and sometimes even more important, information. Given three classes, the goal could be to establish, with a certain level of confidence, whether the third is more similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership is the key feature for addressing the similarity issue; this is a highly important, but often overlooked, capability of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then to evaluate the third class, G2, as a test set, obtaining for each G2 sample the probability of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to that of grade 1 samples. This result had been conjectured in the literature, but no measure of significance had been given before.
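The "evaluate the third class via posterior probabilities" idea can be sketched with any probabilistic classifier. Since RVM implementations are not standard library material, the sketch below substitutes a simple diagonal-Gaussian class model with equal priors; the data shapes and class separations in the usage are invented for illustration only.

```python
import numpy as np

def fit_gaussian_classifier(X1, X3):
    """Fit a diagonal-Gaussian model to two classes (samples x features) and
    return a function giving P(class G1 | x) for new samples. A minimal
    probabilistic stand-in for RVM's posterior class-membership output."""
    mu1, mu3 = X1.mean(0), X3.mean(0)
    # pooled per-feature variance, shared by the two classes
    var = np.concatenate([X1 - mu1, X3 - mu3]).var(0, ddof=1) + 1e-9

    def posterior_g1(X):
        # log-likelihood under each class; equal priors cancel out
        ll1 = -0.5 * (((X - mu1) ** 2) / var).sum(1)
        ll3 = -0.5 * (((X - mu3) ** 2) / var).sum(1)
        return 1.0 / (1.0 + np.exp(np.clip(ll3 - ll1, -500, 500)))

    return posterior_g1

# usage sketch: post = fit_gaussian_classifier(G1, G3)
# post(G2).mean() > 0.5 would indicate G2 is, on average, closer to G1
```

The mean posterior over the held-out class plays the role of the similarity measure: a value well above 0.5 says the third class sits on the G1 side of the decision boundary.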
Abstract:
Technology scaling increasingly emphasizes the complexity and non-ideality of the electrical behavior of semiconductor devices and boosts interest in alternatives to the conventional planar MOSFET architecture. TCAD simulation tools are fundamental to the analysis and development of new technology generations. However, the increasing device complexity is reflected in an augmented dimensionality of the problems to be solved. The trade-off between accuracy and computational cost of the simulation is especially influenced by domain discretization: mesh generation is therefore one of the most critical steps, and automatic approaches are sought. Moreover, the problem size is further increased by process variations, calling for a statistical representation of the single device through an ensemble of microscopically different instances. The aim of this thesis is to present multi-disciplinary approaches to handle this increasing problem dimensionality from a numerical simulation perspective. The topic of mesh generation is tackled by presenting a new Wavelet-based Adaptive Method (WAM) for the automatic refinement of 2D and 3D domain discretizations. Multiresolution techniques and efficient signal processing algorithms are exploited to increase grid resolution in the domain regions where relevant physical phenomena take place. Moreover, the grid is dynamically adapted to follow solution changes produced by bias variations, and quality criteria are imposed on the produced meshes. The further dimensionality increase due to variability in extremely scaled devices is considered with reference to two increasingly critical phenomena, namely line-edge roughness (LER) and random dopant fluctuations (RD).
The impact of such phenomena on FinFET devices, which represent a promising alternative to planar CMOS technology, is estimated through 2D and 3D TCAD simulations and statistical tools, taking into account matching performance of single devices as well as basic circuit blocks such as SRAMs. Several process options are compared, including resist- and spacer-defined fin patterning as well as different doping profile definitions. Combining statistical simulations with experimental data, potentialities and shortcomings of the FinFET architecture are analyzed and useful design guidelines are provided, which boost feasibility of this technology for mainstream applications in sub-45 nm generation integrated circuits.
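The core idea of wavelet-driven refinement, flagging for refinement the regions where detail coefficients are large, can be illustrated with a single level of the Haar transform on a 1D solution profile. This is a toy sketch of the principle, not the thesis's WAM: it uses one decomposition level, assumes an even number of samples, and the threshold is arbitrary.

```python
import numpy as np

def refine_flags(field, threshold):
    """One level of Haar wavelet analysis on a 1D solution profile.
    Large detail coefficients mark pairs of samples where the solution
    varies rapidly, i.e. where the mesh should be refined."""
    f = np.asarray(field, dtype=float)
    assert f.size % 2 == 0  # simplifying assumption: even-length signal
    # Haar detail coefficient of each adjacent sample pair
    detail = (f[0::2] - f[1::2]) / np.sqrt(2.0)
    flags = np.abs(detail) > threshold
    return np.repeat(flags, 2)  # map pair-wise flags back onto the samples
```

A real multiresolution scheme would recurse over several levels (so that features falling between pairs at one level are caught at another) and re-run the analysis after each bias step to follow the solution, as the thesis describes.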
Abstract:
Due to the growing attention of consumers towards their food, improving the quality of animal products has become one of the main focuses of research. To this aim, the application of modern molecular genetics approaches has proved extremely useful and effective. This innovative drive includes all livestock productions, including pork. The Italian pig breeding industry is unique because it needs heavy pigs, slaughtered at about 160 kg, for the production of high-quality processed products; for this reason, it requires precise meat quality and carcass characteristics. Two aspects have been considered in this thesis: the application of transcriptome analysis in post mortem pig muscles as a possible method to evaluate meat quality parameters related to the pre mortem status of the animals, including health, nutrition, and welfare, with potential applications for product traceability (chapters 3 and 4); and the study of candidate genes for obesity-related traits, in order to identify markers associated with fatness in pigs that could be applied to improve carcass quality (chapters 5, 6, and 7). Chapter 3 addresses the first issue from a methodological point of view. When we considered this issue, it was not obvious that post mortem skeletal muscle could be useful for transcriptomic analysis. We therefore demonstrated that the quality of RNA extracted from the skeletal muscle of pigs sampled at different post mortem intervals (20 minutes, 2 hours, 6 hours, and 24 hours) is good for downstream applications. Degradation occurred starting from 48 hours post mortem, although at this time it is still possible to use some RNA products. In chapter 4, in order to demonstrate the potential use of RNA obtained up to 24 hours post mortem, we present the results of RNA analysis with the Affymetrix microarray platform, which made it possible to assess the expression level of more than 24,000 mRNAs.
We did not identify any significant differences between the different post mortem times, suggesting that this technique could be applied to retrieve information from the transcriptome of skeletal muscle samples not collected immediately after slaughtering. This study represents the first contribution of this kind applied to pork. In chapter 5, we investigated the TBC1D1 [TBC1 (tre-2/USP6, BUB2, cdc16) domain family, member 1] gene as a candidate for fat deposition. This gene is involved in mechanisms regulating energy homeostasis in skeletal muscle and is associated with predisposition to obesity in humans. By resequencing a fragment of the TBC1D1 gene we identified three synonymous mutations localized in exon 2 (g.40A>G, g.151C>T, and g.172T>C) and two polymorphisms localized in intron 2 (g.219G>A and g.252G>A). One of these polymorphisms (g.219G>A) was genotyped by high resolution melting (HRM) analysis and PCR-RFLP. Moreover, this gene sequence was mapped by radiation hybrid analysis to porcine chromosome 8. The association study was conducted in 756 performance-tested pigs of the Italian Large White and Italian Duroc breeds. Significant results were obtained for lean meat content, back fat thickness, visible intermuscular fat, and ham weight. In chapter 6, a second candidate gene (tribbles homolog 3, TRIB3) is analyzed in an association study with carcass and meat quality traits. The TRIB3 gene is involved in the energy metabolism of skeletal muscle and plays a role as a suppressor of adipocyte differentiation. We identified two polymorphisms in the first coding exon of the porcine TRIB3 gene: a synonymous SNP (c.132T>C) and a missense mutation (c.146C>T, p.P49L). The two polymorphisms appear to be in complete linkage disequilibrium between and within breeds. The in silico analysis of the p.P49L substitution suggests that it might have a functional effect.
The association study in about 650 pigs indicated that this marker is associated with back fat thickness in the Italian Large White and Italian Duroc breeds in two different experimental designs. This polymorphism is also associated with the lactate content of muscle semimembranosus in Italian Large White pigs. Expression analysis indicated that this gene is transcribed in skeletal muscle and adipose tissue as well as in other tissues. In chapter 7, we report the genotyping results for 677 SNPs in divergent groups of pigs chosen according to extreme estimated breeding values for back fat thickness. The SNPs were identified by resequencing, in silico database mining, and data reported in the literature for 60 candidate genes for obesity. Genotyping was carried out using the GoldenGate (Illumina) platform. Of the analyzed SNPs, more than 300 were polymorphic in the genotyped population with a minor allele frequency (MAF) >0.05. Of these, 65 were associated (P<0.10) with back fat thickness. One of the most significant markers was the same TBC1D1 SNP reported in chapter 5, confirming the role of this gene in fat deposition in pigs. These results could be important to better define the pig as a model for human obesity, as well as for marker-assisted selection to improve carcass characteristics.
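The MAF > 0.05 screening step described above is simple arithmetic on genotype counts; a minimal sketch (with invented genotype data, and assuming genotypes coded as 0/1/2 copies of the alternative allele) looks like this:

```python
import numpy as np

def minor_allele_freq(genotypes):
    """genotypes: 0/1/2 counts of the alternative allele, one per animal.
    Returns the minor allele frequency."""
    g = np.asarray(genotypes)
    p = g.sum() / (2 * g.size)  # alternative-allele frequency
    return min(p, 1 - p)

def maf_filter(geno_matrix, cutoff=0.05):
    """Keep SNP columns (animals x SNPs) with MAF above the cutoff,
    mirroring the chapter-7 screen before association testing."""
    mafs = np.array([minor_allele_freq(col) for col in geno_matrix.T])
    return geno_matrix[:, mafs > cutoff], mafs
```

Monomorphic or near-monomorphic SNPs carry almost no information for an association study, which is why such a filter precedes the back-fat association tests.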
Abstract:
Climate-change-related impacts, notably coastal erosion, inundation, and flooding from sea level rise and storms, will increase in the coming decades, enhancing the risks for coastal populations. Further recourse to coastal armoring and other engineered defenses for risk reduction will exacerbate threats to coastal ecosystems. Alternatively, the protection services provided by healthy ecosystems are emerging as a key element in climate adaptation and disaster risk management. I examined two distinct approaches to coastal defense on the basis of their ecological and ecosystem conservation values. First, I analyzed the role of coastal ecosystems in providing services for hazard risk reduction. The value of coral reefs in wave attenuation was quantitatively demonstrated using a meta-analysis approach. The results indicate that coral reefs can provide wave attenuation comparable to hard engineered artificial defenses, and at lower cost; conservation and restoration of existing coral reefs are therefore cost-effective management options for disaster risk reduction. Second, I evaluated the possibility of enhancing the ecological value of artificial coastal defense structures (CDS) as habitats for marine communities. I documented the suitability of CDS to support native, ecologically relevant, habitat-forming canopy algae, exploring the feasibility of enhancing CDS ecological value by promoting the growth of desired species. Juveniles of Cystoseira barbata can be successfully transplanted to both natural and artificial habitats, and are affected neither by the lack of surrounding adult algal individuals nor by substratum orientation. Transplantation success was limited by biotic disturbance from macrograzers on CDS compared to natural habitats. Future work should explore the reasons behind the different ecological functioning of artificial and natural habitats, unraveling the factors and mechanisms that cause it.
Understanding the functioning of the systems associated with artificial habitats is key to allowing environmental managers to identify proper mitigation options and to forecast the impact of alternative coastal development plans.
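The basic building block of a quantitative meta-analysis like the wave-attenuation synthesis above is pooling study-level effects weighted by their precision. The sketch below shows plain fixed-effect inverse-variance pooling; the thesis's actual model may differ (e.g. random-effects), and the numbers in the test are invented.

```python
import numpy as np

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of study-level estimates
    (e.g. per-study wave-attenuation proportions). Returns the pooled
    estimate and its standard error."""
    w = 1.0 / np.asarray(variances, dtype=float)  # precision weights
    est = np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, se
```

Precise studies dominate the pooled estimate, and the pooled standard error shrinks as studies accumulate, which is what lets a meta-analysis make a quantitative comparison between reef and engineered-defense attenuation.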
Abstract:
The assessment of historical structures is a significant need for future generations, as historical monuments represent a community's identity and have important cultural value to society. Most historical structures were built of masonry, one of the oldest and most common construction materials used in the building sector since ancient times. Masonry is also considered a complex material: as a composite of brick units and mortar, its mechanical behaviour varies with the geometry and quality of the components, which in turn affects the structural performance of the building.
Abstract:
For several centuries, Japanese scholars have argued that their nation’s culture—including its language, religion and ways of thinking—is somehow unique. The darker side of this rhetoric, sometimes known by the English term “Japanism” (nihon-jinron), played no small role in the nationalist fervor of the late-nineteenth and early twentieth centuries. While much of the so-called “ideology of Japanese uniqueness” can be dismissed, in terms of the Japanese approach to “religion,” there may be something to it. This paper highlights some distinctive—if not entirely unique—features of the way religion has been categorized and understood in Japanese tradition, contrasting these with Western (i.e., Abrahamic), and to a lesser extent Indian and Chinese understandings. Particular attention is given to the priority of praxis over belief in the Japanese religious context.
Abstract:
The assessment of treatment effects from observational studies may be biased when patients are not randomly allocated to the experimental or control group. One way to overcome this conceptual shortcoming in the design of such studies is the use of propensity scores to adjust for differences in the characteristics of patients treated with experimental and control interventions. The propensity score is defined as the probability that a patient received the experimental intervention conditional on pre-treatment characteristics at baseline. Here, we review how propensity scores are estimated and how they can help in adjusting the treatment effect for baseline imbalances. We further discuss how to evaluate adequate overlap of baseline characteristics between patient groups, provide guidelines for variable selection and model building in modelling the propensity score, and review different methods of propensity score adjustment. We conclude that propensity analyses may help in evaluating the comparability of patients in observational studies, and may account for more potential confounding factors than conventional covariate adjustment approaches. However, bias due to unmeasured confounding cannot be corrected for.
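The two steps the abstract describes, estimating the propensity score as a conditional probability of treatment and then using it to adjust the treatment effect, can be sketched as follows. This is a minimal illustration, assuming a plain logistic-regression propensity model fitted by gradient descent and inverse-probability weighting as the adjustment method (only one of the adjustment methods such a review would cover).

```python
import numpy as np

def propensity_scores(X, treated, n_iter=2000, lr=0.1):
    """Estimate P(treatment | X) with logistic regression fitted by
    batch gradient descent. X: covariates (n x p), treated: 0/1 array."""
    Xb = np.column_stack([np.ones(len(X)), X])  # add intercept
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (treated - p) / len(X)  # gradient ascent on log-likelihood
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def ipw_effect(y, treated, ps):
    """Inverse-probability-weighted difference in mean outcomes:
    each patient is weighted by the inverse probability of the
    treatment they actually received."""
    t = treated.astype(bool)
    return (np.average(y[t], weights=1.0 / ps[t])
            - np.average(y[~t], weights=1.0 / (1.0 - ps[~t])))
```

On confounded data, where a baseline covariate drives both treatment assignment and outcome, the naive difference in means is biased, while the weighted contrast recovers an estimate much closer to the true effect; as the abstract notes, this only works for confounders that were actually measured.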