911 results for critical approaches
Abstract:
The thermal limits of individual animals were originally proposed as a link between animal physiology and thermal ecology. Although this link is valid in theory, the evaluation of physiological tolerances involves some problems that are the focus of this study. One rationale was that heating rates should influence upper critical limits, so that ecological thermal limits need to consider experimental heating rates. In addition, if thermal limits are not surpassed in experiments, subsequent tests of the same individual should yield similar results or produce evidence of hardening. Finally, several uncontrolled variables, such as time under experimental conditions and procedures, may affect results. To analyze these issues we conducted an integrative study of upper critical temperatures in a single species, the ant Atta sexdens rubropilosa, an animal model providing large numbers of individuals of diverse sizes but similar genetic makeup. Our specific aims were 1) to test the influence of heating rates on the experimental evaluation of upper critical temperatures, 2) to test the assumptions of absence of physical damage and reproducibility, 3) to examine sources of variance often overlooked in the thermal-limits literature, and 4) to introduce some experimental approaches that may help researchers separate physiological and methodological issues. The upper thermal limits were influenced by both heating rates and body mass. In the latter case, the effect was physiological rather than methodological. The critical temperature decreased during subsequent tests performed on the same individual ants, even one week after the initial test. Accordingly, upper thermal limits may have been overestimated by our (and typical) protocols. Heating rates, body mass, procedures independent of temperature, and other variables may affect the estimation of upper critical temperatures. Therefore, based on our data, we offer suggestions to enhance the quality of measurements and recommendations to authors aiming to compile and analyze databases from the literature.
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology, which allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization of a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the groups of individuals. Even though methods to analyse these data are now well developed and close to reaching a standard organization (through the efforts of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method is available. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analyses in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing the main problems they leave open. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists of differentially expressed genes. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe and SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data.
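A minimal sketch of the resampling scheme behind MultiSAM as described above: the MPC is repeatedly subsampled to the LPC size, and each probe accumulates a recurrence score across iterations. A full SAM implementation is out of scope here, so a plain two-sample t-test stands in for the per-iteration differential-expression call; all data, names and thresholds are illustrative, not from the dissertation.

```python
import numpy as np
from scipy import stats

def multisam_scores(X_lpc, X_mpc, n_iter=1000, alpha=0.01, seed=0):
    """X_lpc: (genes, n_lpc) less-populated class; X_mpc: (genes, n_mpc).
    Returns, per probe, how often (0..n_iter) it was called significant."""
    rng = np.random.default_rng(seed)
    n_lpc = X_lpc.shape[1]
    scores = np.zeros(X_lpc.shape[0], dtype=int)
    for _ in range(n_iter):
        # Balanced comparison: subsample the MPC down to the LPC size.
        cols = rng.choice(X_mpc.shape[1], size=n_lpc, replace=False)
        _, p = stats.ttest_ind(X_lpc, X_mpc[:, cols], axis=1)
        scores += (p < alpha).astype(int)
    return scores

# Toy data: 500 probes, first 20 truly shifted; 8 LPC vs 80 MPC samples.
rng = np.random.default_rng(1)
X_lpc = rng.normal(0, 1, (500, 8)); X_lpc[:20] += 1.5
X_mpc = rng.normal(0, 1, (500, 80))
scores = multisam_scores(X_lpc, X_mpc, n_iter=200)
print("probes significant in >60% of iterations:", np.sum(scores > 120))
```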
Chapter 4 describes a method to address the evaluation of similarities in a three-class problem by means of the Relevance Vector Machine (RVM) [4]. In fact, looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role: in some cases similarities can give useful, and sometimes even more important, information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second. In this work we show that the RVM [4] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern-recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then to evaluate the third class, G2, as a test set, obtaining for each G2 sample the probability of membership in class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
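The G1-vs-G3 train / G2-score design can be sketched with any classifier that outputs posterior probabilities. The dissertation uses an RVM; since scikit-learn ships no RVM, this hedged sketch substitutes a regularised logistic regression on synthetic data to illustrate the procedure, not to reproduce the result.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_genes = 200
g1 = rng.normal(0.0, 1, (67, n_genes))    # grade 1 profiles (synthetic)
g3 = rng.normal(1.0, 1, (54, n_genes))    # grade 3 profiles (synthetic)
g2 = rng.normal(0.3, 1, (100, n_genes))   # grade 2: nearer G1 by construction

X = np.vstack([g1, g3])
y = np.array([0] * len(g1) + [1] * len(g3))   # 0 = G1, 1 = G3

# Train on the two extreme grades, then score the intermediate grade.
clf = LogisticRegression(C=0.1, max_iter=1000).fit(X, y)
p_g3 = clf.predict_proba(g2)[:, 1]            # posterior P(G3) per G2 sample
print(f"mean P(G2 sample is G3-like) = {p_g3.mean():.2f}")  # < 0.5 => G1-like
```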
Abstract:
Technology scaling increasingly emphasizes the complexity and non-ideality of the electrical behavior of semiconductor devices and boosts interest in alternatives to the conventional planar MOSFET architecture. TCAD simulation tools are fundamental to the analysis and development of new technology generations. However, the increasing device complexity is reflected in an augmented dimensionality of the problems to be solved. The trade-off between accuracy and computational cost of the simulation is especially influenced by domain discretization: mesh generation is therefore one of the most critical steps, and automatic approaches are sought. Moreover, the problem size is further increased by process variations, calling for a statistical representation of the single device through an ensemble of microscopically different instances. The aim of this thesis is to present multi-disciplinary approaches to handle this increasing problem dimensionality from a numerical simulation perspective. The topic of mesh generation is tackled by presenting a new Wavelet-based Adaptive Method (WAM) for the automatic refinement of 2D and 3D domain discretizations. Multiresolution techniques and efficient signal-processing algorithms are exploited to increase grid resolution in the domain regions where relevant physical phenomena take place. Moreover, the grid is dynamically adapted to follow solution changes produced by bias variations, and quality criteria are imposed on the produced meshes. The further dimensionality increase due to variability in extremely scaled devices is considered with reference to two increasingly critical phenomena, namely line-edge roughness (LER) and random dopant fluctuations (RD). The impact of such phenomena on FinFET devices, which represent a promising alternative to planar CMOS technology, is estimated through 2D and 3D TCAD simulations and statistical tools, taking into account the matching performance of single devices as well as of basic circuit blocks such as SRAMs. Several process options are compared, including resist- and spacer-defined fin patterning as well as different doping profile definitions. Combining statistical simulations with experimental data, the potentialities and shortcomings of the FinFET architecture are analyzed and useful design guidelines are provided, boosting the feasibility of this technology for mainstream applications in sub-45 nm generation integrated circuits.
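The core refinement criterion of a wavelet-based adaptive method like WAM can be illustrated as thresholding the detail coefficients of the solution field: where local wavelet details are large, the physics is under-resolved and the grid is flagged for refinement. The sketch below (using PyWavelets) shows only this idea on a uniform 2D field; it is not the thesis implementation, and the function name, wavelet and threshold are assumptions.

```python
import numpy as np
import pywt

def refinement_mask(field: np.ndarray, wavelet: str = "db2",
                    rel_threshold: float = 0.05) -> np.ndarray:
    """Return a boolean mask (at coarse resolution) marking cells to refine."""
    # Single-level 2D wavelet decomposition: approximation + 3 detail bands.
    cA, (cH, cV, cD) = pywt.dwt2(field, wavelet)
    # Local detail magnitude: large values indicate sharp solution features
    # (e.g. junction or depletion edges) that need a finer grid.
    detail = np.sqrt(cH**2 + cV**2 + cD**2)
    return detail > rel_threshold * np.abs(field).max()

# Example: refine around a steep, junction-like potential step.
x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
potential = np.tanh((x - 0.5) / 0.01)
mask = refinement_mask(potential)
print(f"{mask.mean():.1%} of coarse cells flagged for refinement")
```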
Abstract:
Wine grapes must deal with serious problems due to the unfavorable climatic conditions resulting from global warming. High temperatures cause oxidative damage to grapevines. Excessively elevated temperatures are critical for grapevine productivity and survival and contribute to the degradation of grape and wine quality and yield. Elevated temperature can negatively affect anthocyanin accumulation in red grapes; in particular, cv. Sangiovese has been identified as very sensitive to such conditions. Quantitative real-time PCR analysis showed that flavonoid biosynthetic genes were slightly repressed by high temperature. Heat stress also repressed the expression of the transcription factor VvMYBA1, which activates the expression of UFGT. Moreover, high temperatures repressed the activity of the flavonoid biosynthetic enzymes PAL and UFGT. Anthocyanin accumulation in berry skin reflects the balance between its synthesis and oxidation. In grape cv. Sangiovese, the transcription and activity of peroxidase enzymes were elevated by heat stress as a defensive ROS-scavenging mechanism. Among the many peroxidase gene isoforms, one gene (POD 1) was induced in Sangiovese under thermal stress. This gene was isolated and evaluated by genetic transformation from grape into Petunia. A reduction in anthocyanin concentration and higher peroxidase activity were observed in POD 1-transformed Petunia after heat shock compared to the untransformed control. Moreover, in wine-producing regions it is inevitable that grape growers adopt adaptive strategies to alleviate grape damage from abiotic stresses. Therefore, in this thesis, post-veraison trimming was tested to improve the coupling of phenolic and sugar ripening in Vitis vinifera L. cultivar Sangiovese. Trimming after veraison proved practicable for slowing the rate of sugar accumulation in the grape (to decrease the alcohol potential in wines) without compromising the evolution of the main berry flavonoid compounds.
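The abstract reports qPCR-measured repression but not the quantification arithmetic; a minimal sketch of the standard 2^-ΔΔCt (Livak) calculation that such analyses commonly use is given below. The method choice, gene roles and Ct values here are illustrative assumptions, not data from the thesis.

```python
def relative_expression(ct_target_treat: float, ct_ref_treat: float,
                        ct_target_ctrl: float, ct_ref_ctrl: float) -> float:
    """Fold change of a target gene (e.g. UFGT) versus a reference gene,
    heat-stressed sample relative to control, via 2^-ddCt."""
    d_ct_treat = ct_target_treat - ct_ref_treat   # normalise to reference
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(d_ct_treat - d_ct_ctrl)

# Target amplifies ~2 cycles later under heat stress => ~4-fold repression.
fold = relative_expression(26.0, 18.0, 24.0, 18.0)
print(f"fold change = {fold:.2f}")   # 0.25 => repressed under heat stress
```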
Abstract:
Liquids under the influence of external fields exhibit a wide range of intriguing phenomena that can be markedly different from the behaviour of a quiescent system. This work considers two different systems — a glassforming Yukawa system and a colloid-polymer mixture — by Molecular Dynamics (MD) computer simulations coupled to dissipative particle dynamics. The former consists of a 50-50 binary mixture of differently-sized, like-charged colloids interacting via a screened Coulomb (Yukawa) potential. Near the glass transition the influence of an external shear field is studied. In particular, the transition from elastic response to plastic flow is of interest. At first, this model is characterised in equilibrium. Upon decreasing temperature it exhibits the typical dynamics of glassforming liquids, i.e. the structural relaxation time τα grows strongly in a rather small temperature range. This is discussed with respect to the mode-coupling theory of the glass transition (MCT). For the simulation of bulk systems under shear, Lees-Edwards boundary conditions are applied. At constant shear rates γ̇ ≫ 1/τα the relevant time scale is given by 1/γ̇ and the system shows shear thinning behaviour. In order to understand the pronounced differences between a quiescent system and a system under shear, the response to a suddenly commencing or terminating shear flow is studied. After the switch-on of the shear field the shear stress shows an overshoot, marking the transition from elastic to plastic deformation, which is connected to a super-diffusive increase of the mean squared displacement. Since the average static structure only depends on the value of the shear stress, it does not discriminate between those two regimes. The distribution of local stresses, in contrast, becomes broader as soon as the system starts flowing. After a switch-off of the shear field, these additional fluctuations are responsible for the fast decay of stresses, which occurs on a time scale 1/γ̇. The stress decay after a switch-off in the elastic regime, on the other hand, happens on the much larger time scale of structural relaxation τα. While stresses decrease to zero after a switch-off for temperatures above the glass transition, they decay to a finite value for lower temperatures. The obtained results are important for advancing new theoretical approaches in the framework of mode-coupling theory. Furthermore, they suggest new experimental investigations on colloidal systems. The colloid-polymer mixture is studied in the context of the behaviour near the critical point of phase separation. For the MD simulations a new effective model with soft interaction potentials is introduced and its phase diagram is presented. Here, mainly the equilibrium properties of this model are characterised. While the self-diffusion constants of colloids and polymers do not change strongly when the critical point is approached, critical slowing down of interdiffusion is observed. The order parameter fluctuations can be determined through the long-wavelength limit of static structure factors. For this strongly asymmetric mixture it is shown how the relevant structure factor can be extracted by a diagonalisation of a matrix that contains the partial static structure factors. By presenting first results of this model under shear it is demonstrated that it is suitable for non-equilibrium simulations as well.
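The matrix diagonalisation mentioned for the colloid-polymer mixture can be made concrete: at each wavenumber q, form the 2x2 matrix of partial static structure factors and take its largest eigenvalue, whose long-wavelength limit tracks the order-parameter (concentration) fluctuations. The sketch below is a hedged illustration on synthetic Ornstein-Zernike-like inputs, not the simulation code from the thesis.

```python
import numpy as np

def order_parameter_sq(S_cc, S_cp, S_pp):
    """S_cc, S_cp, S_pp: arrays of partial structure factors over a q-grid.
    Returns the largest eigenvalue of the 2x2 matrix S(q) for each q."""
    S = np.stack([np.stack([S_cc, S_cp], axis=-1),
                  np.stack([S_cp, S_pp], axis=-1)], axis=-2)  # shape (..., 2, 2)
    eigvals = np.linalg.eigvalsh(S)     # eigenvalues of each symmetric 2x2
    return eigvals[..., -1]             # largest eigenvalue per q

# Illustrative inputs: a growing small-q signal near the critical point.
q = np.linspace(0.1, 10, 100)
S_cc = 1 + 4 / (1 + (2.0 * q)**2)       # Ornstein-Zernike-like toy curves
S_pp = 1 + 3 / (1 + (2.0 * q)**2)
S_cp = -0.8 * np.sqrt((S_cc - 1) * (S_pp - 1))
print(order_parameter_sq(S_cc, S_cp, S_pp)[:3])   # small-q values dominate
```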
Abstract:
Naive T cells continuously recirculate between secondary lymphoid tissues via the blood and lymphatic systems, a process that maximizes the chances of an encounter between a T cell and its cognate antigen. This recirculation depends on signals from chemokine receptors, integrins, and the sphingosine-1-phosphate receptor. Previous studies in other cell types have shown that Rac GTPases transduce signals leading to cell migration and adhesion; however, their roles in T cells are unknown. Using both 3-dimensional intravital and in vitro approaches, we show that Rac1- and Rac2-deficient T cells have multiple defects in this recirculation process. Rac-deficient T cells home very inefficiently to lymph nodes and the white pulp of the spleen, show reduced interstitial migration within lymph node parenchyma, and are defective in egress from lymph nodes. These mutant T cells show defective chemokine-induced chemotaxis, chemokinesis, and adhesion to integrin ligands. They have reduced lateral motility on endothelial cells and transmigrate inefficiently. These multiple defects stem from critical roles for Rac1 and Rac2 in transducing chemokine and sphingosine-1-phosphate receptor 1 signals leading to motility and adhesion.
Abstract:
Pollinating insects form a key component of European biodiversity, and provide a vital ecosystem service to crops and wild plants. There is growing evidence of declines in both wild and domesticated pollinators, and parallel declines in plants relying upon them. The STEP project (Status and Trends of European Pollinators, 2010-2015, www.step-project.net) is documenting critical elements in the nature and extent of these declines, examining key functional traits associated with pollination deficits, and developing a Red List for some European pollinator groups. Together these activities are laying the groundwork for future pollinator monitoring programmes. STEP is also assessing the relative importance of potential drivers of pollinator declines, including climate change, habitat loss and fragmentation, agrochemicals, pathogens, alien species, light pollution, and their interactions. We are measuring the ecological and economic impacts of declining pollinator services and floral resources, including effects on wild plant populations, crop production and human nutrition. STEP is reviewing existing and potential mitigation options, and providing novel tests of their effectiveness across Europe. Our work is building upon existing and newly developed datasets and models, complemented by spatially-replicated campaigns of field research to fill gaps in current knowledge. Findings are being integrated into a policy-relevant framework to create evidence-based decision support tools. STEP is establishing communication links to a wide range of stakeholders across Europe and beyond, including policy makers, beekeepers, farmers, academics and the general public. Taken together, the STEP research programme aims to improve our understanding of the nature, causes, consequences and potential mitigation of declines in pollination services at local, national, continental and global scales.
Abstract:
At the end of the 20th century we live in a pluralist world in which national and ethnic identities play an appreciable role, sometimes provoking serious conflicts. Nationalist values seem to pose a serious challenge to liberal ones, particularly in the post-communist countries. Malinova asked whether liberalism must necessarily be contrasted with nationalism. Although nationalist issues have never been a major concern for liberal thinkers, in many countries they have had to take such issues into consideration, and a form of 'liberal nationalism' has its place in the history of political ideas. Some of the thinkers who tried to develop such an idea were liberals in the strict sense of the word and others were not, but all of them tried to elaborate a concept of nationalism that respected the rights of individuals and precluded discrimination on ethnic grounds. Malinova studied the history of the conceptualisation of nations and nationalism in the writings of J.S. Mill, J.E.E. Acton, G. Mazzini, V. Soloviev, B. Chicherin, P. Struve, P. Miljoukov and T.G. Masaryk. Although it cannot be said that these theories form a coherent tradition, certain common elements of the different approaches can be identified. Malinova analysed the way that liberal nationalists interpreted the phenomenon of the nation and its rights in different historical contexts, reviewed the structure of their arguments, and tried to evaluate this theoretical experience from the perspective of the contemporary debate on the problems of liberal nationalism and multiculturalism and recent debates on 'the national idea' in Russia.
Abstract:
Simulations of forest stand dynamics in modelling frameworks such as the Forest Vegetation Simulator (FVS) are diameter-driven, so the diameter or basal area increment model needs special attention. This dissertation critically evaluates diameter and basal area increment models and modelling approaches in the context of the Great Lakes region of the United States and Canada. A set of related studies is presented that critically evaluates the sub-model for change in individual-tree basal diameter used in FVS, a dominant forestry model in the Great Lakes region. Various historical implementations of the STEMS (Stand and Tree Evaluation and Modeling System) family of diameter increment models, including the current public release of the Lake States variant of FVS (LS-FVS), were tested for the 30 most common tree species using data from the Michigan Forest Inventory and Analysis (FIA) program. The results showed that the current public release of the LS-FVS diameter increment model over-predicts 10-year diameter increment by 17% on average. The study also affirms that a simple adjustment factor as a function of a single predictor, dbh (diameter at breast height), as used in past versions, provides an inadequate correction of model prediction bias. In order to re-engineer the basal diameter increment model, the historical, conceptual and philosophical differences among the individual-tree increment model families and their modelling approaches were analyzed and discussed. Two underlying conceptual approaches to diameter or basal area increment modelling have often been used: the potential-modifier (POTMOD) and composite (COMP) approaches, exemplified by the STEMS/TWIGS and Prognosis models, respectively. It is argued that both approaches essentially use a similar base function and that neither is conceptually different from a biological perspective, even though their model forms look different. No matter what modelling approach is used, the base function is the foundation of an increment model. Two base functions – gamma and Box-Lucas – were identified as candidate base functions for forestry applications. The results of a comparative analysis of empirical fits showed that the quality of fit is essentially similar, and both are sufficiently detailed and flexible for forestry applications. The choice between the base functions for modelling diameter or basal area increment is a matter of personal preference; however, the gamma base function may be preferred over the Box-Lucas, as it fits the periodic increment data in both linear and nonlinear composite model forms. Finally, the utility of site index as a predictor variable is criticized: it has been widely used in models for complex, mixed-species forest stands even though it is not well suited for this purpose. An alternative to site index in an increment model was explored, using site index and a combination of climate variables and Forest Ecosystem Classification (FEC) ecosites, with data from the Province of Ontario, Canada. The results showed that a combination of climate and FEC ecosite variables can replace site index in the diameter increment model.
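The gamma base function named above commonly takes the unimodal form Δd = a·dbh^b·e^(−c·dbh) in the STEMS/TWIGS tradition; the exact specification in the dissertation may differ, so treat this curve fit on synthetic data as a hedged illustration of how such a base function is estimated.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_base(dbh, a, b, c):
    """Gamma-type base function: increment rises then falls with diameter."""
    return a * dbh**b * np.exp(-c * dbh)

# Synthetic 10-year diameter-increment data (cm), not FIA data.
rng = np.random.default_rng(0)
dbh = rng.uniform(5, 60, 200)                        # diameters at breast height
incr = gamma_base(dbh, a=0.8, b=0.7, c=0.05) + rng.normal(0, 0.2, 200)

# Nonlinear least-squares fit of the three coefficients.
popt, _ = curve_fit(gamma_base, dbh, incr, p0=(1.0, 0.5, 0.05))
print("fitted a, b, c:", np.round(popt, 3))
```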
Abstract:
Balancing the frequently conflicting priorities of conservation and economic development poses a challenge to management of the Swiss Alps Jungfrau-Aletsch World Heritage Site (WHS). This is a complex societal problem that calls for a knowledge-based solution. This in turn requires a transdisciplinary research framework in which problems are defined and solved cooperatively by actors from the scientific community and the life-world. In this article we re-examine studies carried out in the region of the Swiss Alps Jungfrau-Aletsch WHS, covering three key issues prevalent in transdisciplinary settings: integration of stakeholders into participatory processes; perceptions and positions; and negotiability and implementation. In the case of the Swiss Alps Jungfrau-Aletsch WHS the transdisciplinary setting created a situation of mutual learning among stakeholders from different levels and backgrounds. However, the studies showed that the benefits of such processes of mutual learning are continuously at risk of being diminished by the power play inherent in participatory approaches.
Abstract:
In his compelling case study of local governance and community safety in the UK Thames Valley, Kevin Stenson makes several important contributions to the field of governmentality studies. While the paper's merits are far-reaching, in this reader's assessment they can be summarized in the following key areas: 1) empirically, the article enhances our knowledge of the political-economic transformation of a region otherwise overlooked in social science research; 2) conceptually, Stenson offers several theoretical and analytical refrains that, while becoming increasingly commonplace, are nonetheless still germane and rightly oriented to push back against otherwise totalizing, reified accounts of roll-back/roll-out neoliberalism. A welcome new approach is offered as a corrective, the realist governmentality perspective, which emphasizes the interrelated and co-constitutive nature of politics, local culture, and habitus in processes related to the restructuring of social governance; 3) methodologically, the paper makes a pitch for the ways in which fine-grained, nuanced, mixed-method/ethnographic analyses have the potential to further problematize and recast a field of governmentality studies far too often dominated by discursive and textual approaches.
Abstract:
The welfare sector has seen considerable changes in its operational context. Welfare services respond to an increasing number of challenges as citizens are confronted with life's uncertainties and a variety of complex situations. At the same time the service-delivery system is facing problems of co-operation and the development of staff competence, as well as demands to improve service effectiveness and outcomes. In order to ensure optimal user outcomes in this complex, evolving environment it is necessary to enhance professional knowledge and skills, and to increase efforts to develop the services. Changes are also evident in newly emerging models of knowledge production. There has been a shift from knowledge acquisition and transmission to its construction and production. New actors have stepped in, and the roles of researchers are subject to critical discussion. Research outcomes, in other words the usefulness of research with respect to practice development, are a topical agenda item. Research is needed, but if it is to be useful it needs to be not only credible but also useful in action. What do we know about different research processes in practice? What conceptions, approaches, methods and actor roles are embedded in them? What is their effect on practice? How does 'here and now' practice challenge research methods? This article is based on the research processes conducted in the institutes of practice research in social work in Finland. It analyses the different approaches applied, elucidating their theoretical standpoints and the critical elements embedded in them, and reflects on the outcomes in and for practice. It highlights the level of change and progression in practice research, arguing for diverse practice research models with a solid theoretical grounding, rigorous research processes, and a supportive infrastructure.
Abstract:
The volume consists of twenty-five chapters selected from among peer-reviewed papers presented at the CELDA (Cognition and Exploratory Learning in the Digital Age) 2013 Conference, held in Fort Worth, Texas, USA, in October 2013, and also contributed by world-class scholars in e-Learning systems, environments and approaches. The following sub-topics are included: Exploratory Learning Technologies (Part I), e-Learning social web design (Part II), Learner communities through e-Learning implementations (Part III), and Collaborative and student-centered e-Learning design (Part IV). E-Learning has been, since its initial stages, a synonym for flexibility. While this dynamic nature has mainly been associated with time and space, it is safe to argue that it currently embraces other aspects, such as the learners' profile, the scope of subjects that can be taught electronically, and the technology it employs. New technologies also widen the range of activities and skills developed in e-Learning. Electronic learning environments have evolved past the exclusive delivery of knowledge. Technology has endowed e-Learning with the possibility of remotely fostering problem-solving skills, critical thinking and team work, by investing in information exchange, collaboration, personalisation and community building.
Abstract:
Due to its scope and depth, Moore's Causation and Responsibility is probably the most important publication in the philosophy of law since the publication of Hart and Honoré's Causation in the Law in 1959. This volume offers, for the first time, a detailed exchange between legal and philosophical scholars over Moore's most recent work. In particular, it pioneers the dialogue between English-speaking and German philosophy of law on a broad range of pressing foundational questions concerning causation in the law. It thereby fulfills the need for a comprehensive, international and critical discussion of Moore's influential arguments. The 15 contributors to the volume span the whole interdisciplinary field from law and morals to metaphysics, and they include distinguished criminal and tort lawyers as well as prominent theoretical and practical philosophers from four nations. In addition, young researchers take brand-new approaches in the field. The collection is essential reading for anyone interested in legal and moral theory.
Abstract:
PURPOSE To assess the extent of early recoil in patients with critical limb ischemia (CLI) undergoing conventional tibial balloon angioplasty. METHODS Our hypothesis was that early recoil, defined as lumen compromise >10%, is frequent and accounts for considerable luminal narrowing after tibial angioplasty, promoting restenosis. To test this hypothesis, 30 consecutive CLI patients (18 men; mean age 76.2±12.1 years) were evaluated angiographically immediately after tibial balloon angioplasty and again 15 minutes later. Half of the patients were diabetic. Target lesions included the anterior and posterior tibial arteries and the peroneal artery, with or without the tibioperoneal trunk. Mean tibial lesion length was 83.8 mm. Early elastic recoil was determined on the basis of minimal lumen diameter (MLD) measurements at baseline (MLDbaseline), immediately after tibial balloon angioplasty (MLDpostdilation), and 15 minutes thereafter (MLD15min). RESULTS Elastic recoil was observed in 29 (97%) patients, with a mean luminal compromise of 29% according to the MLD measurements (MLDbaseline 0.23 mm, MLDpostdilation 2.0 mm, and MLD15min 1.47 mm). CONCLUSION Early recoil is frequently observed in CLI patients undergoing tibial angioplasty and may contribute significantly to restenosis. These findings support the role of dedicated mechanical scaffolding approaches for the prevention of restenosis in tibial arteries.
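The recoil figure can be approximately reproduced from the reported means. The sketch below assumes "luminal compromise" means the loss of post-dilation lumen diameter at 15 minutes relative to the post-dilation value; since the paper averages per-patient percentages, the figure computed from pooled means need not equal the reported 29% exactly.

```python
def recoil_percent(mld_postdilation_mm: float, mld_15min_mm: float) -> float:
    """Early elastic recoil as percent loss of the post-dilation lumen."""
    return 100 * (mld_postdilation_mm - mld_15min_mm) / mld_postdilation_mm

print(f"{recoil_percent(2.0, 1.47):.1f}%")   # ~26.5% from the pooled means
print(recoil_percent(2.0, 1.47) > 10)        # exceeds the >10% recoil cutoff
```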