730 results for Constructivist approaches


Relevance: 20.00%

Abstract:

This dissertation deals with the period bridging the era of extreme housing shortages in Stockholm on the eve of industrialisation and the much admired programmes of housing provision that followed after the Second World War, when the Stockholm district of Vällingby became an example for underground-railway-serviced "new towns". It is argued that important changes made to housing and town planning policy in Stockholm in this period paved the way for the successful period that followed. Foremost among these changes was the uniquely developed practice of municipal leaseholding with the help of site leasehold rights (Erbbaurecht). The study is informed by recent developments in Foucauldian social research that go under the heading of 'governmentality'. Developments within urban planning are understood as different solutions to the problem of urban order. To a large extent, urban and housing policies changed during the period from direct interventions into the lives of inhabitants, connected to a liberal understanding of housing provision, to the building of a disciplinary city and the conduct of 'governmental' power, building on increased activity on the part of the local state to provide housing and on the integration and co-operation of large collectives. Municipal leaseholding was a fundamental means for the implementation of this policy. When the new policies were introduced, they were limited to the outer parts of the city and administered by special administrative bodies. This administrative and spatial separation was largely upheld throughout the period and was represented as the parallel building of a 'social' outer city, while things in the inner 'mercantile' city proceeded more or less as before.
This separation was founded on a radical difference in landholding policy: while sites in the inner city were privatised and sold at market values, land in the outer city was mostly leasehold land, distributed according to administrative, and thus politically decided, priorities. These differences were also understood and acknowledged by the inhabitants. Thorough studies of the local press and of the organisational life of the southern parts of the outer city reveal that local identity was tightly connected with the representations attached to the different landholding systems. Inhabitants of the south-western parts of the city, which in this period were still largely built on private sites, displayed a spatial understanding built on the contradiction between centre and periphery. The inhabitants living on leasehold sites, however, showed a clear understanding of their position as members of model communities, tightly connected to the policy of the municipal administration. The organisations on leasehold sites also displayed deep co-operation with the administration. As the analyses of election results show, these inhabitants also seem to have felt a greater degree of integration with society at large than people living in other parts of the city. The leaseholding system in Stockholm has persisted until today and has been one of the strongest in the world, although local neo-liberal politicians are currently dismantling it.

Relevance: 20.00%

Abstract:

Many combinatorial problems coming from the real world do not have a clear, well-defined structure, typically being complicated by side constraints or being composed of two or more sub-problems, usually not disjoint. Such problems are not suited to pure approaches based on a single programming paradigm, because a paradigm that effectively handles one problem characteristic may behave inefficiently when facing others. In these cases, modelling the problem using different programming techniques, trying to "take the best" from each, can produce solvers that largely dominate pure approaches. We demonstrate the effectiveness of hybridization and discuss different hybridization techniques by analyzing two classes of problems with particular structures, exploiting Constraint Programming and Integer Linear Programming solving tools, with Algorithm Portfolios and Logic-Based Benders Decomposition as integration and hybridization frameworks.
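As an illustration of the logic-based Benders idea invoked above, the following minimal sketch alternates between a master assignment problem and a feasibility subproblem that returns no-good cuts. All data and names are hypothetical, and brute-force enumeration stands in for the real CP/ILP solvers:

```python
from itertools import product

# Hypothetical toy instance: job durations, per-machine assignment costs,
# and a capacity (maximum total duration) per machine.
durations = [3, 4, 2, 5]
cost = [[2, 3], [4, 1], [3, 3], [1, 4]]   # cost[j][m]
capacity = [7, 8]
n_jobs, n_machines = 4, 2

def solve_master(cuts):
    """Brute-force stand-in for the ILP master: pick the cheapest assignment
    that violates no Benders cut collected so far."""
    best = None
    for assign in product(range(n_machines), repeat=n_jobs):
        if any(all(assign[j] == m for j in jobs) for jobs, m in cuts):
            continue  # repeats a job set already proven infeasible on m
        c = sum(cost[j][assign[j]] for j in range(n_jobs))
        if best is None or c < best[0]:
            best = (c, assign)
    return best

def subproblem_cuts(assign):
    """Feasibility subproblem: each machine's load must fit its capacity.
    Returns one no-good cut per overloaded machine."""
    cuts = []
    for m in range(n_machines):
        jobs = tuple(j for j in range(n_jobs) if assign[j] == m)
        if sum(durations[j] for j in jobs) > capacity[m]:
            cuts.append((jobs, m))  # forbid this whole job set on m again
    return cuts

def lbbd():
    """Logic-based Benders loop: master, then subproblem, until feasible."""
    cuts = []
    while True:
        c, assign = solve_master(cuts)
        new = subproblem_cuts(assign)
        if not new:
            return c, assign  # master optimum passed the subproblem
        cuts.extend(new)
```

Because the cuts only ever exclude infeasible assignments, the first master solution that passes the subproblem is also globally optimal.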

Relevance: 20.00%

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology, which quantifies the expression of thousands of genes simultaneously by measuring the hybridization of a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the groups of individuals. Even though methods to analyse such data are today well developed and close to a standard organization (through the efforts of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to stumble on a clinician's question for which no compelling statistical method exists to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches for handling open problems posed by clinicians within specific experimental designs. Chapter 1, starting from the necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analyses in the rest of the dissertation.
Chapter 2 provides a critical review of standard analysis methods, stressing the open problems that motivate the subsequent chapters. Chapter 3 introduces a method to address the issue of unbalanced designs in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results: in a two-class problem, an equal or similar number of samples should be collected for the two classes. In some cases, however, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After the 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000, based on its recurrence as differentially expressed in the 1,000 lists. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe and SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score > 300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although the standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4].
In fact, looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role: in some cases similarities can give useful, and sometimes even more important, information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third is similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [4] is a possible solution to the limitations of standard supervised classification. RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM). Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue; this is a highly important, but often overlooked, capability of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then to evaluate the third class, G2, as a test set, obtaining for each G2 sample the probability of membership in class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to that of grade 1 samples. This result had been conjectured in the literature, but no measure of significance had been given before.
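The MultiSAM resampling scheme of Chapter 3 can be sketched as follows. This is a minimal illustration in which a per-probe t-test stands in for the SAM statistic, the reiteration count is reduced from 1,000 to 200, and the data are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def multisam_scores(lpc, mpc, n_iter=200, alpha=0.01):
    """Score each probe by how often it is called differentially expressed
    when the small class (lpc: probes x samples) is compared with repeated
    equal-size random samplings from the large class (mpc)."""
    n_small = lpc.shape[1]
    scores = np.zeros(lpc.shape[0], dtype=int)
    for _ in range(n_iter):
        cols = rng.choice(mpc.shape[1], size=n_small, replace=False)
        _, p = stats.ttest_ind(lpc, mpc[:, cols], axis=1)
        scores += (p < alpha).astype(int)
    return scores  # 0 .. n_iter; high recurrence = robustly differential

# Synthetic check: 100 probes, the first 5 truly shifted in the small class.
lpc = rng.normal(0.0, 1.0, size=(100, 8))
lpc[:5] += 3.0
mpc = rng.normal(0.0, 1.0, size=(100, 60))
scores = multisam_scores(lpc, mpc)
```

Probes with a genuine expression shift recur in almost every resampled comparison, while null probes are called only at roughly the nominal rate, which is what makes the recurrence score robust to the class imbalance.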
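The Chapter 4 similarity analysis amounts to fitting a probabilistic classifier on the two extreme grades and reading off posterior class probabilities for the intermediate grade. In this sketch a logistic regression stands in for the RVM (whose advantage is precisely this kind of posterior estimate), and the three groups are synthetic stand-ins for the real expression data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic stand-in data: 20 genes; G2 is generated closer to G1 than to G3.
g1 = rng.normal(0.0, 1.0, size=(67, 20))
g3 = rng.normal(2.0, 1.0, size=(54, 20))
g2 = rng.normal(0.5, 1.0, size=(100, 20))

# Fit the two extreme grades with a probabilistic classifier.
X = np.vstack([g1, g3])
y = np.array([0] * len(g1) + [1] * len(g3))   # 0 = G1, 1 = G3
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Posterior probability that each G2 sample belongs to class G1.
p_g1 = clf.predict_proba(g2)[:, 0]
mean_p_g1 = p_g1.mean()   # > 0.5 indicates G2 resembles G1 more than G3
```

The per-sample posteriors, rather than hard labels, are what allow a statement such as "grade 2 profiles are more similar to grade 1" to be made with an attached level of confidence.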

Relevance: 20.00%

Abstract:

The inherent stochastic character of most of the physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed; however, it is widely acknowledged that the only universal method available to accurately solve any kind of stochastic mechanics problem is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability, and its computational efficiency and robustness are very good, even when dealing with strongly non-Gaussian distributions. What is more, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of the constituent bars in existing truss structures, using static loads and strain measurements. In the cases of missing data and of damage affecting only a small portion of a bar, Genetic Algorithms have proved to be an effective tool for solving the problem.
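The translation-field construction behind the proposed simulation method, a Gaussian field generated by the spectral representation method and then mapped through the target marginal as Y = F^-1(Phi(G)), can be sketched in 1D. The spectrum, the lognormal marginal and all parameter values below are illustrative assumptions, not the thesis's actual choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def gaussian_field_1d(n, dx, psd, n_harm=256, w_max=4.0):
    """Spectral-representation sample of a zero-mean, unit-variance
    homogeneous Gaussian process from a one-sided power spectrum psd(w)."""
    dw = w_max / n_harm
    w = (np.arange(n_harm) + 0.5) * dw          # harmonic frequencies
    amp = np.sqrt(2.0 * psd(w) * dw)            # spectral amplitudes
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_harm)  # random phases
    x = np.arange(n) * dx
    g = np.sqrt(2.0) * (amp * np.cos(np.outer(x, w) + phi)).sum(axis=1)
    return g / np.sqrt((amp ** 2).sum())        # normalise to unit variance

def translate(g, target_ppf):
    """Translation field: map the Gaussian sample to the target marginal
    via Y = F^-1(Phi(G))."""
    return target_ppf(stats.norm.cdf(g))

# Example: exponential-type correlation spectrum, lognormal marginal.
psd = lambda w: (1.0 / np.pi) / (1.0 + w ** 2)
g = gaussian_field_1d(n=2048, dx=0.1, psd=psd)
y = translate(g, stats.lognorm(s=0.5).ppf)
```

Because the non-Gaussianity enters only through the monotone marginal transform, properties of the underlying Gaussian field (such as crossing structure) carry over to the translated sample in the well-known "translation field" sense.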

Relevance: 20.00%

Abstract:

Technology scaling increasingly emphasizes the complexity and non-ideality of the electrical behavior of semiconductor devices and boosts interest in alternatives to the conventional planar MOSFET architecture. TCAD simulation tools are fundamental to the analysis and development of new technology generations. However, the increasing device complexity is reflected in an increased dimensionality of the problems to be solved. The trade-off between accuracy and computational cost of the simulation is especially influenced by domain discretization: mesh generation is therefore one of the most critical steps, and automatic approaches are sought. Moreover, the problem size is further increased by process variations, calling for a statistical representation of the single device through an ensemble of microscopically different instances. The aim of this thesis is to present multi-disciplinary approaches to handle this increasing problem dimensionality from a numerical simulation perspective. The topic of mesh generation is tackled by presenting a new Wavelet-based Adaptive Method (WAM) for the automatic refinement of 2D and 3D domain discretizations. Multiresolution techniques and efficient signal processing algorithms are exploited to increase grid resolution in the domain regions where the relevant physical phenomena take place. Moreover, the grid is dynamically adapted to follow solution changes produced by bias variations, and quality criteria are imposed on the produced meshes. The further dimensionality increase due to variability in extremely scaled devices is considered with reference to two increasingly critical phenomena, namely line-edge roughness (LER) and random dopant fluctuations (RD).
The impact of these phenomena on FinFET devices, which represent a promising alternative to planar CMOS technology, is estimated through 2D and 3D TCAD simulations and statistical tools, taking into account the matching performance of single devices as well as of basic circuit blocks such as SRAMs. Several process options are compared, including resist- and spacer-defined fin patterning as well as different doping profile definitions. Combining statistical simulations with experimental data, the potentialities and shortcomings of the FinFET architecture are analyzed, and useful design guidelines are provided that boost the feasibility of this technology for mainstream applications in sub-45 nm generation integrated circuits.
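The wavelet-driven refinement criterion at the core of WAM can be illustrated with a minimal 1D sketch: one level of Haar detail coefficients flags the cells where the solution varies sharply, for instance across a steep junction. The solution profile and the threshold below are hypothetical:

```python
import numpy as np

def refine_flags(u, threshold=0.05):
    """One level of a Haar wavelet analysis on a 1D solution profile:
    large detail coefficients mark cells where the solution varies sharply
    and the mesh should be refined; small ones allow coarsening."""
    assert len(u) % 2 == 0
    detail = (u[0::2] - u[1::2]) / np.sqrt(2.0)   # Haar detail coefficients
    flags = np.abs(detail) > threshold            # one flag per cell pair
    return np.repeat(flags, 2)                    # map back to the fine grid

# Example: a smooth profile with an abrupt transition at x = 0.5,
# loosely mimicking a doping junction.
x = np.linspace(0.0, 1.0, 128)
u = np.tanh((x - 0.5) / 0.01)     # steep transition region
flags = refine_flags(u)
```

Only the few cells around the transition are flagged, which is the multiresolution property the method exploits: resolution is spent where the physics demands it, and the same thresholding extends dimension by dimension to 2D and 3D tensor-product grids.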

Relevance: 20.00%

Abstract:

Animal models have been crucial for studying the molecular mechanisms of cancer and for developing new antitumor agents. However, the large evolutionary divergence between mouse and human has made it difficult to translate the achievements of preclinical mouse-based studies. The generation of clinically relevant murine models requires their humanization, both through the creation of transgenic models and through the generation of humanized mice engrafted with a functional human immune system, so as to reproduce the physiological effects and molecular mechanisms of growth and metastasization of human tumors. In particular, the availability of genotypically stable immunodepressed mice able to accept tumor injection and to allow human tumor growth and metastasization is important for developing anti-tumor and anti-metastatic strategies. Recently, Rag2-/-;gammac-/- mice, double knockouts for genes involved in lymphocyte differentiation, were developed (CIEA, Central Institute for Experimental Animals, Kawasaki, Japan). Studies of human sarcoma metastasization in Rag2-/-;gammac-/- mice (which lack B, T and NK cell functionality) revealed their high metastatic efficiency and allowed the expression of human metastatic phenotypes not detectable in the conventionally used nude murine model. In vitro analyses investigating the molecular mechanisms involved in the specific pattern of human sarcoma metastasization revealed the importance of liver-produced growth and motility factors, in particular the insulin-like growth factors (IGFs). The involvement of these growth factors was then demonstrated in vivo through inhibition of the IGF signalling pathway. Owing to the high growth and metastatic propensity of tumor cells in this model, Rag2-/-;gammac-/- mice were used to investigate the metastatic behavior of rhabdomyosarcoma cells engineered to improve their differentiation.
It has recently been shown that this immunodeficient model can be reconstituted with a human immune system through the injection of human cord blood progenitor cells. The work illustrated in this thesis revealed that the injection of different human progenitor cells (CD34+ or CD133+) resulted in peculiar engraftment and differentiation abilities. Cell vaccination experiments were performed to investigate the functionality of the engrafted human immune system and the induction of specific human immune responses. Results from such experiments will make it possible to collect information about the human immune responses activated during cell vaccination and to define the best reconstitution and experimental conditions for a humanized model in which to study, in a preclinical setting, immunological antitumor strategies.

Relevance: 20.00%

Abstract:

The focus of this research is to develop and apply an analytical framework for evaluating the effectiveness and practicability of sustainability certification schemes for biofuels, especially from a developing country's perspective. The main question that drives the analysis is: "What are the main elements of, and how does one develop, sustainability certification schemes that would be effective and practicable in certifying the contribution of biofuels to meeting the goals that Governments and other stakeholders have set?". Biofuels have been identified as a promising tool for reaching a variety of goals: climate change protection, energy security, agricultural development and, especially in developing countries, economic development. Once these goals had been identified, and ambitious mandatory targets for biofuel use agreed at the national level, concerns were raised by the scientific community about the negative environmental, social and economic externalities that biofuel production and use can have. Certification schemes have therefore been recognized as necessary processes for measuring these externalities, and examples of such schemes are in effect, or in a negotiating phase, at both mandatory and voluntary levels. The research focus emerged from the concern that the existing examples are very demanding in terms of compliance, both for those subject to certification and for those who have to certify, regarding the quantity and quality of the information to be reported. For reasons linked to costs, lack of expertise, inadequate infrastructure and the absence of administrative and legislative support, a certification system can represent a heavy burden and can act as a serious impediment to the industrial and agricultural development of developing countries, going against the principles of equity and a level playing field.
While this research recognizes the importance of comprehensiveness and ambition in designing such a tool for measuring the sustainability effects of biofuel production and use, it stresses the need to focus on the tool's effectiveness and practicability in measuring compliance with the goals. This research, which falls under the rationale of the Sustainability Science Program housed at the Harvard Kennedy School, has as its main objective to close the gap between the research and policy-making worlds in the field of sustainability certification schemes for biofuels.