918 results for Models in art


Relevance:

100.00%

Publisher:

Abstract:

Natural disasters are events that cause general and widespread destruction of the built environment and are becoming increasingly recurrent. They are a product of vulnerability and community exposure to natural hazards, generating a multitude of social, economic and cultural issues, of which the loss of housing and the subsequent need for shelter is one of the major consequences. Nowadays, numerous factors contribute to increased vulnerability and exposure to natural disasters, such as climate change, whose impacts are felt across the globe and which is currently seen as a worldwide threat to the built environment. The abandonment of disaster-affected areas can also push populations towards regions where natural hazards are felt more severely. Although several actors in the post-disaster scenario provide for shelter needs and recovery programs, housing is often inadequate and unable to resist the effects of future natural hazards. Resilient housing is commonly not addressed because of the urgency of sheltering affected populations. However, by neglecting risks of exposure in construction, houses become vulnerable and are likely to be damaged or destroyed in future natural hazard events. It therefore becomes fundamental to include resilience criteria in housing, which in turn will allow new houses to better withstand the passage of time and natural disasters as safely as possible. This master thesis is intended to provide guiding principles for housing recovery after natural disasters, particularly in the form of flood-resilient construction, considering that floods are responsible for the largest number of natural disasters. To this purpose, the main structures that house affected populations were identified and analyzed in depth. After assessing the risks and damage that flood events can cause to housing, a methodology for flood-resilient housing models was proposed, in which key criteria that housing should meet were identified. This methodology is based on the US Federal Emergency Management Agency requirements and recommendations for specific flood zones. Finally, a case study in the Maldives – one of the countries most vulnerable to sea level rise resulting from climate change – was analyzed in light of housing recovery in a post-disaster scenario. This analysis was carried out using the proposed methodology, with the intent of assessing the flood resilience of the housing newly built in the aftermath of the 2004 Indian Ocean Tsunami.
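
As a purely illustrative sketch of the kind of flood-zone criterion such a methodology checks, the snippet below tests whether a house design's lowest floor sits at or above the base flood elevation plus a zone-dependent freeboard. The zone labels, freeboard values and field names are assumptions made for this example and are not the criteria defined in the thesis.

```python
# Illustrative sketch only: a simplified flood-zone elevation check inspired by
# FEMA/NFIP-style rules. Zone labels, freeboard values and field names are
# assumptions for illustration, not the criteria proposed in the thesis.
from dataclasses import dataclass

# Hypothetical minimum freeboard (metres above base flood elevation) per zone.
FREEBOARD_M = {"A": 0.3, "V": 0.6, "X": 0.0}

@dataclass
class HouseDesign:
    flood_zone: str             # e.g. "A", "V", "X"
    lowest_floor_elev_m: float  # elevation of the lowest occupied floor
    base_flood_elev_m: float    # base flood elevation (BFE) at the site

def meets_elevation_criterion(house: HouseDesign) -> bool:
    """Return True if the lowest floor sits at or above BFE plus the zone freeboard."""
    required = house.base_flood_elev_m + FREEBOARD_M.get(house.flood_zone, 0.0)
    return house.lowest_floor_elev_m >= required

if __name__ == "__main__":
    house = HouseDesign(flood_zone="A", lowest_floor_elev_m=2.1, base_flood_elev_m=1.9)
    print("Elevation criterion met:", meets_elevation_criterion(house))
```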

Relevance:

100.00%

Publisher:

Abstract:

This research uses the textile/text axis concept as a conceptual tool to investigate the role of textile and text in contemporary women’s art practice and theorizing, examining textile as a hitherto largely unacknowledged element in women’s art practice of the late 20th and early 21st centuries. Textile and text share a common etymological root, from the Latin texere, to weave, and textus, a fabric. The thesis illuminates the pathways whereby textile and text played an important role in women reclaiming a speaking voice as creators of culture and signification during a revolutionary period of renewal in women’s cultural contribution and positioning. The methodological approach used in the research consisted of a comprehensive literature review, the compilation of an inventory of relevant women artists, the development of a classificatory system differentiating the types of approaches, concerns and concepts underpinning women’s art practice vis-à-vis the textile/text axis, and a series of three in-depth case studies of the artists Tracey Emin, Louise Bourgeois and Faith Ringgold. The thesis points to the fact that contemporary women artists and theorists have grounded their art practice and aesthetic discourse in textile as a prime visual metaphor and signifier, turning towards the ancient language of textile not merely to reclaim a speaking voice but to occupy a groundbreaking locus of signification and representation in contemporary culture. The textile/text axis enabled women artists to powerfully counter a culturally inscribed status of Lacanian ‘no-woman’ (a position of abjection, absence and lack in the phallocentric symbolic). Turning towards a language of aeons, textile as fertile wellspring, the thesis identifies the methodologies and strategies whereby women artists have inserted their webs of subjectivities and deepest concerns into the records and discourses of contemporary culture. Presenting an anatomy of the textile/text axis, the thesis identifies nine component elements manifesting in contemporary women’s aesthetic practice and discourse. In this cultural renaissance, the thesis suggests, the textile/text axis served as a complex lexicon, a system of labyrinthine references and signification, a site of layered meanings and ambiguities, a body proxy and a corporeal cartography, facilitating a revolution in women’s aesthetic praxis.

Relevance:

100.00%

Publisher:

Abstract:

When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor insurance or homeowner's insurance policy, they usually assume that the types of claim are independent. However, this assumption may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to account for an excess of zeros and for overdispersion. These models have been largely ignored to date, mainly because of their computational difficulties. Bayesian inference based on MCMC helps to solve this problem (and also lets us derive posterior summaries for several quantities of interest, in order to account for uncertainty). Finally, these models are applied to an automobile insurance claims database with three different types of claims. We analyse the consequences for pure and loaded premiums when the independence assumption is relaxed by using different multivariate Poisson regression models and their zero-inflated versions.
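
To make the pricing consequence concrete, the sketch below simulates two positively correlated claim types through a common-shock construction and compares a variance-loaded premium computed with and without the independence assumption; the pure premium, being a mean, is unaffected. The claim intensities and the loading factor are hypothetical, and the sketch does not implement the zero-inflated multivariate Poisson regressions or the MCMC inference used in the paper.

```python
# Minimal sketch: a common-shock bivariate Poisson simulation illustrating why
# relaxing the independence assumption changes variance-loaded premiums even
# though the pure premium (the mean) is unaffected. All rates are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
lam1, lam2, lam0 = 0.08, 0.05, 0.02   # marginal and shared claim intensities

x1 = rng.poisson(lam1, n)
x2 = rng.poisson(lam2, n)
x0 = rng.poisson(lam0, n)             # shared shock inducing positive correlation
y1, y2 = x1 + x0, x2 + x0             # two correlated claim types per policy

total = y1 + y2
mean_total = total.mean()             # pure premium per unit severity
var_dep = total.var()                 # variance with dependence
var_indep = y1.var() + y2.var()       # variance if (wrongly) assumed independent

loading = 0.5                         # hypothetical variance loading factor
print(f"pure premium         : {mean_total:.4f}")
print(f"loaded (dependent)   : {mean_total + loading * var_dep:.4f}")
print(f"loaded (independent) : {mean_total + loading * var_indep:.4f}")
```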

Relevance:

100.00%

Publisher:

Abstract:

Chagas disease, a neglected illness, affects nearly 12-14 million people in endemic areas of Latin America. Although the occurrence of acute cases has declined sharply due to the Southern Cone Initiative efforts to control vector transmission, serious challenges remain, including the maintenance of sustainable public policies for Chagas disease control and the urgent need for better drugs to treat chagasic patients. Since the introduction of benznidazole and nifurtimox approximately 40 years ago, many natural and synthetic compounds have been assayed against Trypanosoma cruzi, yet only a few compounds have advanced to clinical trials. This reflects, at least in part, the lack of consensus regarding appropriate in vitro and in vivo screening protocols, as well as the lack of biomarkers for monitoring parasitaemia during treatment. The development of more effective drugs requires (i) the identification and validation of parasite targets, (ii) compounds to be screened against the targets or the whole parasite and (iii) a panel of minimum standardised procedures to advance lead compounds to clinical trials. This third aim was the topic of the workshop entitled Experimental Models in Drug Screening and Development for Chagas Disease, held in Rio de Janeiro, Brazil, on 25-26 November 2008 by the Fiocruz Program for Research and Technological Development on Chagas Disease and the Drugs for Neglected Diseases Initiative. During the meeting, the minimum steps, requirements and decision gates for determining the efficacy of novel drugs for T. cruzi control were evaluated by interdisciplinary experts, and an in vitro and in vivo flowchart was designed to serve as a general and standardised protocol for screening potential drugs for the treatment of Chagas disease.

Relevance:

100.00%

Publisher:

Abstract:

Study carried out during a stay at the Stanford University School of Medicine, Division of Radiation Oncology, United States, between 2010 and 2012. During the two years of the postdoctoral fellowship I worked on two different projects. First, as a continuation of the group's previous studies, we wanted to investigate the cause of the differences in hypoxia levels that we had observed in lung cancer models. Our hypothesis was that these differences were due to the functionality of the vasculature. We used two preclinical models: one in which tumors formed spontaneously in the lungs and another in which we injected the cells subcutaneously. We used techniques such as dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and the Hoechst 33342 perfusion assay, and both showed that the functionality of the vasculature of the spontaneous tumors was much higher than that of the subcutaneous tumors. From this study, we can conclude that the differences in hypoxia levels between the different lung cancer tumor models could be due to variation in the formation and functionality of the vasculature. Therefore, the selection of preclinical models is essential, both for studies of hypoxia and angiogenesis and for therapies targeting these phenomena. The other project I have been developing is based on the study of radiotherapy and its possible effects in promoting tumor self-seeding by circulating tumor cells (CTCs). This effect has been described in some preclinical tumor models. To carry out our studies, we used a mouse breast cancer tumor line, either permanently labeled with the Photinus pyralis gene or unlabeled, and performed in vitro and in vivo studies. Both studies showed that tumor irradiation promotes cell invasion and tumor self-seeding by CTCs. This finding should be considered within the context of clinical radiotherapy in order to achieve the best treatment for patients with elevated CTC levels.

Relevance:

100.00%

Publisher:

Abstract:

Background: In a previous study, the European Organisation for Research and Treatment of Cancer (EORTC) reported a scoring system to predict the survival of patients with low-grade gliomas (LGGs). A major issue in the diagnosis of brain tumors is the lack of agreement among pathologists. New models based on patients with LGGs diagnosed by central pathology review are needed. Methods: Data from 339 EORTC patients with LGGs diagnosed by central pathology review were used to develop new prognostic models for progression-free survival (PFS) and overall survival (OS). Data from 450 patients with centrally diagnosed LGGs recruited into 2 large studies conducted by North American cooperative groups were used to validate the models. Results: Both PFS and OS were negatively influenced by the presence of baseline neurological deficits, a shorter time since first symptoms (<30 wk), an astrocytic tumor type, and tumors larger than 5 cm in diameter. Early irradiation improved PFS but not OS. Three risk groups (low, intermediate, and high) were identified and validated. Conclusions: We have developed new prognostic models in a more homogeneous LGG population diagnosed by central pathology review. This population better fits modern practice, in which patients are enrolled in clinical trials on the basis of central or panel pathology review. We were able to validate the models in a large, external, independent dataset. The models divide LGG patients into 3 risk groups and provide reliable individual survival predictions. Inclusion of other clinical and molecular factors might further improve the models' predictions.
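
Purely for illustration, the sketch below counts the four unfavorable factors named above and maps the count onto three risk groups; the cut-offs are an assumption made for this example and do not reproduce the published EORTC prognostic models.

```python
# Illustrative sketch only: a factor-counting classifier using the four unfavorable
# factors named in the abstract. The cut-offs (0-1 / 2 / 3-4 factors) are an
# assumption for illustration, not the published EORTC scoring system.
def lgg_risk_group(neuro_deficit: bool,
                   weeks_since_symptoms: float,
                   astrocytic_histology: bool,
                   diameter_cm: float) -> str:
    score = sum([
        neuro_deficit,
        weeks_since_symptoms < 30,
        astrocytic_histology,
        diameter_cm > 5,
    ])
    if score <= 1:
        return "low risk"
    if score == 2:
        return "intermediate risk"
    return "high risk"

print(lgg_risk_group(neuro_deficit=True, weeks_since_symptoms=12,
                     astrocytic_histology=False, diameter_cm=6.2))
```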

Relevance:

100.00%

Publisher:

Abstract:

In recent years, both homing endonucleases (HEases) and zinc-finger nucleases (ZFNs) have been engineered and selected for the targeting of desired human loci for gene therapy. However, enzyme engineering is lengthy and expensive, and the off-target effects of the manufactured endonucleases are difficult to predict. Moreover, enzymes selected to cleave a human DNA locus may not cleave the homologous locus in the genome of animal models because of sequence divergence, thus hampering attempts to assess the in vivo efficacy and safety of an engineered enzyme prior to its application in human trials. Here, we show that naturally occurring HEases can be found that cleave desirable human targets. Some of these enzymes are also shown to cleave the homologous sequence in the genome of animal models. In addition, the distribution of off-target effects may be more predictable for native HEases. Based on our experimental observations, we present the HomeBase algorithm, database and web server, which allow a high-throughput computational search and assignment of HEases for the targeting of specific loci in the human and other genomes. We experimentally validate the predicted target specificity of candidate fungal, bacterial and archaeal HEases using cell-free, yeast and archaeal assays.
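
The sketch below illustrates the kind of scan such a pipeline performs: sliding a known recognition sequence along a target and reporting windows within a mismatch tolerance. The sequences and the tolerance are hypothetical, and the real HomeBase algorithm additionally handles degenerate positions and experimentally derived cleavage data.

```python
# Minimal sketch of the kind of scan a tool like HomeBase performs: slide a known
# HEase recognition sequence along a target and report sites within a mismatch
# tolerance. The site, target and tolerance below are hypothetical.
def find_near_matches(target: str, site: str, max_mismatches: int = 2):
    hits = []
    for i in range(len(target) - len(site) + 1):
        window = target[i:i + len(site)]
        mismatches = sum(a != b for a, b in zip(window, site))
        if mismatches <= max_mismatches:
            hits.append((i, window, mismatches))
    return hits

# Hypothetical recognition site and a short target carrying one near-match.
site = "TAGGGATAACAGGGTAAT"
mutated = site[:6] + "C" + site[7:]          # one substitution relative to the site
target = "GCCT" + mutated + "TTGC"
for pos, seq, mm in find_near_matches(target.upper(), site.upper()):
    print(f"position {pos}: {seq} ({mm} mismatches)")
```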

Relevance:

100.00%

Publisher:

Abstract:

My research in live drawing and new technologies combines a live human figure in composition with an overlaid digital projection of a second human figure. The aim is to explore, amplify and thoroughly analyse the search for distinctive identities and graphic languages of representation for live and projected models.

Relevance:

100.00%

Publisher:

Abstract:

We numerically study the dynamical properties of fully frustrated models in two and three dimensions. The results obtained support the hypothesis that the percolation transition of the Kasteleyn-Fortuin clusters corresponds to the onset of stretched exponential autocorrelation functions in systems without disorder. This dynamical behavior may be due to the large scale effects of frustration, present below the percolation threshold. Moreover, these results are consistent with the picture suggested by Campbell et al. [J. Phys. C 20, L47 (1987)] in the space of configurations.
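
For clarity, the stretched-exponential (Kohlrausch) form referred to above can be written as follows; the values of the relaxation time and the stretching exponent for these particular models are not reproduced here.

```latex
% Stretched-exponential (Kohlrausch) form of the autocorrelation functions referred
% to above; C(t) denotes a normalized spin autocorrelation function, and the values
% of the relaxation time \tau and the exponent \beta are not reproduced here.
C(t) \;\sim\; \exp\!\left[ -\left( \frac{t}{\tau} \right)^{\beta} \right],
\qquad 0 < \beta < 1 .
```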

Relevance:

100.00%

Publisher:

Abstract:

Glioblastoma multiforme (GBM) is the most common and lethal of all gliomas. The current standard of care includes surgery followed by concomitant radiation and chemotherapy with the DNA alkylating agent temozolomide (TMZ). O⁶-methylguanine-DNA methyltransferase (MGMT) repairs the most cytotoxic of lesions generated by TMZ, O⁶-methylguanine. Methylation of the MGMT promoter in GBM correlates with increased therapeutic sensitivity to alkylating agent therapy. However, several aspects of TMZ sensitivity are not explained by MGMT promoter methylation. Here, we investigated our hypothesis that the base excision repair enzyme alkylpurine-DNA-N-glycosylase (APNG), which repairs the cytotoxic lesions N³-methyladenine and N⁷-methylguanine, may contribute to TMZ resistance. Silencing of APNG in established and primary TMZ-resistant GBM cell lines endogenously expressing MGMT and APNG attenuated repair of TMZ-induced DNA damage and enhanced apoptosis. Reintroducing expression of APNG in TMZ-sensitive GBM lines conferred resistance to TMZ in vitro and in orthotopic xenograft mouse models. In addition, resistance was enhanced with coexpression of MGMT. Evaluation of APNG protein levels in several clinical datasets demonstrated that in patients, high nuclear APNG expression correlated with poorer overall survival compared with patients lacking APNG expression. Loss of APNG expression in a subset of patients was also associated with increased APNG promoter methylation. Collectively, our data demonstrate that APNG contributes to TMZ resistance in GBM and may be useful in the diagnosis and treatment of the disease.
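
As an illustration of the kind of survival comparison described, the sketch below fits Kaplan-Meier curves and runs a log-rank test for hypothetical APNG-high versus APNG-low groups using the lifelines library; the data are simulated and are not the clinical datasets analysed in the study.

```python
# Sketch of the kind of survival comparison described in the abstract: Kaplan-Meier
# estimates and a log-rank test for APNG-high versus APNG-low patients. The data
# below are simulated for illustration; they are not the clinical datasets analysed.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 120
# Hypothetical survival times (months); APNG-high assumed to fare worse here.
t_high = rng.exponential(scale=12.0, size=n)
t_low = rng.exponential(scale=20.0, size=n)
e_high = rng.random(n) < 0.8          # event (death) observed indicators
e_low = rng.random(n) < 0.8

kmf = KaplanMeierFitter()
kmf.fit(t_high, event_observed=e_high, label="APNG-high")
print("median OS, APNG-high:", kmf.median_survival_time_)
kmf.fit(t_low, event_observed=e_low, label="APNG-low")
print("median OS, APNG-low :", kmf.median_survival_time_)

res = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print("log-rank p-value    :", res.p_value)
```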

Relevance:

100.00%

Publisher:

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales.
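
The fixed-lid, planar water surface idea can be sketched as follows: depth at each cross-section node is the planar water-surface elevation minus the bed elevation (clipped at zero), and the section discharge is distributed across nodes in proportion to a simple depth-based conveyance. The bed elevations, discharge and the Manning-type exponent are made up for illustration; this is not the RC model used in the study.

```python
# Illustrative sketch of a fixed-lid, planar-water-surface depth estimate and a
# simple conveyance-based distribution of discharge across a cross section.
# Bed elevations, discharge and the d**(5/3) exponent are invented for this example.
import numpy as np

bed_elev = np.array([68.0, 66.5, 65.2, 64.8, 65.9, 67.4])  # m, across the section
node_width = 250.0                                          # m per node (uniform)
water_surface = 67.0                                        # m, planar (fixed-lid) level
discharge = 12_000.0                                        # m^3/s through the section

depth = np.clip(water_surface - bed_elev, 0.0, None)        # wetted depth per node
conveyance = depth ** (5.0 / 3.0)                           # relative carrying capacity
unit_q = discharge * conveyance / conveyance.sum() / node_width   # m^2/s per unit width
velocity = np.divide(unit_q, depth, out=np.zeros_like(depth), where=depth > 0)

for i, (d, q, v) in enumerate(zip(depth, unit_q, velocity)):
    print(f"node {i}: depth {d:5.2f} m, unit q {q:6.2f} m^2/s, velocity {v:5.2f} m/s")
```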

Relevance:

100.00%

Publisher:

Abstract:

Indirect topographic variables have been used successfully as surrogates for disturbance processes in plant species distribution models (SDMs) in mountain environments. However, no SDM studies have directly tested the performance of disturbance variables. In this study, we developed two disturbance variables: a geomorphic index (GEO) and an index of snow redistribution by wind (SNOW). These were developed in order to assess how they improved both the fit and the predictive power of presence-absence SDMs based on commonly used topoclimatic (TC) variables for 91 plants in the Western Swiss Alps. The individual contribution of the disturbance variables was compared with that of the TC variables, and maps of the models were prepared to test the effect of the disturbance variables spatially. On average, the disturbance variables significantly improved the fit but not the predictive power of the TC models, and their individual contribution was weak (5.6% for GEO and 3.3% for SNOW). However, their maximum individual contribution could be substantial (24.7% and 20.7%, respectively). Finally, maps including disturbance variables (i) diverged significantly from the TC models in terms of predicted suitable area and connectivity between potential habitats, and (ii) were interpreted as more ecologically relevant. Disturbance variables did not improve the transferability of the models at the local scale in a complex mountain system, and the performance and contribution of these variables were highly species-specific. However, improved spatial projections and changes in connectivity are important issues when preparing projections under climate change, because the future range size of a species will determine its sensitivity to changing conditions.
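
The fit-versus-predictive-power comparison can be sketched with a plain logistic regression on synthetic data: one model with topoclimatic predictors only, one with an added disturbance index, compared on in-sample accuracy and cross-validated AUC. The synthetic data, variable names and model choice are assumptions for illustration and do not reproduce the SDMs used in the study.

```python
# Sketch of the fit-versus-transferability comparison described in the abstract:
# a presence-absence model with topoclimatic (TC) predictors only versus TC plus a
# disturbance variable. Data are synthetic; the model is plain logistic regression,
# not the study's SDMs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 600
temp = rng.normal(size=n)                 # topoclimatic predictors (standardised)
precip = rng.normal(size=n)
geo = rng.normal(size=n)                  # disturbance index (e.g. geomorphic activity)
logit = 0.8 * temp - 0.6 * precip + 0.4 * geo
presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_tc = np.column_stack([temp, precip])
X_full = np.column_stack([temp, precip, geo])

for name, X in [("TC only", X_tc), ("TC + disturbance", X_full)]:
    model = LogisticRegression().fit(X, presence)
    fit = model.score(X, presence)                          # in-sample accuracy (fit)
    auc = cross_val_score(LogisticRegression(), X, presence,
                          cv=5, scoring="roc_auc").mean()   # predictive power
    print(f"{name:18s} in-sample accuracy {fit:.3f}  cv AUC {auc:.3f}")
```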

Relevance:

100.00%

Publisher:

Abstract:

One of the central tasks in the statistical analysis of mathematical models is the estimation of the models' unknown parameters. This master's thesis is concerned with the distributions of the unknown parameters and with numerical methods suitable for constructing them, especially in cases where the model is nonlinear with respect to the parameters. Among the various numerical methods, the main emphasis is on Markov chain Monte Carlo (MCMC) methods. These computationally intensive methods have recently grown in popularity, mainly because of the increase in available computing power. The theory of both Markov chains and Monte Carlo simulation is presented to the extent needed to justify why the methods work. Of the recently developed methods, adaptive MCMC methods in particular are examined. The approach of the thesis is practical, and various issues related to the implementation of MCMC methods are emphasized. In the empirical part of the thesis, the distributions of the unknown parameters of five example models are examined using the methods presented in the theoretical part. The models describe chemical reactions and are expressed as systems of ordinary differential equations. The models were collected from chemists at Lappeenranta University of Technology and Åbo Akademi University in Turku.
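
A minimal sketch of the kind of computation the thesis deals with is given below: a random-walk Metropolis sampler for the posterior of a single rate constant in a toy first-order reaction, under a Gaussian error model. The reaction, data, prior and proposal scale are invented for the example; the thesis's five models and the adaptive MCMC variants are not shown.

```python
# Minimal random-walk Metropolis sketch for the kind of problem the thesis treats:
# sampling the posterior of a rate constant k in a toy A -> B reaction, y' = -k*y,
# with a Gaussian error model and a flat positive prior. Data and tuning are made up.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 5.0, 12)
k_true, sigma = 0.7, 0.03
data = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)   # synthetic observations

def log_post(k):
    if k <= 0:                                   # flat prior on k > 0
        return -np.inf
    resid = data - np.exp(-k * t)                # analytic solution of y' = -k*y, y(0)=1
    return -0.5 * np.sum(resid**2) / sigma**2

chain = np.empty(20_000)
k, lp = 0.3, log_post(0.3)
for i in range(chain.size):
    k_prop = k + rng.normal(0.0, 0.05)           # random-walk proposal
    lp_prop = log_post(k_prop)
    if np.log(rng.random()) < lp_prop - lp:      # Metropolis accept/reject step
        k, lp = k_prop, lp_prop
    chain[i] = k

burn = chain[5_000:]
print(f"posterior mean k = {burn.mean():.3f}, 95% interval "
      f"[{np.quantile(burn, 0.025):.3f}, {np.quantile(burn, 0.975):.3f}]")
```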

Relevance:

100.00%

Publisher:

Abstract:

Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimizing for maximum yield under a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analysis using stoichiometric descriptions and linear optimization methods has become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsically non-linear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon-constraint method that reduces the computational burden of generating a set of Pareto-optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and to narrow down their number prior to testing in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes ethanol production in the fermentation of Saccharomyces cerevisiae.
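
The epsilon-constraint idea can be sketched on a toy two-objective problem: one objective is maximized while the other is constrained to exceed a sweep of epsilon levels, yielding one Pareto-optimal point per level. The toy objectives and constraints are assumptions for illustration and are not a GMA metabolic model.

```python
# Sketch of the epsilon-constraint method mentioned in the abstract, on a toy
# two-objective problem (not a GMA metabolic model): maximise f1 subject to
# f2 >= eps for a sweep of eps values, collecting one Pareto-optimal point per eps.
import numpy as np
from scipy.optimize import minimize

def f1(x):   # e.g. a "product synthesis rate" (toy objective)
    return x[0] * x[1]

def f2(x):   # e.g. "growth" (toy objective), in conflict with f1 via the shared budget
    return 2.0 * x[0] - x[1]

budget = {"type": "ineq", "fun": lambda x: 4.0 - x[0] - x[1]}   # x0 + x1 <= 4
bounds = [(0.0, 3.0), (0.0, 3.0)]

pareto = []
for eps in np.linspace(-3.0, 5.0, 9):
    cons = [budget, {"type": "ineq", "fun": lambda x, e=eps: f2(x) - e}]  # f2 >= eps
    res = minimize(lambda x: -f1(x), x0=[1.0, 1.0], bounds=bounds, constraints=cons)
    if res.success:
        pareto.append((f1(res.x), f2(res.x)))

for a, b in pareto:
    print(f"f1 = {a:6.3f}   f2 = {b:6.3f}")
```

A Pareto filter, as described in the abstract, would then discard dominated or near-duplicate points from this set before any solutions are taken to the laboratory.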