Abstract:
Whilst policy makers have tended to adopt an ‘information-deficit model’ to bolster levels of flood-risk preparedness primarily through communication strategies promoting awareness, the assumed causal relation between awareness and preparedness is empirically weak. As such, there is growing interest amongst scholars and policy makers alike in better understanding why at-risk individuals are underprepared. In this vein, empirical studies, typically employing quantitative methods, have tended to focus on exploring the extent to which flood-risk preparedness levels vary depending not only on socio-demographic variables, but also (and increasingly so) on the perceptual factors that influence flood-risk preparedness. This study builds upon and extends this body of research by offering a more solution-focused approach that seeks to identify how pathways to flood-risk preparedness can be opened up. Specifically, through application of a qualitative methodology, we seek to explore how the factors that negatively influence flood-risk preparedness can be addressed to foster a shift towards greater levels of mitigation behaviour. In doing so, we focus our analysis on an urban community in Ireland that is identified as ‘at risk’ of flash flooding and is currently undergoing significant flood relief works. In this regard, the case study offers an interesting laboratory to explore how attitudes towards flood-risk preparedness at the individual level are being influenced within the context of a flood relief scheme that is only partially constructed. In order to redress the dearth of theoretically informed qualitative studies in this field, we draw on Protection Motivation Theory (PMT) to help guide our analysis and make sense of our results. Our findings demonstrate that flood-risk preparedness can be undermined by low levels of efficacy amongst individuals in terms of the preparedness measures available to them and their own personal capacity to implement them.
We also show that the ‘levee effect’ can occur before engineered flood defences are fully constructed, as the flood relief works within our case study are already beginning to affect people’s perception of flood risk in the area. We conclude by arguing that 1) individuals’ coping appraisals need to be enhanced through communication strategies and other interventions which highlight that future floods may not replicate past events; and 2) the concept of residual risk needs to be communicated at all stages of a flood relief scheme, not just upon completion.
Abstract:
Previous research claims that there has been a narrowing of distance between the Swedish political parties. Typically, such research into political distance has primarily focused on studying voters rather than the political parties themselves. In this article, the author conducts a longitudinal analysis of Comparative Manifesto Project data to determine if, and to what extent, the political parties have converged ideologically on a Left-Right continuum in the period 1991-2010. After first unraveling the concept of political distance, the author moves on to explain why the ideological dispersion of political parties is an important and consequential characteristic within party systems. Furthermore, the author argues that the Left-Right ideological scale continues to be a highly useful model with which to conceptualize and study this characteristic. The author then discusses the methodological approach and explains why quantitative manifesto data, often overlooked in favor of voter interview data, is deemed a valid and reliable material for measuring the ideological positions of political parties. The findings are that there have indeed been overall tendencies of ideological convergence between the blocs and that, in terms of how political parties are dispersed on a Left-Right ideological continuum, by 2010, the Swedish party system (the Sweden Democrats excluded) had become much less polarized than it had been in 1991.
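The convergence the author reports is usually quantified with some dispersion measure over party positions on the Left-Right scale. As a hedged illustration only (a vote-share-weighted standard deviation, a common choice, not necessarily the author's estimator, and with invented figures rather than real CMP data):

```python
def polarization(positions, vote_shares):
    """Vote-share-weighted standard deviation of party positions on a
    Left-Right scale (e.g. CMP 'rile' scores, which range -100..+100).
    A lower value indicates a more ideologically converged party system."""
    total = sum(vote_shares)
    weights = [v / total for v in vote_shares]
    mean = sum(w * p for w, p in zip(weights, positions))
    variance = sum(w * (p - mean) ** 2 for w, p in zip(weights, positions))
    return variance ** 0.5

# Invented, purely illustrative party systems: dispersed vs. converged.
dispersed = polarization([-40, -15, 10, 35], [0.3, 0.2, 0.2, 0.3])
converged = polarization([-15, -5, 5, 15], [0.3, 0.2, 0.2, 0.3])
print(dispersed > converged)  # narrower spread yields a lower index
```

Comparing the index across election years (1991 vs. 2010) is what an "over-time narrowing" claim amounts to in this framework.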
Abstract:
Surface flow types (SFT) are advocated as ecologically relevant hydraulic units, often mapped visually from the bankside to characterise rapidly the physical habitat of rivers. SFT mapping is simple, non-invasive and cost-efficient. However, it is also qualitative, subjective and plagued by difficulties in accurately recording the spatial extent of SFT units. Quantitative validation of the underlying physical habitat parameters is often lacking and, where available, does not consistently differentiate between SFTs. Here, we investigate explicitly the accuracy, reliability and statistical separability of traditionally mapped SFTs as indicators of physical habitat, using independent, hydraulic and topographic data collected during three surveys of a c. 50m reach of the River Arrow, Warwickshire, England. We also explore the potential of a novel remote sensing approach, comprising a small unmanned aerial system (sUAS) and Structure-from-Motion photogrammetry (SfM), as an alternative method of physical habitat characterisation. Our key findings indicate that SFT mapping accuracy is highly variable, with overall mapping accuracy not exceeding 74%. Results from analysis of similarity (ANOSIM) tests found that strong differences did not exist between all SFT pairs. This leads us to question the suitability of SFTs for characterising physical habitat for river science and management applications. In contrast, the sUAS-SfM approach provided high resolution, spatially continuous, spatially explicit, quantitative measurements of water depth and point cloud roughness at the microscale (spatial scales ≤1m). Such data are acquired rapidly, inexpensively, and provide new opportunities for examining the heterogeneity of physical habitat over a range of spatial and temporal scales.
Whilst continued refinement of the sUAS-SfM approach is required, we propose that this method offers an opportunity to move away from broad, mesoscale classifications of physical habitat (spatial scales 10-100m), and towards continuous, quantitative measurements of the continuum of hydraulic and geomorphic conditions which actually exists at the microscale.
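The ANOSIM test used here to assess SFT separability compares the mean rank of between-group distances against the mean rank of within-group distances over a sample dissimilarity matrix. A minimal sketch of the R statistic (simplified: no tie-averaged ranks and no permutation p-value, and not the authors' actual code):

```python
import numpy as np

def anosim_r(dist, labels):
    """ANOSIM R statistic: R = (rB - rW) / (n(n-1)/4), where rB and rW are
    the mean ranks of between- and within-group pairwise distances.
    dist: symmetric (n, n) distance matrix; labels: one group per sample.
    R near 1 means groups are well separated; near 0 means no separation."""
    labels = np.asarray(labels)
    n = len(labels)
    iu = np.triu_indices(n, k=1)              # unique sample pairs
    d = dist[iu]
    ranks = np.argsort(np.argsort(d)) + 1     # ranks 1..n(n-1)/2 (ties broken arbitrarily)
    within = labels[iu[0]] == labels[iu[1]]
    r_w = ranks[within].mean()
    r_b = ranks[~within].mean()
    return (r_b - r_w) / (n * (n - 1) / 4)
```

In practice a permutation test on shuffled labels supplies the significance level; tie-averaged ranking (e.g. `scipy.stats.rankdata`) should be used for real data.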
Abstract:
Across Europe, citizens are increasingly expected to participate in the implementation of flood risk management (FRM), by engaging in voluntary-based activities to enhance preparedness, implementing property-level measures, and so forth. Although citizen participation in FRM decision making is widely addressed in academic literature, citizens’ involvement in the delivery of FRM measures is comparatively understudied. Drawing from public administration literature, we adopted the notion of “coproduction” as an analytical framework for studying the interaction between citizens and public authorities, from the decision-making process through to the implementation of FRM in practice. We considered to what extent coproduction is evident in selected European Union (EU) member states, drawing from research conducted within the EU project STAR-FLOOD (Strengthening and Redesigning European Flood Risk Practices towards Appropriate and Resilient Flood Risk Governance Arrangements). On the basis of a cross-country comparison between Flanders (Belgium), England (United Kingdom), France, the Netherlands, and Poland, we have highlighted the varied forms of coproduction and reflected on how these have been established within divergent settings. Coproduction is most prominent in discourse and practice in England and is emergent in France and Flanders. By contrast, FRM in the Netherlands and Poland remains almost exclusively reliant on governmental protection measures and thereby consultation-based forms of coproduction. Analysis revealed how these actions are motivated by different underlying rationales, which in turn shape the type of approaches and degree of institutionalization of coproduction. In the Netherlands, coproduction is primarily encouraged to increase societal resilience, whereas public authorities in the other countries also use it to improve cost-efficiency and redistribute responsibilities to its beneficiaries.
Abstract:
Optical mapping of voltage signals has revolutionised the field and study of cardiac electrophysiology by providing the means to visualise changes in electrical activity at a high temporal and spatial resolution from the cellular to the whole heart level under both normal and disease conditions. The aim of this thesis was to develop a novel method of panoramic optical mapping using a single camera and to study myocardial electrophysiology in isolated Langendorff-perfused rabbit hearts. First, proper procedures for selection, filtering and analysis of the optical data recorded from the panoramic optical mapping system were established. This work was followed by extensive characterisation of the electrical activity across the epicardial surface of the preparation investigating time and heart dependent effects. In an initial study, features of epicardial electrophysiology were examined as the temperature of the heart was reduced below physiological values. This manoeuvre was chosen to mimic the temperatures experienced during various levels of hypothermia in vivo, a condition known to promote arrhythmias. The facility for panoramic optical mapping allowed the extent of changes in conduction timing and pattern of ventricular activation and repolarisation to be assessed. In the main experimental section, changes in epicardial electrical activity were assessed under various pacing conditions in both normal hearts and in a rabbit model of chronic myocardial infarction (MI). In these experiments, there were significant changes in the pattern of electrical activation corresponding with the changes in pacing regime. These experiments demonstrated a negative correlation between activation time and APD, which was not maintained during ventricular pacing. This suggests that activation pattern is not the sole determinant of action potential duration in intact hearts.
Lastly, a realistic 3D computational model of the rabbit left ventricle was developed to simulate the passive and active mechanical properties of the heart. The aim of this model was to infer further information from the experimental optical mapping studies. In future, it would be feasible to gain insight into the electrical and mechanical performance of the heart by simulating experimental pacing conditions in the model.
Abstract:
Cauliflower (Brassica oleracea var. botrytis) is a vernalization-responsive crop. High ambient temperatures delay harvest time. Elucidating the genetic regulation of floral transition is therefore of great interest for precise harvest scheduling and for ensuring stable market supply. This study aims at the genetic dissection of temperature-dependent curd induction in cauliflower by genome-wide association studies and gene expression analysis. To assess temperature-dependent curd induction, two greenhouse trials under distinct temperature regimes were conducted on a diversity panel consisting of 111 cauliflower commercial parent lines, genotyped with 14,385 SNPs. Broad phenotypic variation and high heritability (0.93) were observed for temperature-related curd induction within the cauliflower population. GWA mapping identified a total of 18 QTL localized on chromosomes O1, O2, O3, O4, O6, O8, and O9 for curding time under two distinct temperature regimes. Among those, several QTL are localized within regions of promising candidate flowering genes. Analysis of population structure and genetic relatedness among the diversity set revealed three main genetic clusters. Linkage disequilibrium (LD) patterns indicated a global LD extent of r(2) = 0.06 and a maximum physical distance of 400 kb for genetic linkage. Transcriptional profiling of the flowering genes FLOWERING LOCUS C (BoFLC) and VERNALIZATION 2 (BoVRN2) was performed, showing increased expression levels of BoVRN2 in genotypes with faster curding. However, the functional relevance of BoVRN2 and BoFLC2 could not be consistently supported, suggesting that they may act facultatively and/or pointing to BoVRN2/BoFLC-independent mechanisms in temperature-regulated floral transition in cauliflower. Genetic insights into temperature-regulated curd induction can underpin genetically informed phenology models and benefit molecular breeding strategies toward the development of thermo-tolerant cultivars.
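The reported LD extent (global r² = 0.06, linkage within ~400 kb) rests on the standard pairwise r² statistic between biallelic SNPs. A minimal illustration of that statistic, assuming phased haplotype frequencies are available (the study's own LD software and settings are not specified here):

```python
def ld_r2(p_ab, p_a, p_b):
    """Pairwise linkage disequilibrium r2 between two biallelic SNPs.
    p_ab: frequency of the haplotype carrying allele A at locus 1 and
          allele B at locus 2; p_a, p_b: the marginal allele frequencies.
        D  = p_ab - p_a * p_b
        r2 = D**2 / (p_a * (1 - p_a) * p_b * (1 - p_b))"""
    d = p_ab - p_a * p_b
    return d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))

# Complete LD: the A and B alleles always co-occur -> r2 = 1
print(ld_r2(0.5, 0.5, 0.5))
# Linkage equilibrium: haplotype frequency equals the product -> r2 = 0
print(ld_r2(0.25, 0.5, 0.5))
```

Averaging r² within physical-distance bins is the usual way an "LD decay" distance such as 400 kb is estimated.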
Abstract:
This thesis addresses the entanglements between the Namibian liberation struggle and the global Cold War, focusing on the socialist support provided to the South West Africa People's Organisation (SWAPO), the liberation movement that fought for the independence of the country from the South African regime. This thesis aims at analyzing three socialist models of solidarity with SWAPO's struggle that developed especially from the late 1970s. Combining archival sources and biographical accounts, it examines the politics of solidarity with SWAPO implemented by East Germany, Cuba, and the Italian Communist Party. The interest lies in understanding how solidarity was articulated by its internal promoters and received by its external addressees. Thus, I explore how these three actors constructed their concept of solidarity with SWAPO according to their national and ideological contexts and how this was received by the SWAPO members who experienced it in various ways. Each socialist actor promoted solidarity with SWAPO by using varying narratives, pursuing their own objectives, and employing diverse instruments, thus carrying out different and sometimes competing visions of socialism and solidarity. For its part, SWAPO was able to take advantage of such visions, as each of them could serve its different needs in diverse ways. In providing a general overview of these three solidarity policies, this thesis has the objective of highlighting the internal pluralization of the “socialist solidarity regime” while at the same time contributing to the debate on the extent of SWAPO’s commitment to socialism during the Namibian liberation struggle. It argues that, while pragmatism has always guided SWAPO during the liberation struggle and the post-independence period, and non-alignment has always been its international stance, socialism has to some extent been a model for the revolution in Namibia, to the point that it is still influencing the SWAPO party today.
Abstract:
The increasing number of extreme rainfall events, combined with the high population density and the imperviousness of the land surface, makes urban areas particularly vulnerable to pluvial flooding. In order to design and manage cities to be able to deal with this issue, the reconstruction of weather phenomena is essential. Among the most interesting data sources which show great potential are the observational networks of private sensors managed by citizens (crowdsourcing). The number of these personal weather stations is consistently increasing, and their spatial distribution roughly follows population density. Precisely for this reason, they perfectly suit this detailed study on the modelling of pluvial flooding in urban environments. The uncertainty associated with these measurements of precipitation is still a matter of research. In order to characterise the accuracy and precision of the crowdsourced data, we carried out exploratory data analyses. A comparison between Netatmo hourly precipitation amounts and observations of the same quantity from weather stations managed by national weather services is presented. The crowdsourced stations have very good skill in rain detection but tend to underestimate the reference value. In detail, the accuracy and precision of crowdsourced data change as precipitation increases, with the spread improving towards the extreme values. Then, the ability of this kind of observation to improve the prediction of pluvial flooding is tested. To this aim, the simplified raster-based inundation model incorporated in the Saferplaces web platform is used for simulating pluvial flooding. Different precipitation fields have been produced and tested as input in the model. Two different case studies are analysed over the most densely populated Norwegian city: Oslo. The crowdsourced weather station observations, bias-corrected (i.e. increased by 25%), showed very good skill in detecting flooded areas.
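The two quantitative steps reported here, a uniform bias correction and a rain-detection skill check, can be sketched schematically. The 25% multiplicative adjustment follows the abstract; the POD-style skill score and the 0.1 mm wet-hour threshold are assumed illustrations, not the study's exact verification setup:

```python
def bias_correct(rain_mm, factor=1.25):
    """Uniform multiplicative correction for the systematic underestimation
    of crowdsourced gauges (the abstract's +25% adjustment)."""
    return [r * factor for r in rain_mm]

def detection_skill(obs, ref, wet_threshold=0.1):
    """Probability of detection (POD): the fraction of reference wet hours
    (>= wet_threshold mm) that the crowdsourced series also flags as wet.
    obs, ref: paired hourly precipitation series in mm."""
    wet = [(o, r) for o, r in zip(obs, ref) if r >= wet_threshold]
    hits = sum(1 for o, _ in wet if o >= wet_threshold)
    return hits / len(wet) if wet else float('nan')
```

A fuller verification would also track false alarms (FAR) and the multiplicative bias itself, from which a correction factor like 1.25 is estimated.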
Abstract:
Disconnectivity between the Default Mode Network (DMN) nodes can cause clinical symptoms and cognitive deficits in Alzheimer's disease (AD). We aimed to examine the structural connectivity between DMN nodes, to verify the extent to which white matter disconnection affects cognitive performance. MRI data of 76 subjects (25 mild AD, 21 amnestic Mild Cognitive Impairment subjects and 30 controls) were acquired on a 3.0T scanner. ExploreDTI software (fractional anisotropy threshold = 0.25; angular threshold = 60°) was used to calculate axial, radial, and mean diffusivities, fractional anisotropy and streamline count. AD patients showed lower fractional anisotropy (P=0.01) and streamline count (P=0.029), and higher radial diffusivity (P=0.014) than controls in the cingulum. After correction for white matter atrophy, only fractional anisotropy and radial diffusivity remained significantly different in AD compared to controls (P=0.003 and P=0.05). In the parahippocampal bundle, AD patients had lower mean and radial diffusivities (P=0.048 and P=0.013) compared to controls, of which only radial diffusivity survived adjustment for white matter atrophy (P=0.05). Regression models revealed that cognitive performance is also accounted for by white matter microstructural values. Structural connectivity within the DMN is important to the execution of high-complexity tasks, probably due to its relevant role in the integration of the network.
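The scalar metrics compared across groups (axial, radial and mean diffusivity, fractional anisotropy) all derive from the three eigenvalues of the diffusion tensor at each voxel. A minimal sketch of the standard formulas (not ExploreDTI's implementation):

```python
import math

def dti_metrics(l1, l2, l3):
    """Scalar DTI metrics from the sorted diffusion tensor eigenvalues
    (l1 >= l2 >= l3): axial diffusivity (AD) is the principal eigenvalue,
    radial diffusivity (RD) averages the two minor ones, mean diffusivity
    (MD) averages all three, and fractional anisotropy (FA) measures the
    normalised deviation of the eigenvalues from their mean (0..1)."""
    md = (l1 + l2 + l3) / 3.0
    ad = l1
    rd = (l2 + l3) / 2.0
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 * l1 + l2 * l2 + l3 * l3
    fa = math.sqrt(1.5 * num / den) if den else 0.0
    return {'AD': ad, 'RD': rd, 'MD': md, 'FA': fa}
```

The pattern reported in the abstract, lower FA with higher RD, is the classic eigenvalue signature of white matter degeneration: diffusion perpendicular to the fibre (l2, l3) increases relative to diffusion along it.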
Abstract:
Dulce de leche samples available in the Brazilian market were submitted to sensory profiling by quantitative descriptive analysis and acceptance test, as well as sensory evaluation using the just-about-right scale and purchase intent. External preference mapping and the ideal sensory characteristics of dulce de leche were determined. The results were also evaluated by principal component analysis, hierarchical cluster analysis, partial least squares regression, artificial neural networks, and logistic regression. Overall, significant product acceptance was related to intermediate scores of the sensory attributes in the descriptive test, and this trend was observed even after consumer segmentation. The results obtained by sensometric techniques showed that optimizing an ideal dulce de leche from the sensory standpoint is a multidimensional process, with necessary adjustments on the appearance, aroma, taste, and texture attributes of the product for better consumer acceptance and purchase. The optimum dulce de leche was characterized by high scores for the attributes sweet taste, caramel taste, brightness, color, and caramel aroma in accordance with the preference mapping findings. In industrial terms, this means changing the parameters used in the thermal treatment and quantitative changes in the ingredients used in formulations.
Abstract:
The evolution and population dynamics of avian coronaviruses (AvCoVs) remain underexplored. In the present study, in-depth phylogenetic and Bayesian phylogeographic studies were conducted to investigate the evolutionary dynamics of AvCoVs detected in wild and synanthropic birds. A total of 500 samples, including tracheal and cloacal swabs collected from 312 wild birds belonging to 42 species, were analysed using molecular assays. A total of 65 samples (13%) from 22 bird species were positive for AvCoV. Molecular evolution analyses revealed that the sequences from samples collected in Brazil did not cluster with any of the AvCoV S1 gene sequences deposited in the GenBank database. Bayesian framework analysis estimated an AvCoV strain from Sweden (1999) as the most recent common ancestor of the AvCoVs detected in this study. Furthermore, the analysis inferred a growing AvCoV population across different wild and synanthropic bird species, suggesting that these birds may be potential new hosts responsible for spreading the virus.
Abstract:
Mapping of elements in biological tissue by laser-induced mass spectrometry is a fast growing analytical methodology in the life sciences. This method provides a multitude of useful information on metal, nonmetal, metalloid and isotopic distributions at major, minor and trace concentration ranges, usually with a lateral resolution of 12-160 µm. Selected applications in medical research require an improved lateral resolution of the laser-induced mass spectrometric technique at the low micrometre scale and below. The present work demonstrates the applicability of a recently developed analytical methodology - laser microdissection combined with inductively coupled plasma mass spectrometry (LMD ICP-MS) - to obtain elemental images of different solid biological samples at high lateral resolution. LMD ICP-MS images of uranium-stained and native mouse brain tissue samples are shown, and a direct comparison of the LMD and laser ablation (LA) ICP-MS imaging methodologies, in terms of elemental quantification, is performed.
Abstract:
This paper describes a new food classification which assigns foodstuffs according to the extent and purpose of the industrial processing applied to them. Three main groups are defined: unprocessed or minimally processed foods (group 1), processed culinary and food industry ingredients (group 2), and ultra-processed food products (group 3). The use of this classification is illustrated by applying it to data collected in the Brazilian Household Budget Survey, which was conducted in 2002/2003 through a probabilistic sample of 48,470 Brazilian households. The average daily food availability was 1,792 kcal/person, of which 42.5% came from group 1 (mostly rice, beans, meat, and milk), 37.5% from group 2 (mostly vegetable oils, sugar, and flours), and 20% from group 3 (mostly breads, biscuits, sweets, soft drinks, and sausages). The share of group 3 foods increased with income, and represented almost one third of all calories in higher income households. The impact of the replacement of group 1 foods and group 2 ingredients by group 3 products on the overall quality of the diet, eating patterns and health is discussed.
Abstract:
QTL mapping provides useful information for breeding programs, since it allows the estimation of genomic locations and genetic effects of chromosomal regions related to the expression of quantitative traits. The objective of this study was to map QTL related to several agronomically important traits associated with grain yield: ear weight (EW), prolificacy (PROL), ear number (NE), ear length (EL) and diameter (ED), number of rows on the ear (NRE) and number of kernels per row on the ear (NKPR). Four hundred F-2:3 tropical maize progenies were evaluated in five environments in Piracicaba, Sao Paulo, Brazil. The genetic map was previously estimated and had 117 microsatellite loci with an average distance of 14 cM. Data were analysed using Composite Interval Mapping for each trait. Thirty-six QTL were mapped and related to the expression of EW (2), PROL (3), NE (2), EL (5), ED (5), NRE (10), NKPR (5). Few QTL were mapped since there was high GxE interaction. Traits EW, PROL and NE showed high genetic correlation with grain yield, and several QTL mapped to similar genomic regions, which could cause the observed correlation. However, further analysis using appropriate statistical models is required to separate linked versus pleiotropic QTL. Five QTL (named Ew1, Ne1, Ed3, Nre3 and Nre10) had high genetic effects, explaining from 10.8% (Nre3) to 16.9% (Nre10) of the phenotypic variance, and could be considered in further studies.
Abstract:
The identification of alternatively spliced transcripts has contributed to a better comprehension of developmental mechanisms, tissue-specific physiological processes and human diseases. Polymerase chain reaction amplification of alternatively spliced variants commonly leads to the formation of heteroduplexes as a result of base pairing involving exons common between the two variants. S1 nuclease cleaves single-stranded loops of heteroduplexes and also nicks the opposite DNA strand. In order to establish a strategy for mapping alternative splice-prone sites in the whole transcriptome, we developed a method combining the formation of heteroduplexes between 2 distinct splicing variants and S1 nuclease digestion. For 20 consensuses identified here using this methodology, 5 revealed a conserved splice site after inspection of the cDNA alignment against the human genome (exact splice sites). For 8 other consensuses, conserved splice sites were mapped at 2 to 30 bp from the border, called proximal splice sites; for the other 7 consensuses, conserved splice sites were mapped at 40 to 800 bp, called distal splice sites. These latter cases showed nonspecific activity of S1 nuclease in digesting double-stranded DNA. From the 20 consensuses identified here, 5 were selected for reverse transcription-polymerase chain reaction validation, confirming the splice sites. These data showed the potential of the strategy in mapping splice sites. However, the lack of specificity of the S1 nuclease enzyme is a significant obstacle that impedes the use of this strategy in large-scale studies.