965 results for concept analysis
Territorial Cohesion through Spatial Policies: An Analysis with Cultural Theory and Clumsy Solutions
Abstract:
The European Territorial Cohesion Policy has been the subject of numerous debates in recent years. Most contributions focus on understanding the term itself and figuring out what lies behind it, or on arguing for or against a stronger formal competence of the European Union in this field. This article leaves these aspects aside and attends to the (undefined and legally non-binding) conceptual elements of territorial cohesion, focusing on the challenge of linking it with spatial policies and organising the relations between them. To this end, the theoretical approach of Cultural Theory and its concept of the clumsy solution are applied to overcome the dilemma of typical dichotomies by adding a third and a fourth (but not a fifth) perspective. In doing so, normative contradictions between different rational approaches can be revealed, explained and addressed with the concept of 'clumsy solutions'. This contribution discusses how this theoretical approach helps us explain and frame a coalition between the Territorial Cohesion Policy and spatial policies. The approach contributes to finding the best way of linking and organising policies, even though the solution may appear clumsy from the standpoint of the different rationalities involved.
Abstract:
Aims: To evaluate the implications of an Absorb bioresorbable vascular scaffold (Absorb BVS) for the morphology of superficial plaques. Methods and results: Forty-six patients who underwent Absorb BVS implantation and 20 patients implanted with bare metal stents (BMS) who had serial optical coherence tomographic examinations at baseline and follow-up were included in this analysis. Thin-capped fibroatheromas (TCFA) were identified in the device implantation regions and in the adjacent native coronary segments. Within all regions, the circumferential locations of TCFA and calcific tissues were identified, and the neointimal thickness was measured at follow-up. At 6-12-month follow-up, only 8% of the TCFA detected at baseline were still present in the Absorb BVS implantation segment, versus 27% in the BMS segment (p=0.231). Sixty percent of the TCFA in native segments did not change their phenotype at follow-up. At short-term follow-up, a significant reduction in the lumen area of the BMS was noted, greater than that in the Absorb BVS group (-2.11±1.97 mm² vs. -1.34±0.99 mm², p=0.026). In the Absorb BVS, neointimal tissue continued to develop at midterm follow-up (2.17±0.48 mm² vs. 1.38±0.52 mm², p<0.0001) and covered the underlying tissues without compromising the luminal dimensions (5.93±1.49 mm² vs. 6.14±1.49 mm², p=0.571), as it was accommodated by the expanded scaffold (8.28±1.74 mm² vs. 7.67±1.28 mm², p<0.0001). Conclusions: Neointimal tissue develops following either Absorb BVS or BMS implantation and shields lipid tissues. The neointimal response in the BMS causes a greater reduction of luminal dimensions than in the Absorb BVS. Thus, the Absorb BVS may have value in the invasive re-capping of high-risk plaques.
Abstract:
In practical forensic casework, backspatter recovered from a shooter's hands can be an indicator of a self-inflicted gunshot wound to the head. In such cases, backspatter retrieved from inside the barrel indicates that the weapon found at the death scene was involved in causing the injury to the head. However, systematic research on the factors conditioning the presence, amount and specific patterns of backspatter has so far been lacking. Herein, a new concept for backspatter investigation is presented, comprising staining technique, weapon and target medium: the 'triple contrast method' was developed, tested and is introduced for experimental backspatter analysis. First, mixtures of various proportions of acrylic paint for optical detection, barium sulphate for radiocontrast imaging in computed tomography and fresh human blood for PCR-based DNA profiling were generated (triple mixture) and tested for DNA quantification and short tandem repeat (STR) typing success. All tested mixtures yielded sufficient DNA and produced full STR profiles suitable for forensic identification. Then, for backspatter analysis, sealed foil bags containing the triple mixture were attached to plastic bottles filled with 10 % ballistic gelatine and covered by a 2-3 mm layer of silicone. To simulate backspatter, close contact shots were fired at these models. Endoscopy of the inside of the barrel revealed coloured backspatter containing typable DNA, and radiographic imaging showed a contrasted bullet path in the gelatine. Cross sections of the gelatine core exhibited cracks and fissures stained by the acrylic paint, facilitating wound ballistic analysis.
Abstract:
The brain is a complex neural network with a hierarchical organization, and the mapping of its elements and connections is an important step towards understanding its function. Recent developments in diffusion-weighted imaging have provided the opportunity to reconstruct the whole-brain structural network in vivo at large scale and to study the brain's structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in my thesis aimed at improving the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while the second paper evaluated the validity of further metrics based on the concept of communicability. Communicability is a broader measure of connectivity that also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected for local network metrics or when different thresholding strategies were used. In addition, communicability metrics were found to add insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to investigate the relationship between functional and structural connectivity and the etiology of schizophrenia. The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) with diffusion-weighted imaging data. This multimodal approach revealed a positive relationship between individual fluctuations of the EEG alpha frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study, diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussing the current limitations in the analysis of brain structural networks. To sum up, the first two studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics. In the other two studies, alternative approaches were presented. The joint discussion of the four studies highlights the benefits and possibilities of connectome analysis as well as some current limitations.
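As a hedged illustration of the communicability idea mentioned above: for a binary adjacency matrix A, the communicability between nodes p and q sums walks of every length k weighted by 1/k!, which equals the matrix-exponential entry (e^A)_pq (Estrada and Hatano). The toy network below is an invented example, not data from the thesis.

```python
# Minimal sketch: communicability of a binary network via the matrix
# exponential. The 4-node graph is illustrative only.
import numpy as np
from scipy.linalg import expm

def communicability(A):
    """Return the communicability matrix G = e^A for adjacency matrix A."""
    return expm(A)

# Toy network: nodes 0-1-2 form a chain, node 3 attaches to node 2.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

G = communicability(A)
# G[0, 3] > 0 even though nodes 0 and 3 share no edge: indirect and
# parallel walks contribute, which is what makes communicability a
# broader measure than plain connectivity.
print(G.round(3))
```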
Abstract:
Software developers are often unsure of the exact name of the method they need to invoke the desired behavior in a given context. This results in a search for the correct method name in documentation, which can be lengthy and distracting. We can decrease the method search time by enhancing the documentation of a class with its most frequently used methods. Usage frequency data for methods is gathered by analyzing other projects from the same ecosystem, i.e., projects written in the same language and sharing dependencies. We implemented a proof of concept of the approach for Pharo Smalltalk and Java. In Pharo Smalltalk, methods are commonly searched for using a code browser tool called "Nautilus"; in Java, using a web browser displaying HTML-based documentation, Javadoc. We developed plugins for both browsers and gathered method usage data from open source projects in order to increase developer productivity by reducing method search time. A small initial evaluation was conducted, showing promising results in improving developer productivity.
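A minimal sketch of the frequency-gathering step, assuming a local corpus of same-ecosystem source files is available. The regex-based call-site detection is a stand-in for the parser-based analysis a real plugin would use, and all names and paths here are illustrative.

```python
# Hedged sketch: count how often each method name appears at a call site
# across a corpus of Java files, then rank them for documentation display.
import re
from collections import Counter
from pathlib import Path

# Crude Java-style call-site pattern; a real tool would walk the AST.
CALL_PATTERN = re.compile(r"\.\s*([A-Za-z_]\w*)\s*\(")

def method_usage_counts(corpus_dir: str) -> Counter:
    """Count occurrences of each method name at call sites in the corpus."""
    counts: Counter = Counter()
    for path in Path(corpus_dir).rglob("*.java"):
        counts.update(CALL_PATTERN.findall(path.read_text(errors="ignore")))
    return counts

# The documentation browser would then surface the top-ranked methods first:
# method_usage_counts("ecosystem_projects/").most_common(10)
```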
Abstract:
This paper evaluates the impact of alternative city boundary definitions on measured economic performance. First we discuss the theoretical background and motivate the empirical work. Then we present the methodological concept of the sensitivity analysis, which is applied to a variety of data on Zurich and Bern (the financial and administrative centres of Switzerland) in order to see how the values of different indicators vary depending on the boundary definition adopted. Finally, we show whether the empirical patterns found are statistically significant. The analysis shows that the delimitation of a city or city region indeed matters.
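To make the sensitivity idea concrete, the hedged sketch below recomputes a single indicator under two alternative delimitations; the municipalities and figures are invented placeholders, not the Zurich or Bern data.

```python
# Illustrative sketch: the same indicator recomputed under alternative
# city delimitations. The spread of values shows how much the boundary
# definition matters. All numbers are invented.
municipalities = {
    # name: (population, jobs, in_core, in_agglomeration)
    "Core City":  (400_000, 350_000, True,  True),
    "Inner Ring": (250_000, 120_000, False, True),
    "Outer Ring": (150_000,  40_000, False, True),
}

def jobs_per_capita(selector):
    """Compute a toy indicator over the municipalities a boundary selects."""
    pop = sum(p for p, j, core, agg in municipalities.values() if selector(core, agg))
    job = sum(j for p, j, core, agg in municipalities.values() if selector(core, agg))
    return job / pop

for label, selector in [
    ("administrative city", lambda core, agg: core),
    ("agglomeration",       lambda core, agg: agg),
]:
    print(f"{label}: {jobs_per_capita(selector):.2f} jobs per capita")
```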
Abstract:
The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic environments. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across environments. The MEME platform is built around the concept of Mixture of Gaussians (MOG) models, in which statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to reduce the burden often imposed on users: only a single image that includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode 'skeletons' for straightforward motility quantification. We test our algorithm on various locomotive environments and compare its performance with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach in the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans segmentation and 'skeletonizing' across a wide range of motility assays.
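A simplified sketch of the MOG idea, assuming a single grayscale frame: fit a two-component Gaussian mixture to pixel intensities and treat the darker component as the nematode. The actual MEME framework learns richer background and appearance models from one user-provided example image; this stand-in only conveys the mechanism.

```python
# Hedged sketch: pixel-wise segmentation with a Mixture of Gaussians.
import numpy as np
from sklearn.mixture import GaussianMixture

def segment_frame(frame: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the (assumed darker) nematode component."""
    pixels = frame.reshape(-1, 1).astype(float)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
    labels = gmm.predict(pixels).reshape(frame.shape)
    worm_component = int(np.argmin(gmm.means_.ravel()))  # worm darker than background
    return labels == worm_component

# Skeleton extraction for motility quantification could then follow,
# e.g. with skimage.morphology.skeletonize(mask).
```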
Abstract:
The present dissertation focuses on trust and comprises three empirical essays on the concept itself and its foundations. All three essays investigate trust as an expectation and rely on self-report measures of trust. Whereas the first chapters investigate social trust, the final chapter investigates political trust. Essentially, there are three related debates to which the following chapters contribute. A first debate concerns problems with current self-report measures. Scholars have recently started to question whether standard trust questions really measure the same thing across countries and languages. Chapter 1 engages in this debate. Using data from Switzerland, it studies whether different trust questions measure the same latent trust constructs across individuals belonging to three different cultural-linguistic regions. The second debate concerns the so-called forms or dimensions of trust. Recently, scholars started investigating whether trust is a one-dimensional construct, i.e. whether or not an individual's trust judgment differs between categories of trustees such as strangers, neighbors, family members and friends. Relying on confirmatory factor analysis, Chapter 2 investigates whether individuals really do differentiate between trustee categories and to what extent these judgments can be summarized into higher-order latent trust constructs. The third debate is concerned with the causes of differences in trust across humans. Chapter 3 focuses on the role of later-life experiences, more precisely victimization experiences, and investigates their causal relationship with generalized social trust. Chapter 4 focuses on the impact of direct democratic institutions on the trust relationship between citizens and political authorities.
Abstract:
The paper revives a theoretical definition of party coherence as composed of two basic elements, cohesion and factionalism, to propose and apply a novel empirical measure based on spin physics. The simultaneous analysis of both components using a single measurement concept is applied to data representing the political beliefs of candidates in the Swiss general elections of 2003 and 2007, proposing a connection between the coherence of the beliefs party members hold and the assessment of parties as being at risk of splitting. We also compare our measure with established polarization measures and demonstrate its advantage with respect to multi-dimensional data that lack clear structure. Furthermore, we outline how our analysis supports the distinction between bottom-up and top-down mechanisms of party splitting. In this way, we are able to turn the intuition of coherence into a well-defined quantitative concept that, additionally, offers a methodological basis for comparative research on party coherence. Our work serves as an example of how a complex systems approach offers a new perspective on a long-standing issue in political science.
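As a hedged illustration of the spin-physics analogy (a toy analogue, not the paper's actual measurement concept): encode each candidate's issue positions as +1/-1 "spins" and take the mean pairwise overlap as a coherence score, in the spirit of alignment measures in spin models.

```python
# Toy sketch: coherence as mean pairwise overlap of +1/-1 position vectors.
import numpy as np

def coherence(spins: np.ndarray) -> float:
    """Mean pairwise overlap of candidates' +1/-1 position vectors.

    spins has shape (n_candidates, n_issues); the score is 1 when all
    candidates agree on every issue and near 0 for uncorrelated positions.
    """
    n, _ = spins.shape
    overlaps = (spins @ spins.T) / spins.shape[1]   # pairwise agreement
    off_diag = overlaps[~np.eye(n, dtype=bool)]
    return float(off_diag.mean())

# Invented example: three candidates, four issues.
party = np.array([[ 1,  1, -1,  1],
                  [ 1,  1, -1, -1],
                  [ 1, -1, -1,  1]])
print(coherence(party))  # higher values suggest a more coherent party
```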
Abstract:
Metallocene dichlorides constitute a remarkable class of antineoplastic agents that are highly effective against several cancer cell lines. They have been shown to accumulate in the DNA-rich region, which suggests DNA as the primary target. These compounds bear two cyclopentadienyl ligands and two labile halide ligands, resulting in a bent sandwich structure. The cis-dihalide motif is structurally related to the cis-chloro configuration of cisplatin, and similar modes of action can thus be assumed. Cisplatin binds to two neighboring guanine nucleobases in DNA and consequently distorts the double helix, thereby inhibiting DNA replication and transcription. Platinum is classified as a soft Lewis acid and binds preferentially to the nitrogen atoms within the nucleobases. The metallocene dichlorides investigated in this study comprise the metal centers Ti, V, Nb, Mo, Hf, and W, which are classified as hard or intermediate Lewis acids and thus favor binding to the phosphate oxygen. Although several studies have reported adduct formation of metallocene dichlorides with nucleic acids, substantial information about the adduct composition, the binding pattern, and the nucleobase selectivity has not yet been provided. ESI-MS analyses gave evidence for the formation of metallocene adducts (M = Ti, V, Mo, and W) with single-stranded DNA homologues at pH 7. No adducts were formed with Nb and Hf at neutral pH, albeit adducts with Nb were observed at low pH. MS2 data revealed considerable differences in the adduct compositions. The product ion spectra of DNA adducts with hard Lewis acids (Ti, V) gave evidence for the loss of metallocene ligands, and only moderate backbone fragmentation was observed. By contrast, adducts with intermediate Lewis acids (Mo, W) retained the hydroxy ligands. Preliminary results are in good agreement with the Pearson concept and DFT calculations. Since the metallodrugs were not lost upon CID, the nucleobase selectivity, stoichiometry, and binding patterns can be elucidated by means of tandem mass spectrometry.
Abstract:
The ecosystem services (ES) concept is becoming a cornerstone of contemporary sustainability thought. Challenges with this concept and its applications are well documented, but have not yet been systematically assessed alongside strengths and the external factors that influence uptake. Such an assessment could form the basis for improving ES thinking and further embedding it in environmental decisions and management. The Young Ecosystem Services Specialists (YESS) completed a Strengths-Weaknesses-Opportunities-Threats (SWOT) analysis of ES through YESS member surveys. Strengths include the approach being interdisciplinary and a useful communication tool. Weaknesses include an incomplete scientific basis, frameworks being inconsistently applied, and difficulty in accounting for nature's intrinsic value. Opportunities include alignment with existing policies and established methodologies, and increasing environmental awareness. Threats include resistance to change and difficulty with interdisciplinary collaboration. Consideration of the SWOT themes suggested five strategic areas for developing and implementing ES. The ES concept could improve decision-making related to natural resource use and the interpretation of the complexities of human-nature interactions. It is contradictory: valued as a simple means of communicating the importance of conservation, whilst also considered an oversimplification characterised by ambiguous language. Nonetheless, given sufficient funding and political will, the ES framework could facilitate interdisciplinary research, ensuring decision-making that supports sustainable development.
Abstract:
BACKGROUND This first-in-human proof-of-concept study aimed to test whether the safety and preclinical results obtained by intratumoral administration of BQ788, an endothelin receptor B (EDNRB) antagonist, could be reproduced in human melanoma patients. METHODS Three patients received a single intralesional BQ788 application of 3 mg. After 3-7 days, the lesions were measured and removed for analysis. The administered dose was increased to a cumulative dosage of 8 mg in patient 4 (4 × 2.0 mg, days 0-3; lesion removed on day 4) and to 10 mg in patient 5 (3 × 3.3 mg, days 0, 3, and 10; lesion removed after 14 days). Control lesions were simultaneously treated with phosphate-buffered saline (PBS). All samples were processed and analyzed without knowledge of the clinical findings. RESULTS No statistical evaluation was possible because of the small number of patients (n = 5) and the variability in the mode of administration. No adverse events were observed, regardless of the administered dose. All observations were in accordance with results obtained in preclinical studies. Accordingly, no difference in the degree of tumor necrosis was detected between BQ788- and PBS-treated samples. In addition, both EDNRB and Ki67 showed decreased expression in patients 2 and 5 and, to a lesser extent, in patient 1. Similarly, decreased expression of EDNRB mRNA was found in patients 2 and 5, and of BCL2A1 and/or PARP3 in patients 2, 3, and 5. Importantly, semiquantitatively scored immunohistochemistry for CD31 and CD3 revealed more blood vessels and lymphocytes, respectively, in the BQ788-treated tumors of patients 2 and 4. Also, in all patients we observed an inverse correlation between the expression levels of EDNRB and HIF1A. Finally, in patient 5 (the only patient treated for longer than 1 week), we observed inhibition of lesion growth, as shown by size measurement. CONCLUSION The intralesional applications of BQ788 were well tolerated and showed signs of directly and indirectly reducing the viability of melanoma cells.
Abstract:
Do siblings of centenarians tend to have longer life spans? To answer this question, the life spans of 184 siblings of 42 centenarians were evaluated. Two important questions were addressed in analyzing the sibling data. First, a standard needs to be established to which the life spans of the 184 siblings can be compared. In this report, an external reference population is constructed from the U.S. life tables. Its estimated mortality rates are treated as baseline hazards from which the relative mortality of the siblings is estimated. Second, standard survival models, which assume independent observations, are invalid when correlation within families exists, underestimating the true variance. Three approaches that allow for such correlation are illustrated. First, the cumulative relative excess mortality between the siblings and their comparison group is calculated and used as an effective graphic tool, along with the product-limit estimator of the survival function. The variance estimator of the cumulative relative excess mortality is adjusted for the potential within-family correlation using a Taylor linearization approach. Second, approaches that adjust for the inflated variance are examined: an adjusted one-sample log-rank test using the design effect originally proposed by Rao and Scott in the correlated binomial or Poisson setting, and the robust variance estimator derived from the log-likelihood function of a multiplicative model. Neither of these two approaches provides an estimate of the correlation within families, but the comparison with the standard remains valid under dependence. Last, using the frailty model concept, the multiplicative model, in which the baseline hazards are known, is extended by adding a random frailty term based on the positive stable or the gamma distribution. Comparisons between the two frailty distributions are performed by simulation. Based on the results from the various approaches, it is concluded that the siblings of centenarians had significantly lower mortality rates than their cohorts. The frailty models also indicate significant correlation between the life spans of siblings.
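A minimal sketch of the baseline comparison described above, assuming external life-table hazards are given: expected deaths are accumulated over each sibling's follow-up and compared with observed deaths. The hazards and follow-up records below are invented for illustration, and the within-family variance adjustments discussed in the text are omitted.

```python
# Hedged sketch: observed-to-expected relative mortality against
# life-table baseline hazards. All numbers are invented placeholders.
import numpy as np

# Toy baseline hazard rising with age (stand-in for U.S. life tables).
baseline_hazard = {age: 0.002 * 1.09 ** (age - 50) for age in range(50, 111)}

def expected_deaths(entry_age: int, exit_age: int) -> float:
    """Expected deaths for one sibling observed from entry_age to exit_age."""
    return sum(baseline_hazard[a] for a in range(entry_age, exit_age))

# (entry_age, exit_age, died) per sibling -- toy records
siblings = [(50, 95, 1), (50, 88, 1), (50, 102, 0), (50, 91, 1)]

observed = sum(d for _, _, d in siblings)
expected = sum(expected_deaths(e, x) for e, x, _ in siblings)
print(f"relative mortality (O/E): {observed / expected:.2f}")  # < 1 means lower mortality
```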
Abstract:
Dielectrophoresis (DEP) has been used to manipulate cells in low-conductivity suspending media using AC electrical fields generated on micro-fabricated electrode arrays. This has created the possibility of performing automatically, on a micro-scale, cell processing more sophisticated than what currently requires substantial laboratory equipment, reagent volumes, time, and human intervention. In this research, the manipulation of aqueous droplets in an immiscible, low-permittivity suspending medium is described to complement previous work on dielectrophoretic cell manipulation. Such droplets can be used as carriers not only for air- and water-borne samples, contaminants, chemical reagents, viral and gene products, and cells, but also for the reagents to process and characterize these samples. A long-term goal of this area of research is to perform chemical and biological assays on automated, micro-scaled devices at or near the point of care, which will increase the availability of modern medicine to people who do not have ready access to large medical institutions and decrease the cost and delays associated with that lack of access. In this research I present proofs of concept for droplet manipulation and droplet-based biochemical analysis using dielectrophoresis as the motive force. Proofs of concept developed for the first time in this research include: (1) showing droplet movement on a two-dimensional array of electrodes, (2) achieving controlled dielectric droplet injection, (3) fusing and reacting droplets, and (4) demonstrating a protein fluorescence assay using micro-droplets.
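For orientation, the time-averaged DEP force on a spherical droplet of radius r in a medium of permittivity eps_m follows the standard expression F = 2*pi*r^3*eps_m*Re[K(omega)]*grad|E|^2, where K is the Clausius-Mossotti factor. The sketch below evaluates this textbook formula for rough, assumed values of a water droplet in oil; none of the numbers come from the dissertation.

```python
# Back-of-envelope sketch of the DEP force on an aqueous droplet in a
# low-permittivity medium. All numerical values are rough assumptions.
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def clausius_mossotti(eps_p, sig_p, eps_m, sig_m, omega):
    """Complex Clausius-Mossotti factor for a sphere in a medium."""
    ep = eps_p * EPS0 - 1j * sig_p / omega  # droplet complex permittivity
    em = eps_m * EPS0 - 1j * sig_m / omega  # medium complex permittivity
    return (ep - em) / (ep + 2 * em)

omega = 2 * np.pi * 1e5            # 100 kHz drive (assumed)
K = clausius_mossotti(eps_p=80, sig_p=1e-3,   # aqueous droplet
                      eps_m=2.5, sig_m=1e-9,  # low-permittivity oil
                      omega=omega)
r = 50e-6                          # 50 um droplet radius (assumed)
grad_E2 = 1e13                     # gradient of |E|^2, V^2/m^3 (assumed)
F = 2 * np.pi * r**3 * 2.5 * EPS0 * K.real * grad_E2
print(f"Re[K] = {K.real:.2f}, F_DEP ~ {F:.2e} N")
# Re[K] > 0 here: the water droplet is pulled toward the high-field region.
```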
Abstract:
This paper uses Data Envelopment Analysis (DEA) to measure the labor use efficiency of individual branches of a large public sector bank with several thousand branches across India. We find considerable variation in the average levels of efficiency across the four metropolitan regions considered in this study. In this context, we introduce the concept of area or spatial efficiency for each region relative to the nation as a whole. Our findings suggest that the policies, procedures, and incentives handed down from the corporate level cannot fully neutralize the influence of the local work culture in the different regions. Most of the potential reduction in labor cost appears to come from possible downsizing of the clerical and subordinate staff. Our analysis identifies branches that operate at very low levels of efficiency and may be gainfully merged with other branches wherever possible.
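A hedged sketch of an input-oriented, constant-returns DEA model (the standard CCR formulation, not necessarily the exact specification used in the paper): for each branch, a linear program finds the smallest factor theta by which its inputs could be scaled down while a nonnegative combination of peer branches still matches its outputs. The branch data below are invented.

```python
# Sketch of input-oriented CCR DEA efficiency, solved as a linear program.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0. X: (m, n) inputs, Y: (s, n) outputs."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lam_1, ..., lam_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, j0]
    A_ub[:m, 1:] = X                            # sum lam*x <= theta * x_j0
    A_ub[m:, 1:] = -Y
    b_ub[m:] = -Y[:, j0]                        # sum lam*y >= y_j0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Invented example: 3 branches, inputs = (clerical, subordinate staff),
# output = transactions handled.
X = np.array([[10, 14, 8], [6, 9, 5]], dtype=float)
Y = np.array([[100, 110, 95]], dtype=float)
for j in range(3):
    print(f"branch {j}: efficiency = {dea_efficiency(X, Y, j):.2f}")
```

An efficiency of 1 marks a branch on the best-practice frontier; values below 1 indicate the proportional labor reduction that a comparable mix of peer branches suggests is feasible.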