938 results for source analysis


Relevance: 30.00%

Publisher:

Abstract:

Morphological descriptors are practical and essential biomarkers for diagnosis and treatment selection in intracranial aneurysm management according to the current guidelines in use. Nevertheless, relatively little work has been dedicated to improving the three-dimensional quantification of aneurysmal morphology, automating the analysis, and hence reducing the inherent intra- and inter-observer variability of manual analysis. In this paper we propose a methodology for the automated isolation and morphological quantification of saccular intracranial aneurysms based on a 3D representation of the vascular anatomy.
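The abstract does not list its descriptor set, but a minimal sketch of two classic saccular-aneurysm morphology descriptors (aspect ratio and size ratio) illustrates the kind of quantification involved; the function name and the example measurements below are hypothetical, not taken from the paper:

```python
def aneurysm_descriptors(dome_height_mm, neck_width_mm,
                         max_diameter_mm, parent_vessel_diameter_mm):
    """Two classic saccular-aneurysm morphology descriptors.

    Aspect ratio and size ratio are standard in the literature; the
    descriptor set actually used by the paper is not specified here.
    """
    aspect_ratio = dome_height_mm / neck_width_mm  # dome height over neck width
    size_ratio = max_diameter_mm / parent_vessel_diameter_mm
    return aspect_ratio, size_ratio

# Hypothetical measurements (mm):
ar, sr = aneurysm_descriptors(7.0, 3.5, 8.0, 4.0)
print(ar, sr)  # 2.0 2.0
```

Automating such measurements on a 3D vascular model removes the observer variability that manual caliper-style measurement introduces.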

Relevance: 30.00%

Publisher:

Abstract:

Deciding whether two fingerprint marks originate from the same source requires examination and comparison of their features. Many cognitive factors play a major role in such information processing. In this paper we examined the consistency (both between- and within-experts) in the analysis of latent marks, and whether the presence of a 'target' comparison print affects this analysis. Our findings showed that the context of a comparison print affected analysis of the latent mark, possibly influencing allocation of attention, visual search, and threshold for determining a 'signal'. We also found that even without the context of the comparison print there was still a lack of consistency in analysing latent marks. Not only was this reflected by inconsistency between different experts, but the same experts at different times were inconsistent with their own analysis. However, the characterization of these inconsistencies depends on the standard and definition of what constitutes inconsistent. Furthermore, these effects were not uniform; the lack of consistency varied across fingerprints and experts. We propose solutions to mediate variability in the analysis of friction ridge skin.

Relevance: 30.00%

Publisher:

Abstract:

User generated content shared in online communities is often described using collaborative tagging systems where users assign labels to content resources. As a result, a folksonomy emerges that relates a number of tags with the resources they label and the users that have used them. In this paper we analyze the folksonomy of Freesound, an online audio clip sharing site which contains more than two million users and 150,000 user-contributed sound samples covering a wide variety of sounds. By following methodologies taken from similar studies, we compute some metrics that characterize the folksonomy both at the global level and at the tag level. In this manner, we are able to better understand the behavior of the folksonomy as a whole, and also obtain some indicators that can be used as metadata for describing tags themselves. We expect that such a methodology for characterizing folksonomies can be useful to support processes such as tag recommendation or automatic annotation of online resources.
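As a rough illustration of tag-level folksonomy metrics of the kind such studies compute, here is a sketch over (user, resource, tag) triples; the data and the two metrics chosen (tag frequency and per-resource tag co-occurrence) are invented for the example, not taken from Freesound:

```python
from collections import Counter, defaultdict
from itertools import combinations

# Toy folksonomy: (user, resource, tag) assignments (hypothetical data).
assignments = [
    ("u1", "clip1", "drum"), ("u1", "clip1", "loop"),
    ("u2", "clip1", "drum"), ("u2", "clip2", "rain"),
    ("u3", "clip2", "rain"), ("u3", "clip2", "field-recording"),
]

# Global metric: how often each tag is used across the folksonomy.
tag_freq = Counter(tag for _, _, tag in assignments)

# Tag-level metric: how often two tags label the same resource.
tags_per_resource = defaultdict(set)
for _, res, tag in assignments:
    tags_per_resource[res].add(tag)
cooc = Counter()
for tags in tags_per_resource.values():
    for a, b in combinations(sorted(tags), 2):
        cooc[(a, b)] += 1

print(tag_freq["drum"])        # 2
print(cooc[("drum", "loop")])  # 1
```

Co-occurrence counts like these are one plausible source for the "metadata describing tags themselves" that the abstract mentions, e.g. as features for tag recommendation.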

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND AND STUDY AIMS: Appropriate use of colonoscopy is a key component of quality management in gastrointestinal endoscopy. In an update of a 1998 publication, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II) defined appropriateness criteria for various colonoscopy indications. This introductory paper therefore deals with methodology, general appropriateness, and a review of colonoscopy complications. METHODS: The RAND/UCLA Appropriateness Method was used to evaluate the appropriateness of various diagnostic colonoscopy indications, with 14 multidisciplinary experts using a scale from 1 (extremely inappropriate) to 9 (extremely appropriate). Evidence reported in a comprehensive updated literature review was used for these decisions. Consolidation of the ratings into three appropriateness categories (appropriate, uncertain, inappropriate) was based on the median and the heterogeneity of the votes. The experts then met to discuss areas of disagreement in the light of existing evidence, followed by a second rating round, with a subsequent third voting round on necessity criteria, using much more stringent criteria (i.e. colonoscopy is deemed mandatory). RESULTS: Overall, 463 indications were rated, with 55%, 16% and 29% of them being judged appropriate, uncertain and inappropriate, respectively. Perforation and hemorrhage rates, as reported in 39 studies, were in general < 0.1% and < 0.3%, respectively. CONCLUSIONS: The updated EPAGE II criteria constitute an aid to clinical decision-making but should in no way replace individual judgment. Detailed panel results are freely available on the internet (www.epage.ch) and will thus constitute a reference source of information for clinicians.
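A minimal sketch of a median-plus-heterogeneity consolidation rule of the RAND/UCLA kind described in the methods; the exact disagreement definition used by the EPAGE II panel may differ, and the threshold below is an assumption for illustration:

```python
import statistics

def consolidate(votes, disagreement_threshold=5):
    """Collapse panelists' 1-9 votes into one appropriateness category.

    Simplified RAND/UCLA-style rule (the panel's actual disagreement
    definition may differ): median 7-9 -> appropriate, median 1-3 ->
    inappropriate, otherwise uncertain; heterogeneous votes (enough
    panelists in each extreme tertile) force "uncertain".
    """
    med = statistics.median(votes)
    low = sum(v <= 3 for v in votes)   # votes in the 1-3 tertile
    high = sum(v >= 7 for v in votes)  # votes in the 7-9 tertile
    if low >= disagreement_threshold and high >= disagreement_threshold:
        return "uncertain"  # panel disagreement overrides the median
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

print(consolidate([8] * 10 + [7] * 4))  # appropriate
print(consolidate([1] * 7 + [9] * 7))   # uncertain (split panel)
```

The second example shows why heterogeneity matters: a polarized panel should not yield a confident category even if the median happens to land in one.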

Relevance: 30.00%

Publisher:

Abstract:

The growing multilingual trend in movie production comes with a challenge for dubbing translators, since they are increasingly confronted with more than one source language. The main purpose of this master's thesis is to provide a case study on how these third languages (see CORRIUS and ZABALBEASCOA 2011) are rendered. Another aim is to put a particular focus on their textual and narrative functions and detect possible shifts that might occur in translations. By applying a theoretical model for translation analysis (CORRIUS and ZABALBEASCOA 2011), this study describes how third languages are rendered in the German, Spanish, and Italian dubbed versions of the 2009 Tarantino movie Inglourious Basterds. A broad range of solution-types are thereby revealed and prevalent restrictions of the translation process identified. The target texts are brought into context with some sociohistorical aspects of dubbing in order to detect prevalent norms of the respective cultures and to discuss the acceptability of translations (TOURY 1995). The translatability potential of even highly complex multilingual audiovisual texts is demonstrated in this study. Moreover, proposals for further studies in multilingual audiovisual translation are outlined and the potential for future investigations in this field thereby emphasised.

Relevance: 30.00%

Publisher:

Abstract:

General Introduction This thesis can be divided into two main parts: the first one, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements (whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA), it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access of PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access in a statistically significant and quantitatively large proportion. Part I In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Pr. Olivier Cadot, Celine Carrère and Pr. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it to the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows.
First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore, changing all instruments into an MFC would bring improved transparency, pretty much like the "tariffication" of NTBs. The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs.
In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the potential model for future EU-centered PTAs. First, I have studied and coded at the six-digit level of the Harmonised System (HS) both the old RoOs, used before 1997, and the "single list" RoOs, used since 1997. Second, using a Constant Elasticity of Transformation function, in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs.
The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA), under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Pr. Olivier Cadot and Pr. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis. First, using Poisson and Negative Binomial regressions, the count of AD measures' revocations is regressed on (inter alia) the count of "initiations" lagged five years. The analysis yields a coefficient on measures' initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases.
Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
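Step 3 of the MFC exercise described above (the trade-weighted average across tariff lines) can be sketched as follows; the tariff-line names, simulated MFC rates and trade values are made up purely for illustration:

```python
# Step 3, sketched: trade-weighted average of line-by-line simulated
# Maximum Foreign Content rates (expressed as fractions of good value).
# Lines, MFC rates and trade values below are hypothetical.
lines = [
    # (tariff line, simulated MFC, trade value in USD millions)
    ("chemicals",        0.35, 120.0),
    ("textiles_apparel", 0.10, 300.0),
    ("wood_products",    0.40,  80.0),
]

total_trade = sum(trade for _, _, trade in lines)
avg_mfc = sum(mfc * trade for _, mfc, trade in lines) / total_trade
print(round(avg_mfc, 3))  # 0.208
```

Setting every line to this single average rate is the "uniform MFC" thought experiment: lines whose simulated MFC sits below the average (here textiles & apparel) would be relaxed, lines above it (here wood products) stiffened.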

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this document is to present Iowa’s Adult Literacy Benchmark Analysis Report: Program Year 2002. The report is designed to provide a supplemental analysis of the information presented in Tables 5-19 (pp. 16-37) referenced in the publication titled Iowa's Adult Basic Education Program Annual Benchmark Report: Program Year 2002. The original data source for Tables 1-7 is from Iowa’s National Reporting System (NRS) report Tables 4B and 5 and the publication titled Iowa’s Community College Basic Literacy Skills Credential Program: Program Year 2002. (See Appendix B of Iowa’s Adult Basic Education Program Annual Benchmark Report: Program Year 2002, [pp. 54-55] and Iowa’s Community College Basic Literacy Skills Credential Program Annual Report: Program Year 2002 Tables 1-2 [pp. 6-7]).

Relevance: 30.00%

Publisher:

Abstract:

An exploratory and descriptive study based on quantitative and qualitative methods, analyzing the phenomenon of violence against adolescents through gender and generational categories. The data sources were reports of violence from the Curitiba Protection Network from 2010 to 2012 and semi-structured interviews with 16 sheltered adolescents. Quantitative data were analyzed using SPSS software version 20.0, and the qualitative data were subjected to content analysis. The adolescents experienced violence both in the household and outside of the family environment, as victims or witnesses of violence. The violence was experienced at home, mostly toward girls, with marked overtones of gender violence. More than indicating the magnitude of the issue, this study can provide information to help qualify the assistance given to victimized people and address how to face this issue.

Relevance: 30.00%

Publisher:

Abstract:

Around 11.5 × 10⁶ m³ of rock detached from the eastern slope of the Santa Cruz valley (San Juan province, Argentina) in the first fortnight of January 2005. The rockslide–debris avalanche blocked the course, resulting in the development of a lake with a maximum length of around 3.5 km. The increase in the inflow rate, from 47,000–74,000 m³/d between April and October to 304,000 m³/d between late October and the first fortnight of November, accelerated the growth rate of the lake. On 12 November 2005 the dam failed, releasing 24.6 × 10⁶ m³ of water. The resulting outburst flood caused damage mainly to infrastructure, and affected the facilities of a hydropower dam which was under construction 250 km downstream from the source area. In this work we describe the causes and consequences of the natural dam formation and failure, and we dynamically model the 2005 rockslide–debris avalanche with DAN3D. Additionally, as a volume of ~24 × 10⁶ m³ of rock still remains unstable on the slope, we use the results of the back analysis to forecast the formation of a future natural dam. We analyzed two potential scenarios: a partial slope failure of 6.5 × 10⁶ m³ and a worst case where all the unstable volume remaining on the slope fails. The spreading of these potential events shows that a new blockage of the Santa Cruz River is likely to occur. According to their modeled morphometry and the contributing watershed upstream of the blockage area, the dams, like the one of 2005, would also be unstable. This study shows the importance of back and forward analyses, which can be carried out to obtain critical information for land use planning, hazard mitigation, and emergency management.
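A back-of-envelope check of the reported figures: at each stated inflow rate, the time needed to accumulate a volume equal to the 24.6 × 10⁶ m³ released at failure (deliberately ignoring evaporation, seepage and water already stored) works out as:

```python
# Illustrative arithmetic only, using the inflow rates and release
# volume quoted in the abstract; no hydrological model is implied.
released_m3 = 24.6e6
for inflow_m3_per_day in (47_000, 74_000, 304_000):
    days = released_m3 / inflow_m3_per_day
    print(f"{inflow_m3_per_day:>7,} m^3/d -> {days:5.0f} days")
```

The roughly four-fold jump in inflow in late October is what compressed the filling timescale from over a year to a couple of months, consistent with the accelerated lake growth the abstract describes.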

Relevance: 30.00%

Publisher:

Abstract:

This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
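The paper's point can be illustrated with a toy "island problem" computation under deliberately simple assumptions (N equally likely potential sources, a database holding n of them, random-match probability g; the function names and numbers are ours, not the paper's):

```python
# Toy model: the suspect is found by a database search and matches the
# crime stain; the other n-1 database members are excluded by their
# differing analytical characteristics.
def posterior_source_after_search(N, n, g):
    # Posterior probability that the suspect is the source. The factor
    # (1-g)**(n-1) for the n-1 exclusions is common to all surviving
    # hypotheses and cancels; only the N-n people outside the database
    # remain as alternatives, each matching with probability g.
    return 1.0 / (1.0 + (N - n) * g)

def posterior_single_suspect(N, g):
    # "Probable cause" case: one suspect tested, no database search,
    # so N-1 alternative sources remain.
    return 1.0 / (1.0 + (N - 1) * g)

N, n, g = 10_000, 1_000, 1e-3
p_search = posterior_source_after_search(N, n, g)
p_single = posterior_single_suspect(N, g)
print(p_search > p_single)  # True
```

Under this model the exclusions slightly strengthen, rather than weaken, the case against the suspect, which is why no division of the likelihood ratio by the database size is called for, in line with the paper's conclusion.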

Relevance: 30.00%

Publisher:

Abstract:

Polybia scutellaris (White, 1841) is a social wasp of biological interest for its role as a pollinator and possibly as a biological control agent of sanitary and agricultural pests. This study examines the digestive tract contents of the larvae of P. scutellaris from four nests in Magdalena (Buenos Aires province, Argentina). The contents included both animal (arthropod parts) and plant (pollen, leaf and fruit epidermis) material. The pollen content analysis showed that the wasps visited 19 different plant taxa during the last active period of the colony before the nests were collected. The range of sources used by P. scutellaris allows us to characterize the species as a generalist flower visitor. Wasps visited both native and exotic plants located near the nest. Most of the epidermal plant remains found in the larval digestive tract belonged to Malvaceae, a family not exploited by the studied colonies as a pollen source.

Relevance: 30.00%

Publisher:

Abstract:

The Ney is an end-blown flute which is mainly used for Makam music. Although from the beginning of the 20th century a score representation based on extending Western music notation has been used, because of its rich articulation repertoire actual Ney music cannot be totally represented by a written score. The Ney is still taught and transmitted orally in Turkey. Because of that, the performance has a distinct and important role in Ney music. Therefore, signal analysis of Ney performances is crucial for understanding the actual music. Another important aspect, which is also a part of the performance, is the articulations that performers apply. In Makam music in Turkey, none of the articulations are taught or even named by teachers. Articulations in Ney are valuable for understanding the real performance. Since articulations are not taught and their places are not marked in the score, the choice and character of the articulations are unique to each performer, which also makes each performance unique. Our method analyzes audio files of well-known Turkish Ney players. In order to obtain our analysis data, we analyzed audio files of 8 different performers, ranging from 1920 to 2000.

Relevance: 30.00%

Publisher:

Abstract:

Accurate determination of subpopulation sizes in bimodal populations remains problematic, yet it represents a powerful way by which cellular heterogeneity under different environmental conditions can be compared. So far, most studies have relied on qualitative descriptions of population distribution patterns, on population-independent descriptors, or on arbitrary placement of thresholds distinguishing biological ON from OFF states. We found that all these methods fall short of accurately describing small population sizes in bimodal populations. Here we propose a simple, statistics-based method for the analysis of small subpopulation sizes for use in the free software environment R and test this method on real as well as simulated data. Four so-called population splitting methods were designed with different algorithms that can estimate subpopulation sizes from bimodal populations. All four methods proved more precise than previously used methods when analyzing subpopulation sizes of transfer-competent cells arising in populations of the bacterium Pseudomonas knackmussii B13. The methods' resolving powers were further explored by bootstrapping and simulations. Two of the methods were not severely limited by the proportions of subpopulations they could estimate correctly, but the two others only allowed accurate subpopulation quantification when this amounted to less than 25% of the total population. In contrast, only one method was still sufficiently accurate with subpopulations smaller than 1% of the total population. This study proposes a number of rational approximations to quantifying small subpopulations and offers an easy-to-use protocol for their implementation in the open source statistical software environment R.
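As a sketch of a statistics-based (rather than threshold-based) subpopulation estimate, here is a tiny two-component Gaussian mixture fitted by EM; this is written in Python rather than the paper's R environment, and all distribution parameters are invented for illustration:

```python
import math
import random

random.seed(1)
# Simulated bimodal single-cell signal: an 8% "ON" subpopulation over
# an "OFF" majority (hypothetical parameters).
data = ([random.gauss(1.0, 0.3) for _ in range(920)] +
        [random.gauss(4.0, 0.5) for _ in range(80)])

def on_fraction_em(x, iters=50):
    """Estimate the weight of the high-mean component of a 1-D
    two-component Gaussian mixture with a small EM loop."""
    mu1, mu2 = min(x), max(x)
    s1 = s2 = (max(x) - min(x)) / 4
    w = 0.5                                  # weight of the high component
    for _ in range(iters):
        # E-step: responsibility of the high component for each point.
        r = []
        for xi in x:
            p1 = (1 - w) * math.exp(-(xi - mu1) ** 2 / (2 * s1 ** 2)) / s1
            p2 = w * math.exp(-(xi - mu2) ** 2 / (2 * s2 ** 2)) / s2
            r.append(p2 / (p1 + p2 + 1e-300))  # guard against underflow
        # M-step: re-estimate weight, means and standard deviations.
        n2 = sum(r)
        n1 = len(x) - n2
        w = n2 / len(x)
        mu2 = sum(ri * xi for ri, xi in zip(r, x)) / n2
        mu1 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / n1
        s2 = math.sqrt(sum(ri * (xi - mu2) ** 2 for ri, xi in zip(r, x)) / n2) or 1e-6
        s1 = math.sqrt(sum((1 - ri) * (xi - mu1) ** 2 for ri, xi in zip(r, x)) / n1) or 1e-6
    return w

w_on = on_fraction_em(data)
# w_on should land near the true ON fraction of 0.08
```

Unlike an arbitrary ON/OFF threshold, the mixture weight is estimated from the data themselves, which is the kind of approach the population-splitting methods formalize.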

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: HIV-1 post-exposure prophylaxis (PEP) is frequently prescribed after exposure to source persons with an undetermined HIV serostatus. To reduce unnecessary use of PEP, we implemented a policy including active contacting of source persons and the availability of free, anonymous HIV testing ('PEP policy'). METHODS: All consultations for potential non-occupational HIV exposures (i.e. outside the medical environment) were prospectively recorded. The impact of the PEP policy on PEP prescription and costs was analysed and modelled. RESULTS: Among 146 putative exposures, 47 involved a source person already known to be HIV positive and 23 had no indication for PEP. The remaining 76 exposures involved a source person of unknown HIV serostatus. Of 33 (43.4%) exposures for which the source person could be contacted and tested, PEP was avoided in 24 (72.7%), initiated and discontinued in seven (21.2%), and prescribed and completed in two (6.1%). In contrast, of 43 (56.6%) exposures for which the source person could not be tested, PEP was prescribed in 35 (81.4%), P < 0.001. Upon modelling, the PEP policy allowed a 31% reduction in the cost of managing exposures to source persons of unknown HIV serostatus. The policy was cost-saving for an HIV prevalence of up to 70% in the source population. The availability of all source persons for testing would have reduced costs by 64%. CONCLUSION: In the management of non-occupational HIV exposures, active contacting and free, anonymous testing of source persons proved feasible. This policy resulted in a decrease in the prescription of PEP, proved to be cost-saving, and presumably helped to avoid unnecessary toxicity and psychological stress.
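The cost argument can be sketched with the utilization figures from the abstract and hypothetical unit costs (the paper's actual cost inputs are not reproduced in the abstract, so the amounts below are assumptions for illustration only):

```python
# Hypothetical unit costs:
COST_TEST = 50          # contacting and rapid HIV testing of the source person
COST_PEP_FULL = 1000    # full 28-day PEP course with follow-up
COST_PEP_STOPPED = 250  # PEP started, then stopped once the source tests negative

def cost_per_exposure_untested():
    # Source unavailable: PEP prescribed at the observed rate of 35/43.
    return (35 / 43) * COST_PEP_FULL

def cost_per_exposure_tested():
    # Source tested: observed outcomes among the 33 tested exposures
    # (24 avoided, 7 started then discontinued, 2 completed).
    return COST_TEST + (7 * COST_PEP_STOPPED + 2 * COST_PEP_FULL) / 33

saving = 1 - cost_per_exposure_tested() / cost_per_exposure_untested()
print(saving > 0)  # True
```

However the unit costs are set within plausible ranges, the tested pathway stays cheaper because most tested sources turn out HIV negative and a full PEP course is then avoided entirely, which is the mechanism behind the modelled 31% saving.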