842 results for hidden borrowing


Relevance:

10.00%

Publisher:

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web; hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces give web users online access to myriad databases on the Web. To obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that existing surveys of the deep Web are predominantly based on the study of deep web sites in English. One can therefore expect the findings of these surveys to be biased, especially owing to a steady increase in non-English web content. Thus, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions rarely hold, mostly because of the large scale of the deep Web: for any given domain of interest there are simply too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources.
Unlike almost all existing approaches to the deep Web, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so because the interfaces of conventional search engines are themselves web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in e-commerce. Thus, automating the querying of search interfaces and the retrieval of data behind them is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts, and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
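As an illustration of the field-label extraction mentioned above, the following sketch (our own illustration, not code from the thesis; the class name FormFieldExtractor and the label-before-field pairing heuristic are assumptions) pulls user-facing fields and their nearby labels out of a plain HTML search form using only the Python standard library.

```python
# Minimal sketch (not the thesis' I-Crawler): extract input fields and their
# labels from a single HTML search form, using only the standard library.
# Field/label pairing is a naive "label appears just before the input"
# heuristic; the data model in the thesis is richer.
from html.parser import HTMLParser

class FormFieldExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_form = False
        self.in_label = False
        self.last_label = None
        self.fields = []          # list of (label, field name, field type)

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self.in_form = True
        elif tag == "label":
            self.in_label = True
        elif self.in_form and tag in ("input", "select", "textarea"):
            ftype = attrs.get("type", tag)
            if ftype not in ("hidden", "submit"):      # skip non-user-facing fields
                self.fields.append((self.last_label, attrs.get("name"), ftype))
                self.last_label = None

    def handle_endtag(self, tag):
        if tag == "form":
            self.in_form = False
        elif tag == "label":
            self.in_label = False

    def handle_data(self, data):
        if self.in_label and data.strip():
            self.last_label = data.strip()

sample = """<form action="/search"><label>Title</label><input name="q" type="text">
<label>Category</label><select name="cat"><option>Books</option></select>
<input type="submit" value="Go"></form>"""
parser = FormFieldExtractor()
parser.feed(sample)
print(parser.fields)   # [('Title', 'q', 'text'), ('Category', 'cat', 'select')]
```

A real deep web crawler would additionally have to handle JavaScript-generated forms and non-HTML interfaces, which is precisely what the abstract highlights as the I-Crawler's distinguishing capability.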

Relevance:

10.00%

Publisher:

Abstract:

The brain integrates multiple sensory inputs, including somatosensory and visual inputs, to produce a representation of the body. Spinal cord injury (SCI) interrupts the communication between brain and body, and the effects of this deafferentation on body representation are poorly understood. We investigated whether the relative weight of somatosensory and visual frames of reference for body representation is altered in individuals with incomplete or complete SCI (affecting lower-limb somatosensation), relative to controls. To study the influence of afferent somatosensory information on body representation, participants verbally judged the laterality of rotated images of feet, hands, and whole bodies (mental rotation task) in two different postures (participants' body parts were hidden from view). We found that (i) complete SCI disrupts the influence of postural changes on the representation of the deafferented body parts (feet, but not hands) and (ii) regardless of posture, whole-body representation progressively deteriorates in proportion to SCI completeness. These results demonstrate that the cortical representation of the body is dynamic, responsive, and adaptable to contingent conditions, in that the role of somatosensation is altered and partially compensated for by a change in the relative weight of somatosensory versus visual bodily representations.

Relevance:

10.00%

Publisher:

Abstract:

Wastewater-based epidemiology consists of acquiring relevant information about the lifestyle and health status of a population through the analysis of wastewater samples collected at the influent of a wastewater treatment plant. Although it is a very young discipline, it has experienced an astonishing development since its first application in 2005. The possibility of gathering community-wide information about drug use has been among its major fields of application. The wide resonance of the first results sparked the interest of scientists from various disciplines, and research has since broadened in innumerable directions. Although the approach has been praised as revolutionary, there was a need to critically assess its added value with regard to the existing indicators used to monitor illicit drug use. The main, and explicit, objective of this research was to evaluate the added value of wastewater-based epidemiology with regard to two particular, although interconnected, dimensions of illicit drug use. The first concerns the added value of the discipline from an epidemiological, or societal, perspective: to evaluate whether and how it complements our current picture of the extent of illicit drug use at the population level, and whether it can guide the planning of future prevention measures and drug policies. The second dimension is the criminal one, with a particular focus on the networks that develop around the large demand for illicit drugs. The goal here was to assess whether wastewater-based epidemiology, combined with indicators stemming from the epidemiological dimension, could provide additional clues about the structure of drug distribution networks and the size of their market. This research also had an implicit objective: establishing wastewater-based epidemiology at the Ecole des Sciences Criminelles of the University of Lausanne. This consisted of gathering the necessary knowledge about the collection, preparation, and analysis of wastewater samples and, most importantly, understanding how to interpret the acquired data and produce useful information. In the first phase of this research, it was possible to determine that ammonium loads, measured directly in the wastewater stream, could be used to monitor the dynamics of the population served by the wastewater treatment plant. Furthermore, it was shown that, over the long term, population dynamics did not have a substantial impact on consumption patterns measured through wastewater analysis. Focusing on methadone, for which precise prescription data were available, it was possible to show that reliable consumption estimates could be obtained via wastewater analysis. This made it possible to validate the selected sampling strategy, which was then used to monitor the consumption of heroin through the measurement of morphine. The latter, in combination with prescription and sales data, provided estimates of heroin consumption in line with other indicators. These results, combined with epidemiological data, highlighted the good correspondence between measurements and expectations and, furthermore, suggested that the dark figure of heroin users evading harm-reduction programs, and who would thus not be measured by conventional indicators, is likely limited. In the third part, a collaborative study aimed at extensively investigating geographical differences in drug use, wastewater analysis was shown to be a useful complement to existing indicators.
In particular for stigmatised drugs, such as cocaine and heroin, it helped decipher the complex picture derived from surveys and crime statistics. Globally, it provided relevant information to better understand the drug market, from both an epidemiological and a law enforcement perspective. The fourth part focused on cannabis and on the potential of combining wastewater and survey data to overcome some of their respective limitations. Using a hierarchical inference model, it was possible to refine current estimates of cannabis prevalence in the metropolitan area of Lausanne. Wastewater results suggested that the actual prevalence is substantially higher than existing figures, thus supporting the common belief that surveys tend to underestimate cannabis use. Although affected by several biases, the information collected through surveys made it possible to overcome some of the limitations linked to the analysis of cannabis markers in wastewater (i.e., stability and limited excretion data). These findings highlighted the importance and utility of combining wastewater-based epidemiology with existing indicators of drug use. Similarly, the fifth part of the research was centred on assessing the potential uses of wastewater-based epidemiology from a law enforcement perspective. Through three concrete examples, it was shown that results from wastewater analysis can be used to produce highly relevant intelligence, allowing drug law enforcement to assess the structure and operations of drug distribution networks and, ultimately, to guide decisions at the tactical and/or operational level. Finally, the potential of wastewater-based epidemiology for monitoring the use of harmful, prohibited, and counterfeit pharmaceuticals was illustrated through the analysis of sibutramine, and its urinary metabolite, in wastewater samples. The results of this research have highlighted that wastewater-based epidemiology is a useful and powerful approach with numerous fields of application. Faced with the complexity of measuring a hidden phenomenon like illicit drug use, it is a major addition to the panoply of existing indicators.
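As a rough illustration of the back-calculation that underlies such wastewater-derived consumption estimates, the sketch below applies the commonly used formula in which the daily load of a drug residue measured at the treatment plant inlet is scaled by a correction factor (molar mass ratio of parent drug to metabolite divided by the average excretion fraction) and normalised per 1,000 inhabitants. This is a generic textbook calculation, not the procedure or data of this research; all numbers are placeholders.

```python
# Minimal sketch of the standard wastewater-based epidemiology back-calculation.
# All numbers below are illustrative placeholders, not results from this thesis.

def consumption_per_1000(concentration_ng_l: float,
                         flow_l_per_day: float,
                         correction_factor: float,
                         population: int) -> float:
    """Return estimated drug consumption in mg/day per 1000 inhabitants.

    correction_factor = (molar mass of parent drug / molar mass of metabolite)
                        / average fraction of the dose excreted as that metabolite
    """
    load_mg_per_day = concentration_ng_l * flow_l_per_day / 1e6   # ng/L * L/day -> mg/day
    return load_mg_per_day * correction_factor / (population / 1000)

# Example with made-up values: 400 ng/L of benzoylecgonine (a cocaine metabolite),
# 30 million L/day of wastewater, a correction factor of ~2.33, 100,000 inhabitants.
print(round(consumption_per_1000(400, 30e6, 2.33, 100_000), 1),
      "mg/day per 1000 inhabitants")
```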

Relevance:

10.00%

Publisher:

Abstract:

Objective: To evaluate three-dimensional translational setup errors and residual errors in image-guided radiosurgery, comparing frameless and frame-based techniques, using an anthropomorphic phantom. Materials and Methods: We initially used specific phantoms for the calibration and quality control of the image-guided system. For the hidden target test, we used an Alderson Radiation Therapy (ART)-210 anthropomorphic head phantom, into which we inserted four 5-mm metal balls to simulate target treatment volumes. Computed tomography images were then taken with the head phantom properly positioned for frameless and frame-based radiosurgery. Results: For the frameless technique, the mean error magnitude was 0.22 ± 0.04 mm for setup errors and 0.14 ± 0.02 mm for residual errors, the combined uncertainties being 0.28 mm and 0.16 mm, respectively. For the frame-based technique, the mean error magnitude was 0.73 ± 0.14 mm for setup errors and 0.31 ± 0.04 mm for residual errors, the combined uncertainties being 1.15 mm and 0.63 mm, respectively. Conclusion: The mean values, standard deviations, and combined uncertainties showed no evidence of a significant difference between the two techniques when the ART-210 head phantom was used.
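For readers unfamiliar with how a three-dimensional translational error magnitude is obtained from per-axis shifts, the sketch below shows the usual Euclidean-norm calculation and its mean ± standard deviation. The shift values are made up for illustration and are not the study's measurements, and the study's combined-uncertainty calculation is not reproduced here.

```python
# Minimal sketch: 3D translational error magnitude from per-axis shifts
# (illustrative values, not the measurements reported in the study).
import statistics as st

def magnitude(dx: float, dy: float, dz: float) -> float:
    """Euclidean norm of the translational shift, in mm."""
    return (dx**2 + dy**2 + dz**2) ** 0.5

# Hypothetical per-target setup shifts (mm) for four hidden targets.
shifts = [(0.10, 0.15, 0.12), (0.05, 0.20, 0.08),
          (0.18, 0.07, 0.11), (0.09, 0.14, 0.16)]
mags = [magnitude(*s) for s in shifts]
print(f"mean error magnitude = {st.mean(mags):.2f} ± {st.stdev(mags):.2f} mm")
```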

Relevance:

10.00%

Publisher:

Abstract:

In the rubber hand illusion, tactile stimulation seen on a rubber hand that is synchronous with tactile stimulation felt on the hidden real hand can lead to an illusion of ownership over the rubber hand. This illusion has been shown to produce a temperature decrease in the hidden hand, suggesting that such illusory ownership produces disownership of the real hand. Here we apply immersive virtual reality (VR) to experimentally investigate this with respect to sensitivity to temperature change. Forty participants experienced immersion in a VR with a virtual body (VB) seen from a first-person perspective. For half the participants the VB was consistent in posture and movement with their own body, and for the other half it was inconsistent. Temperature sensitivity on the palm of the hand was measured before and during the virtual experience. The results show that temperature sensitivity decreased in the consistent condition compared with the inconsistent condition. Moreover, the change in sensitivity was significantly correlated with the subjective illusion of virtual arm ownership but modulated by the illusion of ownership over the full virtual body. This suggests that a full body ownership illusion results in a unification of the virtual and real bodies into one overall entity, with proprioception and tactile sensations on the real body integrated with the visual presence of the virtual body. The results are interpreted in the framework of a "body matrix" recently introduced into the literature.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Many species contain evolutionarily distinct groups that are genetically highly differentiated but morphologically difficult to distinguish (i.e., cryptic species). The presence of cryptic species poses significant challenges for the accurate assessment of biodiversity and, if unrecognized, may lead to erroneous inferences in many fields of biological research and conservation. RESULTS: We tested for cryptic genetic variation within the broadly distributed alpine mayfly Baetis alpinus across several major European drainages in the central Alps. Bayesian clustering and multivariate analyses of nuclear microsatellite loci, combined with phylogenetic analyses of mitochondrial DNA, were used to assess population genetic structure and diversity. We identified two genetically highly differentiated lineages (A and B) that had no obvious differences in regional distribution patterns and occurred in local sympatry. Furthermore, the two lineages differed in relative abundance, overall levels of genetic diversity, and patterns of population structure: lineage A was abundant, widely distributed, and had a higher level of genetic variation, whereas lineage B was less abundant, more prevalent in spring-fed tributaries than in glacier-fed streams, and restricted to high elevations. Subsequent morphological analyses revealed that traits previously regarded as intraspecific variation of B. alpinus in fact separate these two lineages. CONCLUSIONS: Taken together, our findings indicate that even common and apparently ecologically well-studied species may consist of reproductively isolated units with distinct evolutionary histories and likely different ecology and evolutionary potential. These findings emphasize the need to investigate hidden diversity even in well-known species to allow for appropriate assessment of biological diversity and conservation measures.

Relevance:

10.00%

Publisher:

Abstract:

Peer-reviewed

Relevance:

10.00%

Publisher:

Abstract:

Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and consequently the quality of such methods can be critically dependent on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always produce biologically relevant alignments. Therefore, there is a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. The comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was dramatically decreased, while the same level of accuracy was maintained. To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in profile hidden Markov models.
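To make the notion of a positional conservation score concrete, the sketch below computes a common Shannon-entropy-based score per alignment column. This is a generic illustration and does not reproduce the specific statistical measures developed in the thesis.

```python
# Minimal sketch of one common positional conservation measure (Shannon-entropy
# based); the measures developed in the thesis are not reproduced here.
import math
from collections import Counter

def column_conservation(column: str, alphabet_size: int = 20) -> float:
    """Return a conservation score in [0, 1] for one alignment column.

    1.0 = fully conserved (single residue), 0.0 = maximally variable.
    Gaps ('-') are simply ignored in this toy version.
    """
    residues = [c for c in column if c != "-"]
    if not residues:
        return 0.0
    counts = Counter(residues)
    n = len(residues)
    entropy = -sum((k / n) * math.log2(k / n) for k in counts.values())
    return 1.0 - entropy / math.log2(alphabet_size)

# Toy alignment: rows are sequences, so a column is the i-th character of each row.
alignment = ["MKV-LT", "MKI-LT", "MRVALS"]
scores = [column_conservation("".join(seq[i] for seq in alignment))
          for i in range(len(alignment[0]))]
print([round(s, 2) for s in scores])
```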

Relevance:

10.00%

Publisher:

Abstract:

Academic Career Paths. The early career phases of generalists in the fields of humanities, social science and education in the 1980s and 1990s. This doctoral thesis analyses how generalists with a master's degree became attached to the labour market in two different time periods, and how their career paths took shape in the first eight years following graduation. The thesis also analyses the channels of employment through which the generalists found work early in their careers. By generalists I refer to graduates of academic fields that do not qualify a person for a specific profession but rather offer a more general readiness for working life. I address two groups of generalist master's degree graduates of the University of Turku: one comprising graduates of the year 1985 and the other graduates of 1995. All subjects graduated in the field of humanities, social sciences or education. The data for the thesis come from a survey answered by 71 respondents from the 1985 group and 80 respondents from the 1995 group. I interpret the data through the theoretical approaches of changing working life, the model of normal employment, transitional labour markets, the linear life path, overlapping life courses, the hidden labour market, and social capital. The conclusion of the thesis is that the societal era is connected with the employment and career paths of academic generalists. Between the two groups there were differences especially in attachment to the labour market, in forms of employment (permanent full-time versus temporary jobs), and in employment channels. Compared with the situation of the 1985 group after graduation, the 1995 graduates more often became unemployed and/or employed in duties below their level of education. Their mobility was also greater and their contracts were often temporary, whereas the graduates of 1985 had been employed in more permanent positions. I demonstrate that the career paths of generalists can be categorized into five career types: steady-state, transitory, linear, unsteady, and diverging careers. Graduates of 1985 trod more stable paths than the 1995 group. The channels of employment they used were roughly equally divided between formal channels (e.g. newspaper advertisements and the employment office) and informal channels (e.g. personal contacts and unprompted search for work), whereas in the 1995 group employment happened through more varied channels and mostly through informal ones. Regardless of the year of graduation, the generalists' careers had already begun to evolve while they were still at university and working at the same time. The thesis shows how the model of normal employment has weakened and career paths have become unsteady as a consequence of temporary positions. Also evident, when moving from the 1980s towards the succeeding decade, is the rising significance of the hidden labour market and social capital in finding employment.

Relevance:

10.00%

Publisher:

Abstract:

Software faults are expensive and cause serious damage, particularly if discovered late or not at all. Some software faults tend to remain hidden. One goal of the thesis is to establish the status quo in the field of software fault elimination, since there are no recent surveys of the whole area. A basis for a structural framework is proposed for this unstructured field, paying attention to compatibility and to how studies can be found. Means of bug elimination are surveyed, including bug know-how, defect prevention and prediction, analysis, testing, and fault tolerance. The most common research issues for each area are identified and discussed, along with issues that do not get enough attention. Recommendations are presented for software developers, researchers, and teachers. Only the main lines of research are covered, and the main emphasis is on technical aspects. The survey was done by performing searches in the IEEE, ACM, Elsevier, and Inspec databases. In addition, a systematic search was done in a few well-known related journals over recent time intervals. Some other journals, some conference proceedings, and a few books, reports, and Internet articles were investigated as well. The following problems were found and solutions for them discussed. A common misunderstanding is that quality assurance consists of testing only, and many checks are done and some methods applied only in the late testing phase. Many types of static review are almost forgotten, even though they reveal faults that are hard to detect by other means. Other forgotten areas are knowledge of bugs, awareness of continuously repeated bugs, and lightweight means to increase reliability. Compatibility between studies is not always good, which also makes documents harder to understand. Some means, methods, and problems are considered method- or domain-specific when they are not. The field lacks cross-field research.

Relevance:

10.00%

Publisher:

Abstract:

The current reservation system of the Department of Information Technology's departmental library is based on reservations marked on paper slips. The aim is to make the lending and availability status of books clearer and the borrowing process easier. In this work, a reservation system is designed on top of the existing system for searching the book collection; however, all program code is rewritten so that the system is consistent. Book reservation and borrowing are added as new functions, as well as the possibility for an admin user to view loans and reservations. The system is tested continuously as the implementation progresses, and before deployment it is tested in its actual operating environment. The result of the work is a system that enables the reservation and borrowing of books and clarifies the lending status of the collection. In addition, this document briefly outlines possibilities for further development of the system.
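A minimal sketch of the kind of reservation and loan bookkeeping described above is given below; the class and method names are our own illustrative choices and are not taken from the implemented system.

```python
# Minimal sketch of reservation/loan bookkeeping (illustrative only;
# names and structure are not taken from the thesis).
from dataclasses import dataclass, field

@dataclass
class Book:
    book_id: int
    title: str
    borrowed_by: str | None = None               # None -> on the shelf
    reserved_by: list[str] = field(default_factory=list)

class Library:
    def __init__(self):
        self.books: dict[int, Book] = {}

    def add(self, book: Book) -> None:
        self.books[book.book_id] = book

    def borrow(self, book_id: int, user: str) -> bool:
        book = self.books[book_id]
        if book.borrowed_by is None:
            book.borrowed_by = user
            return True
        book.reserved_by.append(user)             # queue a reservation instead
        return False

    def return_book(self, book_id: int) -> None:
        book = self.books[book_id]
        # Hand the book to the first person in the reservation queue, if any.
        book.borrowed_by = book.reserved_by.pop(0) if book.reserved_by else None

    def loans_and_reservations(self):
        """Admin view: current borrower and reservation queue per book."""
        return {b.title: (b.borrowed_by, list(b.reserved_by))
                for b in self.books.values()}

lib = Library()
lib.add(Book(1, "Introduction to Algorithms"))
lib.borrow(1, "alice"); lib.borrow(1, "bob")
print(lib.loans_and_reservations())   # {'Introduction to Algorithms': ('alice', ['bob'])}
lib.return_book(1)
print(lib.loans_and_reservations())   # {'Introduction to Algorithms': ('bob', [])}
```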

Relevance:

10.00%

Publisher:

Abstract:

The city of Tarragona houses an important architectural heritage, mainly from its past as 'Tarraco', capital of the Roman province of Hispania Citerior, but also from its medieval and late 19th-century history. The archaeological ensemble of Tarraco was inscribed as a UNESCO World Heritage Site in 2000, but although many efforts have been devoted by archaeologists and historians to unveiling and understanding the history and appearance of the Roman city, many aspects remain unknown. This is largely caused by the absence of a coherent body of historiographical material, which is today scattered across several institutions, and, especially, by the lack of precise and useful graphical representations of the remains and of the existing city that would allow in-depth analysis and interpretation of future findings. In recent years, researchers from the Catalan Institute of Classical Archaeology (ICAC) and the Architecture School of the URV (ETSA) have teamed up to produce comprehensive, detailed graphic materials, including a new set of plans and sections of the old city, of the grandiose areas of representation of the provincial capital, and of the hidden structures beneath the city's surface. These have been produced with the latest technologies (photogrammetry, laser scanning) but also with traditional methods (measurement, topography), on top of a mixture of existing materials (hand-drafted cartography from municipal master plans) and of historical and archaeological documentation.

Relevance:

10.00%

Publisher:

Abstract:

An integrated geophysical survey was conducted in September 2007 at the Cathedral of Tarragona (Catalonia, NE Spain) to search for archaeological remains of the Roman temple dedicated to the Emperor Augustus. Many hypotheses about its location have been put forward, the most recent ones suggesting it could lie inside the present cathedral. Tarragona's Cathedral, one of the most famous churches in Spain (12th century), was built during the transition from the Romanesque to the Gothic style. As its area is rather large, direct digging to detect hidden structures would be expensive and would also interfere with religious services. Consequently, the use of detailed non-invasive analyses was preferred. A project combining electrical resistivity tomography (ERT) and ground-probing radar (GPR) was planned for a year and conducted during a week of intensive field survey. Both ERT and GPR provided detailed information about subsoil structures. Different ERT techniques and arrays were used, ranging from standard Wenner-Schlumberger 2D sections to full 3D electrical imaging using the MYG array. Electrical resistivity data were recorded extensively, yielding many thousands of apparent resistivity points from which a complete 3D image was obtained after full inversion. The geophysical results were clear enough to persuade the archaeologists to excavate the area, and the excavation confirmed the geophysical interpretation. In conclusion, the significant buried structures revealed by geophysical methods under the cathedral were confirmed by recent archaeological digging as the basement of the impressive Roman temple that headed the Provincial Forum of Tarraco, seat of the Concilium of the Hispania Citerior province.
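For context on the apparent resistivity points mentioned above, the sketch below evaluates the textbook apparent resistivity formula for a Wenner array; the readings are invented for illustration, and the survey's actual Wenner-Schlumberger/MYG acquisition and 3D inversion are far more involved.

```python
# Minimal sketch: apparent resistivity for a Wenner array (textbook formula);
# readings below are made up, not data from the Tarragona survey.
import math

def wenner_apparent_resistivity(spacing_m: float, voltage_v: float, current_a: float) -> float:
    """rho_a = 2 * pi * a * (V / I), in ohm-metres, for electrode spacing a."""
    return 2 * math.pi * spacing_m * voltage_v / current_a

# Made-up readings at increasing electrode spacings (larger a senses deeper).
readings = [(1.0, 0.210, 0.05), (2.0, 0.095, 0.05), (4.0, 0.041, 0.05)]
for a, v, i in readings:
    print(f"a = {a:>3} m  ->  rho_a = {wenner_apparent_resistivity(a, v, i):6.1f} ohm-m")
```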

Relevance:

10.00%

Publisher:

Abstract:

The research assesses the skills of upper comprehensive school pupils in history. The focus is on locating personal motives, assessing wider reasons hidden in historical sources, and evaluating source reliability. The research also examines how extensive use of multiple sources affects pupils' holistic understanding of historical phenomena. The participants were a multicultural group of pupils whose cultural origins can be traced to the Balkans, the Middle East, Asia, and Europe. The numbers of native Finnish speakers and of pupils speaking Finnish as their second language were almost equal. The multicultural composition provides opportunities to assess how culturally responsive learning history from sources is. The intercultural approach to learning in a multicultural setting emphasizes equality as a precondition for learning. In order to make the assignments at least to some extent comparable for all participants, only answers produced by pupils who had studied history for a similar period of time in the Finnish comprehensive school system were taken into account. Due to the small number of participants (41), the study avoids wide generalizations. Nevertheless, possible cultural blueprints in the pupils' ways of thinking are noted. The first test examined pupils' skills in finding motives for emigration. The results showed that for 7th graders finding reasons is not a problematic task; however, the number of reasons noticed and the justifications given varied. In addition, the way the pupils explained their choices was a distinguishing factor. Some pupils interpreted the source material by making use of previous knowledge on the issue, while other pupils based their analysis solely on the text handed out and did not try to add their own knowledge. Answers were divided into three categories: historical, explanatory, and stating. Historical answers smoothly combined previously learned historical knowledge with the pupil's own source analysis; explanatory answers often ignored the wider frame, although they were effective when explaining, for example, historical concepts; stating answers merely noted motives in the sources and made no attempt to explain them historically. Was the first test culturally responsive? Pupils from all the cultures represented tackled the first source exam successfully, but there were some signs that historical concepts are understood slightly differently if the pupil's personal history has no link to the concepts under scrutiny. The second test focused on the history of Native Americans. The test first required pupils to recognize whether short source extracts (5) were written by Native Americans or by white Americans. Based on what they had already learned about North American history, the pupils did not find it hard to distinguish between the sources. The analysis of the multiphase causes and consequences of the disputes between Native Americans and white Americans produced more varied responses among pupils. Using two historical sources and combining historical knowledge from both of them simultaneously was cumbersome for many. The explanations of consequences can be divided into two groups: those emphasizing short-term consequences and those placing emphasis on long-term consequences. The short-term approach was mainly followed by boys in every group, while the girls mainly paid attention to long-term consequences. The result suggests that historical knowledge in sources is, at least to some extent, read through role and gender lenses.
The third test required pupils to explain in their own words how the three sources given differed in their accounts of living conditions in Nazi Germany, which turned out to be demanding for many pupils. The pupils' strength was rather the assessment of source reliability and of the reasons why the sources approached the same events differently. All participants wrote critical and justified comments on reliability and on aspects that might have affected the content of the sources. The pupils felt that the main factors affecting source reliability were the authors' ethnic background, nationality, and profession. The assessment showed that pupils were well aware that one's position in a historical situation has an impact on historical accounts, but in certain cases the victim's account was seen as the historical truth. The account of events by a historian was chosen most often as the most reliable source, but this choice was often justified loosely by reference to professionalism rather than by clear ideas of how historians construct accounts based on sources. In brief, the last source test demonstrates that pupils hold a strong idea that ethnicity or nationality determines how people explain events of the past. It also implies that pupils understand that historical knowledge is interpretative. The results further imply that history can be analyzed from a neutral perspective: membership in an ethnic or religious group does not automatically mean that a person's cultural identity excludes historical explanations if something in them contradicts his or her identity. The second method of extracting knowledge of pupils' historical thinking was an essay analysis. The analysis shows that an analytical account of complicated political issues, which often involve a great number of complicated political concepts, is more likely to lead to an inconsistent structure in pupils' written work. The material also demonstrates that pupils have a strong tendency to take a critical stance when assessing history. Historical empathy in particular is shown when history is somehow linked to young people, children, or minorities. Some topics can also awaken strong feelings, especially among pupils with an immigrant background, if there is a link between one's personal history and the history taught at school; occasionally a pupil's own historical experience or thoughts replaced school history. Using sources during history lessons at school seems to have many advantages: it enhances pupils' reasoning skills and their ability to assess the nature of historical knowledge. Thus one of the main aims and a great benefit of source work is to encourage pupils to express their own ideas and opinions. To conclude, when assessing the skills of adolescents in history - their work with sources, their comments on history, their historical knowledge, and finally their historical thinking - one should be cautious and avoid cut-off score evaluations. One purpose of pursuing history with sources is to encourage pupils to think independently, which is a useful tool for further identity construction. The idea that pupils have the right to make their own interpretations of history can be partially understood as part of a wider learning process in which the justification for studying history comes from extrinsic reasons. The intrinsic reason is history itself: in order to understand history, one should have a basic understanding of history as a specific domain of knowledge.
Using sources does not mean that knowing history is of secondary importance. Only a balance between knowing the contextual history, understanding the basic key concepts, and working with sources provides a solid basis for improving pupils' historical understanding.