912 results for Applied linguistics. Discourse Processing. Understanding. Narratives. EJA
Abstract:
In the present work, microstructure improvement using FSP (Friction Stir Processing) is studied. In the first part of the work, the microstructure improvement of as-cast A356 is demonstrated. Tensile tests were performed to check the increase in ductility; however, the expected results could not be achieved. In the second part, the microstructure improvement of a fusion weld in 1050 aluminium alloy is presented. Hardness tests were carried out to prove the mechanical property improvements. In the third and last part, the microstructure improvement of 1050 aluminium alloy is achieved, and the mechanical property improvements induced by FSP are discussed. The influence of tool traverse speed on microstructure and mechanical properties is also discussed; hardness tests and recrystallization theory enabled us to determine this influence.
Abstract:
THESIS ABSTRACT: Low-temperature thermochronology relies on the application of radioisotopic systems whose closure temperatures are below the temperatures at which the dated phases are formed. In that sense, the results are interpreted as "cooling ages" in contrast to "formation ages". Owing to the low closure temperatures, it is possible to reconstruct exhumation and cooling paths of rocks during their residence at shallow levels of the crust, i.e. within the first ~10 km of depth. Processes occurring at these shallow depths, such as final exhumation, faulting and relief formation, are fundamental for the evolution of mountain belts. This thesis aims at reconstructing the tectono-thermal history of the Aar massif in the Central Swiss Alps by means of zircon (U-Th)/He, apatite (U-Th)/He and apatite fission track thermochronology. The strategy involved acquisition of a large number of samples from a wide range of elevations in the deeply incised Lötschen valley and a nearby NEAT tunnel. This unique location allowed us to precisely constrain the timing, amount and mechanisms of exhumation of the main orographic feature of the Central Alps, to evaluate the role of topography in the thermochronological record, and to test the impact of hydrothermal activity. Samples were collected from altitudes ranging between 650 and 3930 m and were grouped into five vertical profiles on the surface and one horizontal profile in the tunnel. Where possible, all three radiometric systems were applied to each sample. Zircon (U-Th)/He ages range from 5.1 to 9.4 Ma and are generally positively correlated with altitude. Age-elevation plots reveal a distinct break in slope, which translates into an exhumation rate increasing from ~0.4 to ~3 km/Ma at 6 Ma. This acceleration is independently confirmed by increased cooling rates on the order of 100°C/Ma, constrained on the basis of age differences between the zircon (U-Th)/He and the remaining systems.
Apatite fission track data also plot on a steep age-elevation curve, indicating rapid exhumation until the end of the Miocene. The 6 Ma event is interpreted as reflecting tectonically driven uplift of the Aar massif. The late Miocene timing implies that the increase of precipitation in the Pliocene did not trigger rapid exhumation in the Aar massif. The Messinian salinity crisis in the Mediterranean could not directly intensify erosion of the Aar, but the associated erosional output from the entire Alps may have tapered the orogenic wedge and caused reactivation of thrusting in the Aar massif. The high exhumation rates in the Messinian were followed by a decrease to ~1.3 km/Ma, as evidenced by ~8 km of exhumation during the last 6 Ma. The slowing of exhumation is also apparent from apatite (U-Th)/He age-elevation data in the northern part of the Lötschen valley, where they plot on a ~0.5 km/Ma line and range from 2.4 to 6.4 Ma. However, the apatite (U-Th)/He and fission track data from the NEAT tunnel indicate a perturbation of the record. The apatite ages are youngest under the axis of the valley, in contrast to the expected pattern, in which they would be youngest in the deepest sections of the tunnel owing to heat advection into ridges. The valley, however, developed in relatively soft schists, while the ridges are built of solid granitoids. In line with hydrological observations from the tunnel, we suggest that the relatively permeable rocks under the valley floor served as conduits for geothermal fluids that caused reheating, leading to partial helium loss and fission track annealing in apatites. In consequence, apatite ages from the lowermost samples are too young and the calculated exhumation rates may underestimate true values. This study demonstrated that high-density sampling is indispensable for obtaining meaningful thermochronological data in the Alpine setting.
The multi-system approach allows the plausibility of the data to be verified and sources of perturbation to be highlighted. THESIS SUMMARY: Low-temperature thermochronology relies on the use of radiometric systems whose closure temperature is well below the crystallization temperature of the mineral. The results obtained are therefore interpreted as cooling ages, which differ from the formation ages obtained with other dating systems. Thanks to the low closure temperatures, the cooling and exhumation paths of rocks can be reconstructed for their residence in the shallow crust (down to about 10 km). The processes at play at these shallow depths, such as final exhumation, fracturing and faulting, and relief formation, are fundamental to the evolution of mountain belts. In recent years it has become clear that the thermochronological record in orogens can be influenced by relief and can be reset, after initial cooling, by heat advection linked to the circulation of geothermal fluids. The objective of this thesis is to reconstruct the tectono-thermal history of the Aar massif in the Central Swiss Alps using three thermochronometers: (U-Th)/He on zircon, (U-Th)/He on apatite, and fission tracks on apatite. To this end, we collected a large number of samples from different altitudes in the deeply incised Lötschental and from the NEAT tunnel. This sampling strategy allowed us to precisely constrain the timing, amounts and mechanisms of exhumation in this part of the Central Alps, to evaluate the role of topography in the thermochronological record, and to test the impact of hydrothermal activity on the geochronometers.
Samples were collected at altitudes between 650 and 3930 m along five vertical surface profiles and one profile in the tunnel. Where possible, all three radiometric systems were applied to each sample. The zircon (U-Th)/He ages lie between 5.1 and 9.4 Ma and correlate positively with altitude. Age-elevation plots show a clear break in slope, which translates into an increase in exhumation rate from 0.4 to 3 km/Ma at 6 Ma. This acceleration of exhumation is confirmed by cooling rates on the order of 100°C/Ma, obtained from the differences between the zircon ages and those of the other geochronological systems. The apatite fission track data likewise indicate rapid exhumation until the end of the Miocene. We interpret this event at 6 Ma as reflecting tectonic uplift of the Aar massif. Its late Miocene timing implies that the Pliocene increase in precipitation did not cause this rapid exhumation of the Aar massif. The Messinian salinity crisis in the Mediterranean could not have directly affected erosion of the Aar massif, but the erosion associated with it may have tapered the Alpine orogenic wedge and caused reactivation of thrusting in the Aar massif. The rapid Miocene exhumation was followed by a decrease in exhumation rates over the last 6 Ma (down to 1.3 km/Ma). However, the apatite (U-Th)/He ages and the apatite fission tracks of the tunnel samples record a perturbation of the pattern described above. The apatite ages are markedly younger under the axis of the valley than the expected age profile; indeed, younger ages would be expected under the deepest parts of the tunnel because of heat advection into the valley flanks.
The valley is carved into schists, whereas its flanks consist of harder granitoids. In agreement with hydrological observations from the tunnel, we suggest that the high permeability of the rocks under the valley axis allowed the infiltration of geothermal fluids, which reheated the rocks. This reheating would have induced helium loss and annealing of fission tracks in the apatites, resulting in rejuvenated apatite ages and an underestimation of exhumation rates under the valley axis. This study demonstrates the need for dense, precise sampling to obtain high-quality thermochronological data in the Alpine context. The multi-system approach allowed us to check the soundness of the data acquired and to identify possible sources of error in thermochronological studies. LAY SUMMARY: During an orogeny, rocks undergo a cycle comprising subduction, deformation, metamorphism and, finally, a return to the surface (exhumation). Exhumation results from deformation within the collision zone, which leads to shortening and thickening of the rock pile, expressed as uplift of rocks, creation of topography and erosion. Since erosion acts like a scraper on the upper part of the edifice, attempts have been made to correlate episodes of rapid exhumation with periods of intense erosion driven by climate change. Knowing precisely when and where exhumation occurred is of prime importance for any reconstruction of the evolution of a mountain belt. These constraints are obtained by tracing changes in rock temperature through time, which yields the cooling rate.
The moment at which rocks cooled through a given temperature is constrained by radiometric dating techniques. These methods rely on the decay of radioactive isotopes, such as uranium and potassium, both abundant in crustal rocks. The decay products are not retained in the host minerals until the rock has cooled below a so-called closure temperature, specific to each dating system. For example, the radioactive decay of uranium and thorium atoms produces helium atoms, which escape from a zircon crystal at temperatures above 200°C. By measuring the parent uranium content and the accumulated helium, and knowing the decay rate, it is possible to calculate when the sampled rock passed below 200°C. If the geothermal gradient is known, closure temperatures can be converted into depths (e.g. 200°C ≈ 7 km), and cooling rates into exhumation rates. Moreover, by dating vertically spaced samples with the same radiometric system, the exhumation rate of the sampled section can be constrained directly from the age differences between neighbouring samples. In the Swiss Alps, the Aar massif forms a major orographic structure. With summits above 4000 m and a spectacular relief of more than 2000 m, the massif dominates the central part of the mountain belt. The rocks now exposed at the surface were buried more than 10 km deep 20 Ma ago, but the present topography of the Aar massif appears to have developed mainly through active uplift over the last few million years, i.e. since the late Neogene.
This period includes an abrupt climate change that affected Europe about 5 Ma ago and brought heavy precipitation, very likely increasing erosion and accelerating exhumation of the Alps. In this study we used zircon (U-Th)/He dating, whose closure temperature of 200°C is low enough to characterise late Neogene/Pliocene exhumation. The samples come from the Lötschental and from the world's deepest railway tunnel (NEAT), in the western part of the Aar massif. Taken together, these samples span 3000 m of elevation and ages from 5.1 to 9.4 Ma. The higher-altitude (and thus older) samples document an exhumation rate of 0.4 km/Ma until 6 Ma ago, whereas the lowest samples have similar ages of 6 to 5.4 Ma, giving a rate of up to 3 km/Ma. These data show a dramatic acceleration of the exhumation of the Aar massif 6 Ma ago. The late Miocene exhumation of the massif therefore predates the Pliocene climate change. However, during the salinity crisis of 6-5.3 Ma (Messinian), the level of the Mediterranean Sea dropped by 3 km. Such a lowering of the erosional base level may have accelerated exhumation of the Alps, but the southern Alpine basin was too far from the Aar massif to influence its erosion. We conclude that (U-Th)/He dating precisely constrains the timing and exhumation of the Aar massif. As for the tectonics-versus-erosion debate, we suggest that in the case of the Aar massif tectonics predominates.
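The age-elevation method underlying this abstract can be illustrated with a short sketch: the slope of a least-squares fit of sample elevation against cooling age gives the apparent exhumation rate, and a break in that slope marks a change in rate. The values below are invented for illustration and are not the thesis data.

```python
# Estimate an apparent exhumation rate from zircon (U-Th)/He age-elevation pairs.
# Sample values below are illustrative, not the thesis data.
ages_ma = [9.0, 8.2, 7.4, 6.6]   # cooling ages (Ma), older at higher altitude
elev_km = [3.6, 3.3, 3.0, 2.7]   # sample elevations (km)

n = len(ages_ma)
mean_a = sum(ages_ma) / n
mean_e = sum(elev_km) / n
# Least-squares slope of elevation vs. age = apparent exhumation rate (km/Ma).
slope = sum((a - mean_a) * (e - mean_e) for a, e in zip(ages_ma, elev_km)) \
        / sum((a - mean_a) ** 2 for a in ages_ma)
rate_km_per_ma = abs(slope)
print(f"apparent exhumation rate: {rate_km_per_ma:.3f} km/Ma")
```

For these hypothetical samples the fitted slope is 0.375 km/Ma, the same order as the pre-6 Ma rate reported in the abstract; a steepening of the age-elevation line toward younger ages is what records the acceleration.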
Abstract:
Over thirty years ago, Leamer (1983), among many others, expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, and non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful for gaining insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis.
In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement to the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly by containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that allows one to learn about the relative importance of the two key margins of job search: reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative about reservation wage movements over the unemployment spell. The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies.
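The regression-discontinuity logic used in the second chapter can be sketched on synthetic data. The running variable, cutoff and effect size below are invented; this is not the thesis specification. The estimator fits a separate linear trend on each side of the entitlement cutoff and reads off the jump in the outcome at the threshold.

```python
import random

random.seed(0)

# Synthetic sharp discontinuity: outcome jumps by `true_jump` at the cutoff.
# All values are invented for illustration.
cutoff = 40.0
true_jump = 4.0
data = []
for _ in range(2000):
    x = random.uniform(30, 50)
    y = 20 + 0.3 * (x - cutoff) + (true_jump if x >= cutoff else 0.0) + random.gauss(0, 1)
    data.append((x, y))

def fit_line(points):
    """Ordinary least-squares slope and intercept for (x, y) pairs."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    b = sum((x - mx) * (y - my) for x, y in points) / sum((x - mx) ** 2 for x, _ in points)
    return b, my - b * mx

def rd_estimate(data, cutoff, bandwidth):
    """Fit a line on each side of the cutoff within the bandwidth and
    return the difference of the two fits evaluated at the cutoff."""
    left = [(x, y) for x, y in data if cutoff - bandwidth <= x < cutoff]
    right = [(x, y) for x, y in data if cutoff <= x <= cutoff + bandwidth]
    bl, al = fit_line(left)
    br, ar = fit_line(right)
    return (br * cutoff + ar) - (bl * cutoff + al)

jump = rd_estimate(data, cutoff, bandwidth=5.0)
print(f"estimated discontinuity: {jump:.2f}")
```

Narrowing the bandwidth reduces bias from curvature away from the cutoff but increases noise, which is exactly the precision-bias trade-off the thesis mentions for this setting.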
The third chapter, while thematically detached from the other chapters, addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency at the top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches, such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
Abstract:
Background: To enhance our understanding of complex biological systems like diseases, we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated, COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks, including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
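The kind of sub-network retrieval described above can be sketched generically. The toy triples and the traversal below are illustrative only; they do not represent BioXM's actual data model or API.

```python
# Toy knowledge base of typed (subject, relation, object) triples.
# All identifiers are invented placeholders, not actual BioXM content.
triples = [
    ("GENE:A", "interacts_with", "GENE:B"),
    ("GENE:B", "member_of", "PATHWAY:inflammation"),
    ("GENE:A", "associated_with", "DISEASE:COPD"),
    ("COMPOUND:X", "targets", "GENE:B"),
    ("GENE:C", "associated_with", "DISEASE:asthma"),
]

def subnetwork(triples, seed):
    """Return the triples connecting all entities reachable from `seed`."""
    adj = {}
    for s, _, o in triples:
        adj.setdefault(s, set()).add(o)
        adj.setdefault(o, set()).add(s)
    seen, stack = {seed}, [seed]
    while stack:                      # depth-first traversal of the graph
        for nb in adj.get(stack.pop(), ()):
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return [t for t in triples if t[0] in seen and t[2] in seen]

copd_net = subnetwork(triples, "DISEASE:COPD")
print(copd_net)
```

Querying from the disease node pulls in the connected gene-disease, protein-protein interaction, pathway and compound edges while leaving unrelated entities out, which is the behaviour the abstract attributes to the sub-network queries.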
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
In this article we present a hybrid approach to the automatic summarization of Spanish medical texts. There are many systems for automatic summarization based on statistics or linguistics, but only a few combine both techniques. Our idea is that to produce a good summary we need to use the linguistic aspects of texts, but we should also benefit from the advantages of statistical techniques. We have integrated the Cortex (Vector Space Model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We have compared these systems and afterwards integrated them in a hybrid approach. Finally, we have applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
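The hybrid idea, combining a statistical sentence score with a linguistic one, can be sketched as follows. This is an illustrative toy, not the actual Cortex, Enertex, Yate or Disicosum algorithms: the statistical score is the cosine similarity of a sentence to the document's term-frequency centroid, and the "linguistic" score is a stand-in cue (sentence length).

```python
import math
from collections import Counter

# Minimal extractive scorer combining a statistical score with a placeholder
# linguistic score. Sentences are invented examples.
doc = [
    "The patient presented with acute chest pain and shortness of breath.",
    "Medical history was unremarkable.",
    "An electrocardiogram revealed ST-segment elevation in the anterior leads.",
]

def tf(text):
    """Term-frequency vector of a text (lowercased, whitespace tokens)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity of two term-frequency vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

centroid = tf(" ".join(doc))

def score(sent, w_stat=0.7, w_ling=0.3):
    """Weighted combination of the statistical and 'linguistic' scores."""
    stat = cosine(tf(sent), centroid)
    ling = min(len(sent.split()), 25) / 25  # placeholder linguistic cue
    return w_stat * stat + w_ling * ling

best = max(doc, key=score)
print(best)
```

A real hybrid system would replace the placeholder linguistic cue with the discourse and terminological analysis the article describes, and select several top-ranked sentences rather than one.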
Abstract:
Migration-related issues have, since approximately 2000, been the object of increased attention at the international level. This has led, among other things, to the production of international narratives, which aim both at understanding migration and at proposing policy recommendations on how to address it, with the objective of improving the governance of migration at the global level. But this implies overcoming dilemmas stemming from the diverging interests of states and other actors (like NGOs and the private sector). This article examines the way in which international migration narratives address skilled migration, which is characterised by some of the clearest political trade-offs between stakeholders. It argues that these narratives attempt to speak to all parties and to reconcile contradictory arguments about what should be done, in order to discursively overcome policy dilemmas and create a consensus. While this is in line with the mandate of international organizations, it depoliticises migration issues.
Abstract:
This research investigates the phenomenon of translationese in two monolingual comparable corpora of original and translated Catalan texts. Translationese has been defined as the dialect, sub-language or code of translated language. This study aims at giving empirical evidence of translation universals regardless of the source language. Traditionally, research conducted on translation strategies has been mainly intuition-based. Computational Linguistics and Natural Language Processing techniques provide reliable information on lexical frequencies and on morphological and syntactic distributions in corpora. They have therefore been applied to observe which translation strategies occur in these corpora. The results seem to confirm the simplification, interference and explicitation hypotheses, whereas no sign of normalization has been detected with the methodology used. The data collected and the resources created for identifying lexical, morphological and syntactic patterns of translations can be useful for Translation Studies teachers, scholars and students: teachers will have more tools to help students avoid the reproduction of translationese patterns. The resources developed will help in detecting non-genuine or inadequate structures in the target language, which may imply an improvement in the stylistic quality of translations. Translation professionals can also take advantage of these resources to improve their translation quality.
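One of the corpus measures typically used to test the simplification hypothesis can be sketched briefly: the type-token ratio, which simplification predicts to be lower (less lexical variety) in translated text. The mini-texts below are invented for illustration; a real study would use large comparable corpora and many more indicators.

```python
# Compare a simple lexical-variety measure (type-token ratio) between an
# "original" and a "translated" mini-corpus. Texts are invented examples.
def type_token_ratio(text):
    """Distinct word forms (types) divided by total word count (tokens)."""
    tokens = text.lower().split()
    return len(set(tokens)) / len(tokens)

original   = "the storm scattered the fleet across reefs shoals and hidden currents"
translated = "the storm hit the ships and the ships were lost in the storm"

ttr_orig = type_token_ratio(original)
ttr_trans = type_token_ratio(translated)
print(f"original TTR: {ttr_orig:.2f}, translated TTR: {ttr_trans:.2f}")
```

A consistently lower ratio in the translated corpus, across many texts and alongside other indicators, is the kind of evidence the simplification hypothesis relies on.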
Abstract:
OBJECTIVE: To describe the phenomenon of child neglect and understand it in a gender context. METHOD: Retrospective, quantitative, and exploratory study that analyzed reports of violence received by a child and adolescent protection network in a Brazilian city. The theoretical and methodological basis applied was TIPESC (Theory of Nursing Praxical Intervention in Collective Health), with a gender emphasis. RESULTS: Neglect of children under the age of ten represents more than half the reports received over all the years studied; more boys than girls suffered neglect, and 41.4% of the reports of neglect involved children under the age of three; women were identified as being solely responsible in 67.9% of the incidents reported, and as accessories in 17.3%. CONCLUSION: Child neglect is a complex matter; the subordinate gender status imposed on these children and on their mothers, who are responsible for their care, underscores the social vulnerability of this group.
Abstract:
Medicine counterfeiting is a serious worldwide issue, involving networks of manufacture and distribution that are an integral part of industrialized organized crime. Despite the potentially devastating health repercussions involved, legal sanctions are often inappropriate or simply not applied. The difficulty of agreeing on a definition of counterfeiting, the huge profits made by the counterfeiters and the complexity of the market are the other main reasons for the extent of the phenomenon. Above all, international cooperation is needed to thwart the spread of counterfeiting. Moreover, effort is urgently required on the legal, enforcement and scientific levels. Pharmaceutical companies and agencies have developed measures to protect medicines and to allow fast and reliable analysis of suspect products. Several means, essentially based on chromatography and spectroscopy, are now at the disposal of analysts to enable the distinction between genuine and counterfeit products. However, the determination of the components and the use of analytical data for forensic purposes still constitute a challenge. The aim of this review article is therefore to point out the intricacy of medicine counterfeiting so that a better understanding can provide solutions to fight it more efficiently.
Abstract:
LEGISLATIVE STUDY – The 83rd General Assembly of the Iowa Legislature, in Senate File 2273, directed the Iowa Department of Transportation (DOT) to conduct a study of how to implement a uniform statewide system to allow for electronic transactions for the registration and titling of motor vehicles. PARTICIPANTS IN STUDY – As directed by Senate File 2273, the DOT formed a working group to conduct the study that included representatives from the Consumer Protection Division of the Office of the Attorney General, the Department of Public Safety, the Department of Revenue, the Iowa State County Treasurer’s Association, the Iowa Automobile Dealers Association, and the Iowa Independent Automobile Dealers Association. CONDUCT OF THE STUDY – The working group met eight times between June 17, 2010, and October 1, 2010. The group discussed the costs and benefits of electronic titling from the perspectives of new and used motor vehicle dealers, county treasurers, the DOT, lending institutions, consumers and consumer protection, and law enforcement. Security concerns, legislative implications, and implementation timelines were also considered. In the course of the meetings the group: 1. Reviewed the specific goals of S.F. 2273, and viewed a demonstration of Iowa’s current vehicle registration and titling system so participants that were not users of the system could gain an understanding of its current functionality and capabilities. 2. Reviewed the results of a survey of county treasurers conducted by the DOT to determine the extent to which county treasurers had processing backlogs and the extent to which county treasurers limited the number of dealer registration and titling transactions that they would process in a single day and while the dealer waited. 
Only eight reported placing a limit on the number of dealer transactions that would be processed while the dealer waited (with the number ranging from one to four), and only 11 reported a backlog in processing registration and titling transactions as of June 11, 2010, with most backlogs being reported in the range of one to three days. 3. Conducted conference calls with representatives of the American Association of Motor Vehicle Administrators (AAMVA) and representatives of three states -- Kansas, which has an electronic lien and titling (ELT) program, and Wisconsin and Florida, each of which has both an ELT program and an electronic registration and titling (ERT) program -- to assess current and best practices for electronic transactions. In addition, the DOT (through AAMVA) submitted a survey to all U.S. jurisdictions to determine how, if at all, other states implemented electronic transactions for the registration and titling of motor vehicles. Twenty-eight states responded to the survey; of the 28 states that responded, only 13 allowed liens to be added or released electronically, and only five indicated allowing applications for registration and titling to be submitted electronically. DOT staff also heard a presentation from South Dakota on its ERT system at an AAMVA regional meeting. The ELT information that emerged suggests that a multi-vendor approach, in which vendors that meet state specifications for participation are authorized to interface with the state's system and serve as a portal between lenders and the state system, will facilitate electronic lien releases and additions by offering lenders more choices and the opportunity to use the same vendor in multiple states. The ERT information that emerged indicates that a multi-interface approach, offering an interface with existing dealer management software (DMS) systems and through a separate internet site, will facilitate ERT by providing access that meets a variety of business needs and models.
In both instances, information that emerged indicates that, in the long term, adoption rates are positively affected by making participation above a certain minimum threshold mandatory. 4. To assess and compare functions or services that might be offered by or through a vendor, the group heard presentations from vendors that offer products or services that facilitate some aspect of ELT or ERT. 5. To assess the concerns, needs and interest of Iowa motor vehicle dealers, the group surveyed dealers about registration and titling difficulties experienced by dealers, the types of DMS systems (if any) used by dealers, and the dealers' interest in and preference for using an electronic interface to submit applications for registration and titling. Overall, 40% of the dealers that responded indicated interest and 57% indicated no interest, but interest was pronounced among new car dealers (75% were interested) and dealers with a high number of monthly transactions (85% of dealers averaging more than 50 sales per month were interested). The majority of dealers responding to the survey ranked delays in processing and problems with daily limits on transactions as "minor difficulty" or "no difficulty". RECOMMENDATIONS -- At the conclusion of the meetings, the working group discussed possible approaches for implementing electronic transactions in Iowa and reached a consensus recommending a phased implementation of electronic titling that addresses electronic lien and title transactions (ELT) and electronic funds transfers (EFT) first, and electronic applications for registration and titling (ERT) second. 
The recommendation of a phased implementation is based upon recognition that aspects of ELT and EFT are foundational to ERT, and that ELT and EFT solutions are more readily and easily attained than the ERT solution, which will take longer and be somewhat more difficult to develop, and which will require federal approval of an electronic odometer statement to fully implement. ELT – A multi-vendor approach is proposed for ELT. No direct costs to the state, counties, consumers, or dealers are anticipated under this approach. The vendor charges participating lenders user or transaction fees for the service, and it appears the lenders typically absorb those costs because of the savings offered by ELT. Existing staff can complete the programming necessary to interface the state system with vendors' systems. The estimated time to implement ELT is six to nine months. Mandatory participation is not recommended initially, but should be considered after ELT has been implemented and a suitable number of vendors have enrolled to provide a fair assessment of participation rates and opportunities. EFT – A previous attempt to implement ELT and EFT was terminated due to concern that it would negatively impact county revenues by reducing interest income earned on state funds collected by the county and held until the monthly transfer to the state. To avoid that problem in this implementation, the EFT solution should remain revenue-neutral to the counties by allowing fees submitted by EFT to be immediately directed to the proper county account. Because ARTS was designed with the capacity to accommodate EFT, a vendor is not needed to implement EFT. The estimated time to implement EFT is six to nine months. It is expected that EFT development will overlap ELT development. ERT – ERT itself must be developed in phases. 
It will not be possible to quickly implement a fully functioning, paperless ERT system, because federal law requires that transfer of title be accompanied by a written odometer statement unless approval for an alternate electronic statement is granted by the National Highway Traffic Safety Administration (NHTSA). It is expected that it will take as much as a year or more to obtain NHTSA approval, and that NHTSA approval will require design of a system that requires the seller to electronically confirm the seller's identity, make the required disclosure to the buyer, and then transfer the disclosure to the buyer, who must also electronically confirm the buyer's identity and electronically review and accept the disclosure to complete and submit the transaction. Given the time that it will take to develop and gain approval for this solution, initial ERT implementation will focus on completing and submitting applications and issuing registration applied for cards electronically, with the understanding that this process will still require submission of paper documents until an electronic odometer solution is developed. Because continued submission of paper documents undermines the efficiencies sought, "full" ERT (that is, a system in which all documents necessary for registration and titling can be approved and/or accepted by all parties and submitted without transmittal or delivery of duplicate paper documents) should remain the ultimate goal. ERT is not recommended as a means to eliminate review and approval of registration and titling transactions by the county treasurers, or to place registration and titling approval in the hands of the dealers, as county treasurers perform an important role in deterring fraud and promoting accuracy by determining the genuineness and regularity of each application. 
Authorizing dealers to act as registration agents that approve registration and title applications, issue registration receipts, and maintain and deliver permanent metal license plates is not recommended. Although distribution of permanent plates by dealers is not recommended, it is recommended that dealers participating in ERT generate and print registration applied for cards electronically. Unlike the manually issued cards currently in use, cards issued in this fashion may be queried by law enforcement and are less susceptible to misuse by customers and dealers. The estimated time to implement the electronic application and registration applied for cards is 12 to 18 months, to begin after ELT and EFT have been implemented. It is recommended that the focus during this time be on facilitating transfers through motor vehicle dealers, with initial deployment focused on higher-volume dealers that use DMS systems. In the long term an internet option for access to ERT must also be developed and maintained to allow participation by lower-volume dealers that do not use a DMS system. This option will also lay the groundwork for an ERT option for sales between private individuals. Mandatory participation in Iowa is not recommended initially. As with ELT, it is recommended that mandatory participation be considered after at least an initial phase of ERT has been implemented and a suitable number of dealers have enrolled to provide a fair assessment of participation rates and opportunities. 
The use of vendors to facilitate ERT is not initially proposed because 1) DOT IT support staff are capable of developing a system that will interact with DMS systems, and would still have to develop a dealer and public interface regardless of whether a vendor acts as an intermediary between the DMS systems and the state system, and 2) there is concern that the cost of a vendor-based system, which is funded by transaction-based payments from the dealer to the vendor, would be passed to the consumer in the form of additional documentation or conveyance fees. However, the DOT recommends flexibility on this point, as development and piloting of the system may indicate that a multi-vendor approach similar to that recommended for ELT may increase the adoption rate among larger dealers and may ultimately decrease the user management to be exercised by DOT staff. If vendors are used in the process, additional legislation or administrative rules may be needed to control the fees that may be passed to the consumer. No direct cost to the DOT or county treasurers is expected, as the DOT expects that it can complete the necessary programming with existing staff. Use of vendors to facilitate ERT transactions by dealers using DMS systems would result in transaction fees that may ultimately be passed to consumers. 
LEGISLATION – As a result of the changes implemented in 2004 under Senate File 2070, the only changes to Iowa statutes proposed are to section 321.69 of the Iowa Code, "Damage disclosure statement," and section 321.71, "Odometer requirements." In each instance, authority to execute these statements by electronic means would be clarified by authorizing language similar to that used in section 321.20, subsections "2" and "3," which allows for electronic applications and directs the department to "adopt rules on the method for providing signatures for applications made by electronic means." In these sections, the authorizing language might read as follows: Notwithstanding contrary provisions of this section, the department may develop and implement a program to allow for any statement required by this section to be made electronically. The department shall adopt rules on the method for providing signatures for statements made by electronic means. Some changes to DOT administrative rules will be useful, but only to enable changes to work processes that would be desirable in the long term. Examples of long-term work processes that would be enabled by rule changes include allowing for signatures created through electronic means and electronic odometer certifications. The DOT rules, as currently written, do not hinder the ability to proceed with ELT, EFT, and ERT.
Resumo:
Situated in the field of the analysis of temporal relations in texts, the present study is devoted to the notion of the future, defined as an anticipation of events to come in the narrative. This research thus sets out to shed light on the various mechanisms of anticipation specific to adventure narratives of the Middle Ages. Drawing on existing heuristic tools (the foundations of reception theory as represented in the works of Jauss, Eco and Greimas), this thesis focuses on the study of the prologue of the literary work, for which it develops a particular reading grid that takes into account the complexity of the notion of the future considered at the grammatical level (future verb forms), the rhetorical level (the figure of prolepsis) and the literary level (scenarios and isotopies). The close-reading approach to the prologue adopted in this work is applied first to Chrétien de Troyes's Chevalier au Lion, the founding text of the corpus, which constitutes, in the words of Philippe Walter, a true « drame du temps » ("drama of time"). The study of the mechanisms of anticipation set up in the prologue then continues in the chapters devoted to Claris et Laris and to Pierre Sala's Chevalier au Lion, two rewritings of the famous romance by the Champenois master. Dating respectively from the thirteenth and the beginning of the sixteenth century, these works make it possible to trace the path travelled by the future in its thematic and diachronic aspects, which is particularly conducive to identifying the criteria that influence the reader's expectations of events to come over the course of the centuries. Thus, alongside the "standard" means of expressing the future (intertextual scenarios and isotopies), the present research brings out other factors that influence anticipation, notably the procedure of the disputatio, the recourse to satire, the strategy of indirect commitment and the verse/prose opposition. 
The second part of the thesis, which deals on the one hand with the narratives devoted to the fairy Mélusine by Jean d'Arras and by Coudrette, and on the other with René d'Anjou's Livre du Cuer d'amours espris, serves to verify to what extent the chosen approach applies to works of the late Middle Ages that combine elements borrowed from the earlier tradition with elements of other origin. The analysis of Mélusine and of the Livre du Cuer leads to the addition of two further factors that influence the reader's expectations, namely the strategy of partial commitment and the recourse to the judicial genre. This study demonstrates that the treatment of the future is of capital importance for reading medieval texts, for it allows the reader, from the prologue onward, to recognize the ending toward which the narrative tends, and thereby to achieve an enriched reading, superior to others. Telling the Future in the Middle Ages: from the Prologue to the Narrative. From Chrétien de Troyes's Chevalier au Lion to Pierre Sala's Chevalier au Lion. Situated within the analysis of temporal relations in texts, the present study deals with the notion of the future, characterized as an anticipation of the events to come in the narrative. The purpose of this research is therefore to bring to light the various mechanisms of anticipation peculiar to adventure narratives of the Middle Ages. Making use of existing heuristic instruments (the bases of reception theory as presented in the works of Jauss, Eco and Greimas), this thesis is dedicated to the study of the prologue of the work of fiction by means of a particular reading key which takes into account the complexity of the notion of the future with regard to its grammatical side (future tenses), its rhetorical side (prolepsis) and its literary side (scenarios and isotopies). 
First of all, the close reading of the prologue used in this work is applied to Chrétien de Troyes's Chevalier au Lion (The Knight with the Lion), the founding text of our literary corpus, which represents, according to Philippe Walter, a real « drama of Time ». Then, the study of the mechanisms of anticipation in the prologue is carried over to the chapters devoted to Claris et Laris and to Pierre Sala's Chevalier au Lion, two romances that rewrite Chrétien de Troyes's famous work. Written, respectively, in the thirteenth century and at the beginning of the sixteenth century, these romances enable the reader to ascertain the changes in the manner of telling the future from a thematic and diachronic point of view; this is particularly conducive to identifying the criteria which influence the reader's expectations of future events over the course of the centuries. Therefore, alongside the "standard" means of expressing the future (intertextual scenarios and isotopies), the present study reveals other factors which influence anticipation, in particular the method of the disputatio, the use of satire, the strategy of indirect commitment and the verse/prose opposition. The second part of the thesis, which deals with the narratives concerning the fairy Melusine written by Jean d'Arras and by Coudrette on the one hand, and with René d'Anjou's Livre du Cuer d'amours espris on the other, is used to verify to what extent the chosen approach applies to works of fiction of the late Middle Ages that combine elements from the earlier tradition with elements of other origin. The analysis of the Melusine romances and of the Livre du Cuer brings us to add two new factors which influence the reader's expectations: the strategy of partial commitment, and the use of judicial discourse. 
This study demonstrates that the manner of telling the future is of the utmost importance for reading the texts of the Middle Ages, because it enables the reader to know the end of the story from the very beginning, from the prologue, thus leading to a richer, superior reading.
Resumo:
Sophisticated magnetic resonance tagging techniques provide powerful tools for the non-invasive assessment of local heart-wall motion, towards a deeper fundamental understanding of local heart function. A new image analysis procedure has been developed for the extraction of motion data from time series of magnetic resonance tagged images and for the visualization of local heart-wall motion. New parameters have been derived which allow quantification of the motion patterns and are highly sensitive to any changes in these patterns. The new procedure has been applied to heart motion analysis in healthy volunteers and in patient collectives with different heart diseases. The achieved results are summarized and discussed.
Resumo:
The region of greatest variability on soil maps is along the edges of their polygons, causing disagreement among pedologists about the appropriate description of soil classes at these locations. The objective of this work was to propose a strategy for data pre-processing applied to digital soil mapping (DSM). Soil polygons on a training map were shrunk by 100 and 160 m. This strategy prevented the use of covariates located near the edges of the soil classes in the Decision Tree (DT) models. Three DT models derived from eight predictive covariates, related to relief and organism factors, sampled on the original polygons of a soil map and on polygons shrunk by 100 and 160 m, were used to predict soil classes. The DT model derived from observations 160 m away from the edges of the polygons on the original map is less complex and has better predictive performance.
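The polygon-shrinking step described above amounts to a negative (inward) buffer that discards training samples near class boundaries. A minimal, self-contained sketch for a rectangular map unit (the 400 m x 400 m polygon and the sample coordinates are assumptions for illustration; real soil polygons would use a GIS negative-buffer operation, e.g. `buffer(-100)` in a geometry library):

```python
def distance_to_edge(x, y, xmin, ymin, xmax, ymax):
    """Distance (in metres) from an interior point to the nearest edge
    of a rectangular polygon given by its bounds."""
    return min(x - xmin, xmax - x, y - ymin, ymax - y)

def keep_sample(x, y, bounds, shrink=100.0):
    """Keep a covariate sample only if it lies at least `shrink` metres
    inside the polygon, mimicking a 100 m inward buffer."""
    return distance_to_edge(x, y, *bounds) >= shrink

bounds = (0.0, 0.0, 400.0, 400.0)  # hypothetical 400 m x 400 m map unit
print(keep_sample(50.0, 200.0, bounds))   # 50 m from the edge: discarded
print(keep_sample(200.0, 200.0, bounds))  # centre of the polygon: kept
```

Samples that survive the filter are the ones fed to the DT models; increasing `shrink` to 160.0 reproduces the second, more conservative training set.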
Resumo:
Introduction. Development of the fetal brain surface with concomitant gyrification is one of the major maturational processes of the human brain. First delineated by postmortem studies or by ultrasound, MRI has recently become a powerful tool for studying in vivo the structural correlates of brain maturation. However, the quantitative measurement of fetal brain development is a major challenge because of the movement of the fetus inside the amniotic cavity, the poor spatial resolution, the partial volume effect and the changing appearance of the developing brain. Today extensive efforts are made to deal with the "post-acquisition" reconstruction of high-resolution 3D fetal volumes based on several acquisitions with lower resolution (Rousseau, F., 2006; Jiang, S., 2007). We here propose a framework devoted to the segmentation of the basal ganglia, the gray-white tissue segmentation, and in turn the 3D cortical reconstruction of the fetal brain. Method. Prenatal MR imaging was performed with a 1-T system (GE Medical Systems, Milwaukee) using single shot fast spin echo (ssFSE) sequences in fetuses aged from 29 to 32 gestational weeks (slice thickness 5.4 mm, in-plane spatial resolution 1.09 mm). For each fetus, 6 axial volumes shifted by 1 mm were acquired (about 1 min per volume). First, each volume is manually segmented to extract the fetal brain from surrounding fetal and maternal tissues. Inhomogeneity intensity correction and linear intensity normalization are then performed. A high spatial resolution image of isotropic voxel size of 1.09 mm is created for each fetus as previously published by others (Rousseau, F., 2006). B-splines are used for the scattered data interpolation (Lee, 1997). Then, basal ganglia segmentation is performed on this super-reconstructed volume using an active contour framework with a Level Set implementation (Bach Cuadra, M., 2010). Once the basal ganglia are removed from the image, brain tissue segmentation is performed (Bach Cuadra, M., 2009). 
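The linear intensity normalization mentioned in the pipeline above can be sketched as a min-max rescaling of voxel intensities; a minimal illustration (the [0, 255] target range and the toy intensities are assumptions for demonstration, not the authors' parameters):

```python
def normalize_intensity(volume, new_min=0.0, new_max=255.0):
    """Linear intensity normalization: rescale voxel intensities so the
    volume spans [new_min, new_max], making the acquired volumes
    comparable before they are combined into one reconstruction."""
    lo, hi = min(volume), max(volume)
    if hi == lo:  # flat image: map every voxel to new_min
        return [new_min for _ in volume]
    scale = (new_max - new_min) / (hi - lo)
    return [new_min + (v - lo) * scale for v in volume]

# Toy "volume" flattened to a list of voxel intensities.
voxels = [10, 20, 30, 40]
print(normalize_intensity(voxels))  # [0.0, 85.0, 170.0, 255.0]
```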
The resulting white matter image is then binarized and given as input to the Freesurfer software (http://surfer.nmr.mgh.harvard.edu/) to provide accurate three-dimensional reconstructions of the fetal brain. Results. High-resolution images of the cerebral fetal brain, as obtained from the low-resolution acquired MRI, are presented for 4 subjects of age ranging from 29 to 32 GA. An example is depicted in Figure 1. Accuracy of the automated basal ganglia segmentation is compared with manual segmentation using measurement of Dice similarity (DSI), with values above 0.7 considered to be a very good agreement. In our sample we observed DSI values between 0.785 and 0.856. We further show the results of gray-white matter segmentation overlaid on the high-resolution gray-scale images. The results are visually checked for accuracy using the same principles as commonly accepted in adult neuroimaging. Preliminary 3D cortical reconstructions of the fetal brain are shown in Figure 2. Conclusion. We hereby present a complete pipeline for the automated extraction of an accurate three-dimensional cortical surface of the fetal brain. These results are preliminary but promising, with the ultimate goal to provide a "movie" of normal gyral development. In turn, a precise knowledge of normal fetal brain development will allow the quantification of subtle and early but clinically relevant deviations. Moreover, a precise understanding of the gyral development process may help to build hypotheses to understand the pathogenesis of several neurodevelopmental conditions in which gyrification has been shown to be altered (e.g. schizophrenia, autism...). References. Rousseau, F. (2006), 'Registration-Based Approach for Reconstruction of High-Resolution In Utero Fetal MR Brain Images', IEEE Transactions on Medical Imaging, vol. 13, no. 9, pp. 1072-1081. Jiang, S. 
(2007), 'MRI of Moving Subjects Using Multislice Snapshot Images With Volume Reconstruction (SVR): Application to Fetal, Neonatal, and Adult Brain Studies', IEEE Transactions on Medical Imaging, vol. 26, no. 7, pp. 967-980. Lee, S. (1997), 'Scattered data interpolation with multilevel B-splines', IEEE Transactions on Visualization and Computer Graphics, vol. 3, no. 3, pp. 228-244. Bach Cuadra, M. (2010), 'Central and Cortical Gray Matter Segmentation of Magnetic Resonance Images of the Fetal Brain', ISMRM Conference. Bach Cuadra, M. (2009), 'Brain tissue segmentation of fetal MR images', MICCAI.
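The Dice similarity index (DSI) used above to validate the automated basal ganglia segmentation against the manual one can be computed directly from two binary masks. A minimal sketch (the toy 1-D masks are assumptions standing in for flattened 3-D segmentations):

```python
def dice_similarity(mask_a, mask_b):
    """Dice similarity index between two binary segmentation masks:
    DSI = 2 * |A & B| / (|A| + |B|), from 0 (no overlap) to 1 (identical)."""
    a = {i for i, v in enumerate(mask_a) if v}
    b = {i for i, v in enumerate(mask_b) if v}
    if not a and not b:
        return 1.0  # two empty masks agree perfectly
    return 2.0 * len(a & b) / (len(a) + len(b))

# Toy 1-D masks standing in for flattened 3-D segmentations.
manual    = [1, 1, 1, 1, 0, 0, 0, 0]
automated = [1, 1, 1, 0, 1, 0, 0, 0]
print(dice_similarity(manual, automated))  # 2*3 / (4+4) = 0.75
```

A value of 0.75 would already exceed the 0.7 threshold the abstract treats as very good agreement; the reported values of 0.785 to 0.856 indicate closer overlap still.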