943 results for Digital terrain analysis
Abstract:
Spectrographic analysis of limestones as a possible method of correlating geologic formations is an altogether new line of investigation. As far as is known, the only previous work consists of a few analyses made by Fred Lines in his bachelor's thesis work at the Montana School of Mines in the spring of 1942.
Abstract:
Stable Isotope Ratio Analysis (SIRA) is the measurement of variation in the ratios of different isotopes of the same element in a material. This technique is well established in the natural sciences and has long been part of the methodological arsenal in fields such as geology and biology. More recently the technique has begun to be utilized in the social sciences, moving from initial applications in anthropology to potential uses in geography, public health, forensic science, and other fields. This presentation will discuss the techniques behind SIRA, examples of current applications in the natural and social sciences, and potential avenues of future research.
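The abstract does not spell out the underlying convention, but isotope ratios in SIRA are conventionally reported in delta notation: the ratio of the heavier to the lighter isotope in a sample, expressed relative to a reference standard in per mil. A minimal Python sketch, with a hypothetical sample ratio:

```python
# Delta notation for stable isotope ratios: the sample's isotope ratio is
# reported relative to an agreed standard, in per mil (parts per thousand).
VPDB_13C_12C = 0.0112372  # commonly cited 13C/12C ratio of the VPDB standard

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical sample ratio slightly below the standard:
print(f"d13C = {delta_per_mil(0.0112090, VPDB_13C_12C):+.2f} per mil")
# -> about -2.51 per mil, i.e. the sample is depleted in 13C
```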
Abstract:
Vegetation phenology is an important indicator of climate change and climate variability, and it is strongly connected to biospheric–atmospheric gas exchange. We aimed to evaluate the applicability of phenological information derived from digital imagery for the interpretation of CO2 exchange measurements. For the years 2005–2007 we analyzed the seasonal phenological development of two temperate mixed forests using tower-based imagery from standard RGB cameras. Phenological information was jointly analyzed with gross primary productivity (GPP) derived from net ecosystem exchange data. Automated image analysis provided reliable information on the vegetation developmental stages of beech and ash trees across all seasons. A phenological index derived from image color values was strongly correlated with GPP, with a significant mean time lag of several days for ash trees and several weeks for beech trees in early summer (May to mid-July). Leaf emergence dates for the dominant tree species partly explained the temporal behavior of spring GPP but were also masked by local meteorological conditions. We conclude that digital cameras at flux measurement sites not only provide an objective measure of the physiological state of a forest canopy at high temporal and spatial resolution, but also complement CO2 and water exchange measurements, improving our knowledge of ecosystem processes.
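The abstract does not name its color-based index; one widely used index in camera-based phenology studies is the green chromatic coordinate (GCC), shown below as an assumed stand-in for "a phenological index derived from image color values". The synthetic image is purely illustrative.

```python
# Green chromatic coordinate (GCC): the share of the green channel in the
# total RGB signal, a common greenness index for canopy camera imagery.
import numpy as np

def green_chromatic_coordinate(rgb_image: np.ndarray) -> float:
    """GCC = G / (R + G + B), from channel means over an HxWx3 RGB array
    (in practice, a region of interest covering the canopy)."""
    r, g, b = (rgb_image[..., i].astype(float).mean() for i in range(3))
    return g / (r + g + b)

# Synthetic 'spring' image with a strengthened green channel:
rng = np.random.default_rng(0)
img = rng.integers(60, 120, size=(100, 100, 3))
img[..., 1] += 60  # greener canopy -> GCC rises above 1/3
print(f"GCC = {green_chromatic_coordinate(img):.3f}")
```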
Abstract:
The article addresses the design of the product-service combination from the standpoint of information integration. The author explains fundamental differences between the traditional and the modern concept of operations management. In addition, the role of logistic support analysis is considered. The article presents the concept of CALS (Continuous Acquisition and Life cycle Support), which serves as an environment enabling data sharing among the business partners involved in the development process.
Abstract:
This article provides a legal and economic analysis of private copying levies in the EU, against the background of the Copyright Directive (2001/29), a number of recent rulings by the European Court of Justice and the recommendations presented by mediator Vitorino earlier this year. It concludes that notwithstanding these rulings and recommendations, there remains a lack of concordance on the relevance of contractual stipulations and digital rights management technologies (DRM) for setting levies, and on the concept of harm. While Mr Vitorino and AG Sharpston (in the Opinion preceding VG Wort v. Kyocera) use different lines of reasoning to argue that levies raised on authorised copies would lead to double payment, the Court of Justice's decision in VG Wort v. Kyocera seems to conclude that such copies should nonetheless be levied. If levies are to provide fair compensation for harm resulting from acts of private copying, economic analysis suggests one should distinguish between various kinds of private copies and take account of the extent to which the value said copies have for consumers can be priced into the purchase. Given the availability of DRM (including technical protection measures), the possibility of such indirect appropriation leads to the conclusion that the harm from most kinds of private copies is de minimis and gives no cause for levies. The user value of copies from unauthorised sources (e.g. from torrent networks or cyber lockers), on the other hand, cannot be appropriated indirectly by rightholders. It is, however, an open question in references for preliminary rulings pending at the Court of Justice whether these copies are included in the scope of the private copying exception or limitation and can thus be levied for. If they are not, as currently happens in several EU Member States, legal and economic analysis leads to the conclusion that the scope of private copying acts giving rise to harm susceptible of justifying levies is gradually diminishing.
Abstract:
In light of the recent European Court of Justice ruling (ECJ C-131/12, Google Spain v Spanish Data Protection Agency), the "right to be forgotten" has once again gained worldwide media attention. Already in 2012, when the European Commission proposed a right to be forgotten, this proposal received broad public interest and was debated intensively. Under certain conditions, individuals should thereby be able to delete personal data concerning them. More recently – in light of the European Parliament's approval of the LIBE Committee's amendments on March 14, 2014 – the concept seems to be close to its final form. Although it remains, for the most part, unchanged from the previously circulated drafts, it has been re-labelled as a "right of erasure". This article argues that, despite its catchy terminology, the right to be forgotten can be understood as a generic term, bringing together existing legal provisions: the substantial right of oblivion and the rather procedural right to erasure derived from data protection. Hereinafter, the article presents an analysis of selected national legal frameworks and corresponding case law, accounting for data protection, privacy, and general tort law as well as defamation law. This comparative analysis grasps the practical challenges faced by the attempt to strengthen individual control and informational self-determination. Consequently, it is argued that narrowing the focus to the data protection law amendments neglects the elaborate balancing of conflicting interests in the European legal tradition. It is shown that the attempt to implement oblivion, erasure and forgetting in the digital age is a complex undertaking.
Abstract:
The purpose of the article is to provide, first, a doctrinal summary of the concept, rules and policy of exhaustion, on the international and EU level and, later, under the law of the United States. Based upon this introduction, the paper turns to an analysis of the doctrine in the pioneering court decisions handed down in UsedSoft, ReDigi, the German e-book/audiobook cases, and the pending Tom Kabinet case from the Netherlands. Questions related to the licence-versus-sale dichotomy; the so-called umbrella solution; the "new copy theory"; the migration of digital copies via the internet; the forward-and-delete technology; the issue of lex specialis; and the theory of functional equivalence are covered later on. The author stresses that the answers given by the respective judges in the referred cases are not the final stop in the discussion. The UsedSoft preliminary ruling and the subsequent German domestic decisions highlight a special treatment for computer programs. On the other hand, the refusal of digital exhaustion in ReDigi and the audiobook/e-book cases might be in accordance with the present wording of copyright law; however, it does not necessarily reflect the proper trends of our age. The paper takes the position that the need for digital exhaustion is constantly growing in society and amongst businesses. Indeed, there are reasonable arguments in favour of equalizing the resale of works sold in tangible and intangible formats. Consequently, the paper urges the reconsideration of the norms on exhaustion at the international and EU level.
Abstract:
Nine Iowa State University veterinary medical students completed Standardized Performance Analysis (SPA) records on herds from Iowa, North Dakota and South Dakota. The Iowa herds were included in the SPA summary for Iowa, but the six North and South Dakota herds were summarized separately. These six herds had an average herd size of 371 cows and a financial return to capital, labor and management of $175 per cow. Total financial cost per cow averaged $286 for these herds, with a range of $211 to $388. Feed utilized averaged 4,442 pounds of dry matter per cow, and the average weight of calf produced per exposed female was 506 pounds.
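The per-cow figures above are simple ratios of herd totals. A minimal sketch of that arithmetic; the herd totals below are hypothetical, chosen only to reproduce the reported per-cow scale:

```python
# SPA-style per-cow metrics as ratios of herd totals (hypothetical numbers).
exposed_females = 371            # average herd size reported above
total_calf_weight_lb = 187_726   # total pounds of calf weaned (hypothetical)
total_cost_usd = 106_106.0       # total financial cost (hypothetical)
total_return_usd = 64_925.0      # return to capital, labor, management (hypothetical)

print(f"{total_calf_weight_lb / exposed_females:.0f} lb calf per exposed female")  # ~506
print(f"${total_cost_usd / exposed_females:.0f} cost per cow")                     # ~286
print(f"${total_return_usd / exposed_females:.0f} return per cow")                 # ~175
```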
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O'Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008, among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and access to the relevant methodological apparatus of related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars either to rely on a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) or to make no judgment at all (the unweighted phylogenetic approach). Some basis for judging the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for an empirical statistical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analyzing one or more stemma hypotheses against the variation model. We apply this method to three 'artificial traditions' (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced to varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding 'trivial' variation such as orthographic and spelling changes from stemmatic analysis.
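The variation model itself is not specified in the abstract, but a common building block in computational stemmatics is a genealogical-consistency test: for a given variant location, do the witnesses sharing a reading occupy a connected region of the stemma hypothesis? A minimal, simplified Python sketch (the stemma and readings are hypothetical, and real tests must also account for lost intermediate witnesses):

```python
# Simplified genealogical-consistency test of a reading against a stemma:
# the witnesses carrying one reading should form a connected subgraph.
import networkx as nx

def reading_is_consistent(stemma: nx.Graph, witnesses: set[str]) -> bool:
    """True if the witnesses sharing a reading occupy a connected
    region of the (undirected) stemma hypothesis."""
    return nx.is_connected(stemma.subgraph(witnesses))

# Hypothetical stemma: archetype A with two branches.
stemma = nx.Graph([("A", "B"), ("A", "C"), ("B", "D"), ("B", "E"), ("C", "F")])

print(reading_is_consistent(stemma, {"B", "D", "E"}))  # True: one branch
print(reading_is_consistent(stemma, {"D", "F"}))       # False: split across branches
```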
Abstract:
Aims: Arterial plaque rupture and thrombus characterise ST-elevation myocardial infarction (STEMI) and may aggravate delayed arterial healing following durable polymer drug-eluting stent (DP-DES) implantation. Biodegradable polymer (BP) may improve biocompatibility. We compared long-term outcomes in STEMI patients receiving BP-DES vs. durable polymer sirolimus-eluting stents (DP-SES). Methods and results: We pooled individual patient-level data from three randomised clinical trials (ISAR-TEST-3, ISAR-TEST-4 and LEADERS) comparing outcomes from BP-DES with DP-SES at four years. The primary endpoint (MACE) comprised cardiac death, MI, or target lesion revascularisation (TLR). Secondary endpoints were TLR, cardiac death or MI, and definite or probable stent thrombosis. Of 497 patients with STEMI, 291 received BP-DES and 206 DP-SES. At four years, MACE was significantly reduced following treatment with BP-DES (hazard ratio [HR] 0.59, 95% CI: 0.39-0.90; p=0.01) driven by reduced TLR (HR 0.54, 95% CI: 0.30-0.98; p=0.04). Trends towards reduction were seen for cardiac death or MI (HR 0.63, 95% CI: 0.37-1.05; p=0.07) and definite or probable stent thrombosis (3.6% vs. 7.1%; HR 0.49, 95% CI: 0.22-1.11; p=0.09). Conclusions: In STEMI, BP-DES demonstrated superior clinical outcomes to DP-SES at four years. Trends towards reduced cardiac death or myocardial infarction and reduced stent thrombosis require corroboration in specifically powered trials.
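Hazard ratios such as those reported are typically estimated with a Cox proportional hazards model on time-to-event data. A minimal sketch on simulated two-arm data (not the pooled trial data; the event rates and true effect size are assumptions):

```python
# Estimating a between-arm hazard ratio with a Cox model on simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 497                                        # pooled sample size, as reported
arm = rng.integers(0, 2, n)                    # 1 = BP-DES, 0 = DP-SES
hazard = 0.08 * np.where(arm == 1, 0.6, 1.0)   # assumed true HR of about 0.6
time = rng.exponential(1.0 / hazard)           # event times in years
observed = time < 4.0                          # four-year follow-up window

df = pd.DataFrame({"T": np.minimum(time, 4.0), "E": observed, "BP_DES": arm})
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.hazard_ratios_)  # estimated HR for BP_DES should be near 0.6
```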
Abstract:
The paper seeks a re-conceptualization of the global digital divide debate. It critically explores the predominant notion, its evolution and measurement, as well as the policies that have been advanced to bridge the digital divide. Acknowledging the complexity of this inequality, the paper aims to analyze the disparities beyond the connectivity and skills barriers. Without understating the first two digital divides, it is argued that as the Internet becomes more sophisticated and more integrated into economic, social and cultural processes, a 'third' generation of divides becomes critical. These divides are drawn not at the entry to the net but within the net itself, and they limit access to content. The increasing barriers to content, although of diverse nature, all relate to governance characteristics inherent in cyberspace, such as the global spillover of local decisions, regulation through code, or the proliferation of self- and co-regulatory models. It is maintained that as the practice of intervention intensifies in cyberspace, multiple and far-reaching points of control outside formal legal institutions are created, which threaten the availability of public goods and make the pursuit of public objectives difficult. This aspect is rarely addressed in global digital divide discussions, even in comprehensive analyses and political initiatives such as the World Summit on the Information Society. Yet the conceptualization of the digital divide as impeded access to content may be key to ensuring real participation and catering for the long-term implications of digital technologies.
Abstract:
BACKGROUND AND PURPOSE We report on workflow and process-based performance measures and their effect on clinical outcome in Solitaire FR Thrombectomy for Acute Revascularization (STAR), a multicenter, prospective, single-arm study of Solitaire FR thrombectomy in patients with large vessel anterior circulation stroke. METHODS Two hundred two patients were enrolled across 14 centers in Europe, Canada, and Australia. The following time intervals were measured: stroke onset to hospital arrival, hospital arrival to baseline imaging, baseline imaging to groin puncture, groin puncture to first stent deployment, and first stent deployment to reperfusion. Effects of time of day, general anesthesia use, and multimodal imaging on workflow were evaluated. Patient characteristics and workflow processes associated with prolonged interval times and good clinical outcome (90-day modified Rankin score, 0-2) were analyzed. RESULTS Median times were: onset of stroke to hospital arrival, 123 minutes (interquartile range, 163 minutes); hospital arrival to thrombolysis in cerebral infarction (TICI) 2b/3 or final digital subtraction angiography, 133 minutes (interquartile range, 99 minutes); and baseline imaging to groin puncture, 86 minutes (interquartile range, 24 minutes). Time from baseline imaging to puncture was prolonged in patients receiving intravenous tissue-type plasminogen activator (32-minute mean delay) and when magnetic resonance-based imaging was used at baseline (18-minute mean delay). Extracranial carotid disease delayed the puncture to first stent deployment time by 25 minutes on average. For each 1-hour increase in stroke onset to final digital subtraction angiography (or TICI 2b/3) time, the odds of good clinical outcome decreased by 38%. CONCLUSIONS Interval times in the STAR study reflect current intra-arterial therapy for patients with acute ischemic stroke. Improving workflow metrics can further improve clinical outcomes. CLINICAL TRIAL REGISTRATION: URL http://www.clinicaltrials.gov. Unique identifier: NCT01327989.
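A 38% decrease in the odds of good outcome per hour corresponds to a per-hour odds ratio of roughly 0.62, and under a constant per-hour odds ratio the effect compounds multiplicatively over longer delays. A minimal sketch of that interpretation:

```python
# Interpreting a per-hour odds ratio: the reported 38% decrease in the
# odds of good outcome per hour of delay implies an odds ratio of ~0.62.
per_hour_odds_ratio = 1.0 - 0.38

def odds_multiplier(hours: float) -> float:
    """Multiplicative change in the odds of good outcome (mRS 0-2)
    after a delay, assuming a constant per-hour odds ratio."""
    return per_hour_odds_ratio ** hours

for h in (1, 2, 3):
    print(f"{h} h delay -> odds x {odds_multiplier(h):.2f}")
# 1 h -> x0.62, 2 h -> x0.38, 3 h -> x0.24
```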