939 results for Digital analysis
Abstract:
This thesis is composed of three life-cycle analysis (LCA) studies of manufacturing to determine cumulative energy demand (CED) and greenhouse gas (GHG) emissions. The methods proposed could reduce the environmental impact of three manufacturing processes by reducing their CED. First, industrial symbiosis is proposed and an LCA is performed on conventional 1 GW-scaled hydrogenated amorphous silicon (a-Si:H)-based single junction and a-Si:H/microcrystalline-Si:H tandem cell solar PV manufacturing plants, and on such plants coupled to silane recycling plants. Using a recycling process that reduces silane loss from 85 to only 17 percent yields CED savings of 81,700 GJ and 290,000 GJ per year for single- and tandem-junction plants, respectively. This recycling process reduces the cost of raw silane by 68 percent, or approximately $22.6 and $79 million per year for a single and tandem 1 GW PV production facility, respectively. The results show environmental benefits of silane recycling centered around a-Si:H-based PV manufacturing plants. Second, an open-source self-replicating rapid prototyper or 3-D printer, the RepRap, has the potential to reduce the environmental impact of manufacturing polymer-based products using a distributed manufacturing paradigm, an impact that is further minimized by the use of PV and by improvements in PV manufacturing. Using 3-D printers for manufacturing provides the ability to ultra-customize products and to change fill composition, which increases material efficiency. An LCA was performed on three polymer-based products to determine the CED and GHG emissions from conventional large-scale production, and the results were compared to experimental measurements on a RepRap producing identical products in ABS and PLA. The results of this LCA study indicate that the CED of manufacturing polymer products can possibly be reduced using distributed manufacturing with existing 3-D printers at less than 89% fill, and reduced even further with a solar photovoltaic system. The results indicate that the ability of RepRaps to vary fill has the potential to diminish the environmental impact of many products. Third, one additional way to improve the environmental performance of this distributed manufacturing system is to create the polymer filament feedstock for 3-D printers using post-consumer plastic bottles. An LCA was performed on the recycling of high density polyethylene (HDPE) using the RecycleBot. The results of the LCA showed that distributed recycling has a lower CED than the best-case scenario used for centralized recycling. If this process were applied to the HDPE currently recycled in the U.S., more than 100 million MJ of energy could be conserved per annum, along with significant reductions in GHG emissions. This presents a novel path to a future of distributed manufacturing suited for both the developed and developing world with reduced environmental impact. From improving manufacturing in the photovoltaic industry with the use of recycling, to recycling and manufacturing plastic products within our own homes, each step reduces the impact on the environment. The three coupled projects presented here show a clear potential to reduce the environmental impact of manufacturing and other processes by implementing complementary systems, each with environmental benefits of its own, to achieve a compounding effect of reduced CED and GHG emissions.
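A minimal sketch of the comparison logic behind the silane figures quoted above, in Python; only the 17 versus 85 percent loss rates are taken from the abstract, while the per-plant silane demand and the embodied energy of silane are placeholder assumptions for illustration:

# Hypothetical illustration of the silane-recycling comparison described above.
# Only the loss rates (85% conventional vs. 17% with recycling) come from the
# abstract; the silane demand and embodied energy below are assumed placeholders.

SILANE_CED_MJ_PER_KG = 2000.0        # assumed cumulative energy demand of silane (MJ/kg)
ANNUAL_SILANE_DEPOSITED_KG = 5.0e5   # assumed silane incorporated into films per year (kg)

def silane_purchased(deposited_kg: float, loss_fraction: float) -> float:
    """Silane that must be purchased for `deposited_kg` to end up in the deposited films."""
    return deposited_kg / (1.0 - loss_fraction)

def annual_silane_ced_gj(deposited_kg: float, loss_fraction: float) -> float:
    """Cumulative energy demand embodied in the purchased silane, in GJ per year."""
    return silane_purchased(deposited_kg, loss_fraction) * SILANE_CED_MJ_PER_KG / 1000.0

conventional = annual_silane_ced_gj(ANNUAL_SILANE_DEPOSITED_KG, 0.85)
with_recycling = annual_silane_ced_gj(ANNUAL_SILANE_DEPOSITED_KG, 0.17)
print(f"CED savings from silane recycling: {conventional - with_recycling:,.0f} GJ/yr")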
Abstract:
Volcán Pacaya is one of three currently active volcanoes in Guatemala. Volcanic activity originates from the local tectonic subduction of the Cocos plate beneath the Caribbean plate along the Pacific Guatemalan coast. Pacaya is characterized by generally strombolian-type activity with occasional larger vulcanian-type eruptions approximately every ten years. One particularly large eruption occurred on May 27, 2010. Using GPS data collected for approximately eight years before this eruption and data from an additional three years of collection afterwards, surface movement covering the period of the eruption can be measured and used as a tool to help understand activity at the volcano. Initial positions were obtained from raw data using the Automatic Precise Positioning Service provided by the NASA Jet Propulsion Laboratory. Forward modeling of observed 3-D displacements for three time periods (before, covering, and after the May 2010 eruption) revealed that a plausible source for deformation is related to a vertical dike or planar surface trending NNW-SSE through the cone. For the three distinct time periods the best-fitting models describe the deformation of the volcano as follows: 0.45 m right-lateral movement and 0.55 m tensile opening along the dike mentioned above from October 2001 through January 2009 (pre-eruption); 0.55 m left-lateral slip along the dike from January 2009 through January 2011 (covering the eruption); and -0.025 m dip slip along the dike from January 2011 through March 2013 (post-eruption). In all best-fit models the dike is oriented with a 75° westward dip. These models have respective RMS misfit values of 5.49 cm, 12.38 cm and 6.90 cm for each modeled period. During the time period that includes the eruption, the volcano most likely experienced a combination of slip and inflation below the edifice, which created a large scar at the surface down the northern flank of the volcano. All models suggest that a dipping dike may be experiencing a combination of inflation and oblique slip below the edifice, which increases the possibility of a westward collapse in the future.
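As a minimal sketch of how the reported RMS misfits can be computed, the following Python snippet compares observed and modeled 3-D station displacements; the station values are hypothetical and the forward dislocation model itself (e.g. an Okada-type solution) is not reproduced here:

import numpy as np

def rms_misfit(observed: np.ndarray, modeled: np.ndarray) -> float:
    """Root-mean-square misfit over all stations and all three components (E, N, U).
    Both arrays have shape (n_stations, 3) and hold displacements in metres."""
    residuals = observed - modeled
    return float(np.sqrt(np.mean(residuals ** 2)))

# Hypothetical example: three GPS stations, displacements in metres.
observed = np.array([[0.012, -0.034, 0.005],
                     [-0.008, 0.021, -0.003],
                     [0.030, -0.010, 0.007]])
modeled = np.array([[0.010, -0.030, 0.004],
                    [-0.006, 0.018, -0.002],
                    [0.028, -0.012, 0.006]])

print(f"RMS misfit: {rms_misfit(observed, modeled) * 100:.2f} cm")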
Abstract:
Spectrographic analysis of limestones as a possible method of correlation of geologic formations is an altogether new line of investigation. As far as is known, the only previous work consists of a few analyses made by Fred Lines in his bachelor's thesis work at the Montana School of Mines in the spring of 1942.
Abstract:
Stable Isotope Ratio Analysis (SIRA) is the measurement of variation in different isotopes of the same elements in a material. This technique is well established in the natural sciences and has long been part of the methodological arsenal in fields such as geology and biology. More recently, this technique has begun to be utilized in the social sciences, moving from initial applications in anthropology to potential uses in geography, public health, forensic science, and others. This presentation will discuss the techniques behind SIRA, examples of current applications in the natural and social sciences, and potential avenues of future research.
Abstract:
Vegetation phenology is an important indicator of climate change and climate variability, and it is strongly connected to biospheric–atmospheric gas exchange. We aimed to evaluate the applicability of phenological information derived from digital imagery for the interpretation of CO2 exchange measurements. For the years 2005–2007 we analyzed the seasonal phenological development of two temperate mixed forests using tower-based imagery from standard RGB cameras. Phenological information was jointly analyzed with gross primary productivity (GPP) derived from net ecosystem exchange data. Automated image analysis provided reliable information on the vegetation developmental stages of beech and ash trees covering all seasons. A phenological index derived from image color values was strongly correlated with GPP, with a significant mean time lag of several days for ash trees and several weeks for beech trees in early summer (May to mid-July). Leaf emergence dates for the dominant tree species partly explained the temporal behaviour of spring GPP but were also masked by local meteorological conditions. We conclude that digital cameras at flux measurement sites not only provide an objective measure of the physiological state of a forest canopy at high temporal and spatial resolutions, but also complement CO2 and water exchange measurements, improving our knowledge of ecosystem processes.
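A minimal sketch, in Python, of one common way to turn RGB camera imagery into a phenological index and relate it to GPP; the green chromatic coordinate and the short example series below are assumptions for illustration, since the abstract does not state which colour index was used:

import numpy as np

def green_chromatic_coordinate(rgb_image: np.ndarray) -> float:
    """Mean green chromatic coordinate G/(R+G+B) over a region-of-interest image.
    `rgb_image` is an (H, W, 3) array of digital numbers from a standard RGB camera."""
    r, g, b = (rgb_image[..., i].astype(float) for i in range(3))
    total = r + g + b
    total[total == 0] = np.nan  # avoid division by zero in completely dark pixels
    return float(np.nanmean(g / total))

# Hypothetical daily series: one index value per image and a matching GPP estimate.
gcc = np.array([0.33, 0.35, 0.38, 0.41, 0.43, 0.44])
gpp = np.array([2.1, 3.0, 4.6, 6.2, 7.5, 7.9])   # e.g. g C m-2 d-1 from flux data

# Pearson correlation between the phenological index and GPP.
r = np.corrcoef(gcc, gpp)[0, 1]
print(f"Correlation between greenness index and GPP: r = {r:.2f}")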
Abstract:
The article addresses the design of product-service combinations from the standpoint of information integration. The author explains fundamental differences between the traditional and the modern concept of operations management. In addition, the role of logistic support analysis is considered. The article presents the concept of CALS (Continuous Acquisition and Life cycle Support) as an environment that enables data sharing among the business partners involved in the development process.
Abstract:
This article provides a legal and economic analysis of private copying levies in the EU, against the background of the Copyright Directive (2001/29), a number of recent rulings by the European Court of Justice and the recommendations presented by mediator Vitorino earlier this year. It concludes that notwithstanding these rulings and recommendations, there remains a lack of concordance on the relevance of contractual stipulations and digital rights management technologies (DRM) for setting levies, and the concept of harm. While Mr Vitorino and AG Sharpston (in the Opinion preceding VG Wort v. Kyocera) use different lines of reasoning to argue that levies raised on authorised copies would lead to double payment, the Court of Justice’s decision in VG Wort v. Kyocera seems to conclude that such copies should nonetheless be levied. If levies are to provide fair compensation for harm resulting from acts of private copying, economic analysis suggests one should distinguish between various kinds of private copies and take account of the extent to which the value said copies have for consumers can be priced into the purchase. Given the availability of DRM (including technical protection measures), the possibility of such indirect appropriation leads to the conclusion that the harm from most kinds of private copies is de minimis and gives no cause for levies. The user value of copies from unauthorised sources (e.g. from torrent networks or cyber lockers), on the other hand, cannot be appropriated indirectly by rightholders. It is, however, an open question in references for preliminary rulings pending at the Court of Justice whether these copies are included in the scope of the private copying exception or limitation and can thus be levied for. If they are not, as currently happens in several EU Member States, legal and economic analysis leads to the conclusion that the scope of private copying acts giving rise to harm susceptible of justifying levies is gradually diminishing.
Abstract:
In light of the recent European Court of Justice ruling (ECJ C-131/12, Google Spain v Spanish Data Protection Agency), the “right to be forgotten” has once again gained worldwide media attention. Already in 2012, when the European Commission proposed a right to be forgotten, this proposal received broad public interest and was debated intensively. Under certain conditions, individuals should thereby be able to delete personal data concerning them. More recently – in light of the European Parliament’s approval of the LIBE Committee’s amendments on March 14, 2014 – the concept seems to be close to its final form. Although it remains, for the most part, unchanged from the previously circulated drafts, it has been re-labelled as a “right of erasure”. This article argues that, despite its catchy terminology, the right to be forgotten can be understood as a generic term, bringing together existing legal provisions: the substantial right of oblivion and the rather procedural right to erasure derived from data protection. Hereinafter, the article presents an analysis of selected national legal frameworks and corresponding case law, accounting for data protection, privacy, and general tort law as well as defamation law. This comparative analysis grasps the practical challenges which the attempt to strengthen individual control and informational self-determination faces. Consequently, it is argued that narrowing the focus on the data protection law amendments neglects the elaborate balancing of conflicting interests in European legal tradition. It is shown that the attempt to implement oblivion, erasure and forgetting in the digital age is a complex undertaking.
Abstract:
The purpose of the article is to provide a doctrinal summary of the concept, rules and policy of exhaustion, first on the international and EU level, and later under the law of the United States. Based upon this introduction, the paper turns to the analysis of the doctrine in the pioneering court decisions handed down in UsedSoft, ReDigi, the German e-book/audio book cases, and the pending Tom Kabinet case from the Netherlands. Questions related to the licence versus sale dichotomy; the so-called umbrella solution; the “new copy theory”; migration of digital copies via the internet; the forward-and-delete technology; the issue of lex specialis and the theory of functional equivalence are covered later on. The author of the present article stresses that the answers given by the respective judges in the referred cases are not the final stop in the discussion. The UsedSoft preliminary ruling and the subsequent German domestic decisions highlight a special treatment for computer programs. On the other hand, the refusal of digital exhaustion in the ReDigi and the audio book/e-book cases might be in accordance with the present wording of copyright law; however, they do not necessarily reflect the proper trends of our age. The paper takes the position that the need for digital exhaustion is constantly growing in society and amongst businesses. Indeed, there are reasonable arguments in favour of equalizing the resale of works sold in tangible and intangible format. Consequently, the paper urges the reconsideration of the norms on exhaustion on the international and EU level.
Abstract:
Nine Iowa State University veterinary medical students completed SPA records on herds from Iowa, North Dakota and South Dakota. The Iowa herds were included in the SPA summary for Iowa, but the six North and South Dakota herds were summarized separately. These six herds had an average herd size of 371 cows and had a financial return to capital, labor and management of $175 per cow. Total financial cost per cow averaged $286 for these herds with a range of $211 to $388. Feed utilized averaged 4,442 pounds of dry matter per cow and the average pounds of calf produced per exposed female was 506 pounds.
Abstract:
Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O’Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008 among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and allowing access to relevant methodological developments in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between making a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) and making no judgment at all (the unweighted phylogenetic approach). Some basis for judgment of the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for a statistical empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analysis of one or more stemma hypotheses against the variation model. We apply this method to three ‘artificial traditions’ (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced in varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding ‘trivial’ variation such as orthographic and spelling changes from stemmatic analysis.
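As a minimal sketch of the kind of calculation such a model supports, the following Python snippet tests whether the witnesses attesting a variant reading form a connected subtree of a stemma hypothesis (a simple genealogical-coherence check); the stemma and readings below are hypothetical and are not taken from the article:

# Hypothetical stemma: parent -> children (an arbitrary example for illustration).
stemma = {
    "archetype": ["A", "B"],
    "A": ["C", "D"],
    "B": ["E"],
    "C": [],
    "D": [],
    "E": [],
}

def is_genealogical(reading_witnesses: set[str], stemma: dict[str, list[str]]) -> bool:
    """True if the witnesses attesting a reading form a connected subtree of the stemma,
    i.e. the reading could have arisen once and then been inherited along the tree."""
    # Build an undirected adjacency map of the stemma.
    adjacency: dict[str, set[str]] = {}
    for parent, children in stemma.items():
        adjacency.setdefault(parent, set())
        for child in children:
            adjacency.setdefault(child, set())
            adjacency[parent].add(child)
            adjacency[child].add(parent)
    # Traverse the stemma, but only through nodes that attest the reading.
    start = next(iter(reading_witnesses))
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for neighbour in adjacency[node]:
            if neighbour in reading_witnesses and neighbour not in seen:
                seen.add(neighbour)
                frontier.append(neighbour)
    return seen == reading_witnesses

print(is_genealogical({"A", "C", "D"}, stemma))   # True: a connected subtree
print(is_genealogical({"C", "E"}, stemma))        # False: coincident variation or contamination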
Abstract:
Aims: Arterial plaque rupture and thrombus characterise ST-elevation myocardial infarction (STEMI) and may aggravate delayed arterial healing following durable polymer drug-eluting stent (DP-DES) implantation. Biodegradable polymer (BP) may improve biocompatibility. We compared long-term outcomes in STEMI patients receiving BP-DES vs. durable polymer sirolimus-eluting stents (DP-SES). Methods and results: We pooled individual patient-level data from three randomised clinical trials (ISAR-TEST-3, ISAR-TEST-4 and LEADERS) comparing outcomes from BP-DES with DP-SES at four years. The primary endpoint (MACE) comprised cardiac death, MI, or target lesion revascularisation (TLR). Secondary endpoints were TLR, cardiac death or MI, and definite or probable stent thrombosis. Of 497 patients with STEMI, 291 received BP-DES and 206 DP-SES. At four years, MACE was significantly reduced following treatment with BP-DES (hazard ratio [HR] 0.59, 95% CI: 0.39-0.90; p=0.01) driven by reduced TLR (HR 0.54, 95% CI: 0.30-0.98; p=0.04). Trends towards reduction were seen for cardiac death or MI (HR 0.63, 95% CI: 0.37-1.05; p=0.07) and definite or probable stent thrombosis (3.6% vs. 7.1%; HR 0.49, 95% CI: 0.22-1.11; p=0.09). Conclusions: In STEMI, BP-DES demonstrated superior clinical outcomes to DP-SES at four years. Trends towards reduced cardiac death or myocardial infarction and reduced stent thrombosis require corroboration in specifically powered trials.
Abstract:
The paper seeks a re-conceptualization of the global digital divide debate. It critically explores the predominant notion, its evolution and measurement, as well as the policies that have been advanced to bridge the digital divide. Acknowledging the complexity of this inequality, the paper aims at analyzing the disparities beyond the connectivity and the skills barriers. Without understating the first two digital divides, it is argued that as the Internet becomes more sophisticated and more integrated into economic, social and cultural processes, a ‘third’ generation of divides becomes critical. These divides are drawn not at the entry to the net but within the net itself, and limit access to content. The increasing barriers to content, although of diverse nature, all relate to some governance characteristics inherent in cyberspace, such as global spillover of local decisions, regulation through code or proliferation of self- and co-regulatory models. It is maintained that as the practice of intervention intensifies in cyberspace, multiple and far-reaching points of control outside formal legal institutions are created, which threaten the availability of public goods and make the pursuit of public objectives difficult. This is an aspect that is rarely addressed in the global digital divide discussions, even in comprehensive analysis and political initiatives such as the World Summit on the Information Society. Yet, the conceptualization of the digital divide as impeded access to content may be key in terms of ensuring real participation and catering for the long-term implications of digital technologies.