909 results for Large-volume Quartz Latites
Abstract:
Large igneous provinces (LIPs) are sites of the most frequently recurring, largest volume basaltic and silicic eruptions in Earth history. These large-volume (>1000 km³ dense rock equivalent) and large-magnitude (>M8) eruptions produce areally extensive (10⁴–10⁵ km²) basaltic lava flow fields and silicic ignimbrites that are the main building blocks of LIPs. Available information on the largest eruptive units comes primarily from the Columbia River and Deccan provinces for the dimensions of flood basalt eruptions, and the Paraná–Etendeka and Afro-Arabian provinces for the silicic ignimbrite eruptions. In addition, three large-volume (675–2000 km³) silicic lava flows have been mapped out in the Proterozoic Gawler Range province (Australia), an interpreted LIP remnant. Magma volumes of >1000 km³ have also been emplaced as high-level basaltic and rhyolitic sills in LIPs. The data sets indicate comparable eruption magnitudes between the basaltic and silicic eruptions, but because considerable volumes reside as co-ignimbrite ash deposits, the current volume constraints for the silicic ignimbrite eruptions may be considerably underestimated. Magma composition thus appears to be no barrier to the volume of magma emitted during an individual eruption. Despite this general similarity in magnitude, flood basaltic and silicic eruptions differ greatly in eruption style, duration, intensity, vent configuration, and emplacement style. Flood basaltic eruptions are dominantly effusive and Hawaiian–Strombolian in style, with magma discharge rates of ~10⁶–10⁸ kg s⁻¹ and eruption durations estimated at years to tens of years, emplacing dominantly compound pahoehoe lava flow fields. Effusive and fissural eruptions have also emplaced some large-volume silicic lavas, but discharge rates are unknown and may need to be up to an order of magnitude greater than those of flood basalt lava eruptions for emplacement to occur on realistic time scales (<10 years). Most silicic eruptions, however, are moderately to highly explosive, producing co-current pyroclastic fountains (rarely Plinian) with discharge rates of 10⁹–10¹¹ kg s⁻¹ that emplace welded to rheomorphic ignimbrites. At present, durations for the large-magnitude silicic eruptions are unconstrained; at discharge rates of 10⁹ kg s⁻¹, equivalent to the peak of the 1991 Mt Pinatubo eruption, the largest silicic eruptions would take many months to evacuate >5000 km³ of magma. The generally simple deposit structure is more suggestive of short-duration (hours to days), high-intensity (~10¹¹ kg s⁻¹) eruptions, perhaps with hiatuses in some cases. These extreme discharge rates would be facilitated by multiple point, fissure, and/or ring-fracture venting of magma. Eruption frequencies are much elevated for large-magnitude eruptions of both magma types during LIP-forming episodes. However, in basalt-dominated provinces (continental and ocean basin flood basalt provinces, oceanic plateaus, volcanic rifted margins), large-magnitude (>M8) basaltic eruptions have much shorter recurrence intervals of 10³–10⁴ years, whereas similar-magnitude silicic eruptions may have recurrence intervals of up to 10⁵ years. The Paraná–Etendeka province was the site of at least nine >M8 silicic eruptions over an ~1 Myr period at ~132 Ma; a similar eruption frequency, although with fewer silicic eruptions, is also observed for the Afro-Arabian province.
The huge volumes of basaltic and silicic magma erupted in quick succession during LIP events raise several unresolved issues in terms of the locus of magma generation, storage (if any) in the crust prior to eruption, and the paths and rates of ascent from magma reservoirs to the surface.
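To make the duration arithmetic above concrete, the sketch below converts an erupted volume and a mass discharge rate into a duration. The dense-rock-equivalent density (~2300 kg/m³ for silicic magma) is an assumed illustrative value, not a figure from the abstract.

```python
# Sketch: eruption duration from erupted volume and mass discharge rate.
# The DRE density below is an assumed illustrative value for silicic magma.

DENSITY_DRE = 2300.0  # kg per m^3, assumed dense-rock-equivalent density

def eruption_duration_days(volume_km3: float, discharge_kg_per_s: float) -> float:
    """Return eruption duration in days for a given DRE volume and discharge rate."""
    volume_m3 = volume_km3 * 1e9            # 1 km^3 = 1e9 m^3
    mass_kg = volume_m3 * DENSITY_DRE
    seconds = mass_kg / discharge_kg_per_s
    return seconds / 86400.0

# >5000 km^3 at the Pinatubo-peak rate of 1e9 kg/s -> roughly 130 days ("many months")
print(eruption_duration_days(5000, 1e9))    # ~133
# The same volume at ~1e11 kg/s -> roughly 1.3 days ("hours to days")
print(eruption_duration_days(5000, 1e11))   # ~1.3
```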
Abstract:
In today's information society, electronic tools, such as computer networks for the rapid transfer of data and composite databases for information storage and management, are critical in ensuring effective environmental management. In particular, environmental policies and programs of federal, state, and local governments need a large volume of up-to-date information on the quality of water, air, and soil in order to conserve and protect natural resources and to carry out meteorological work. In line with this, the utilization of information and communication technologies (ICTs) is crucial to preserving and improving the quality of life. Handling tasks in the field of environmental protection often requires a range of environmental and technical information to support complex, mutual decision making in a multidisciplinary team environment. In this regard, e-government provides the foundation of a transformative ICT initiative which can lead to better environmental governance, better services, and increased public participation in environmental decision-making processes.
Abstract:
Variable Speed Limits (VSL) is an Intelligent Transportation Systems (ITS) control tool which can enhance traffic safety and which has the potential to contribute to traffic efficiency. Queensland's motorways experience a large volume of commuter traffic in peak periods, leading to heavy recurrent congestion and a high frequency of incidents. Consequently, Queensland's Department of Transport and Main Roads has considered deploying VSL to improve safety and efficiency. This paper identifies three types of VSL and three applicable conditions for activating VSL on Queensland motorways: high flow, queuing, and adverse weather. The design objectives and methodology for each condition are analysed, and micro-simulation results are presented to demonstrate the effectiveness of VSL.
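As a rough illustration of how condition-based activation might work, the sketch below selects a speed limit from detector measurements for the three conditions the paper names. All thresholds, condition tests, and returned limits are illustrative assumptions, not the paper's design values.

```python
# Sketch: rule-based VSL activation for the three conditions named in the
# abstract (high flow, queuing, adverse weather). All thresholds and limits
# are illustrative assumptions.

DEFAULT_LIMIT_KMH = 100

def select_speed_limit(flow_veh_per_h: float, avg_speed_kmh: float,
                       heavy_rain: bool) -> int:
    """Pick a speed limit from simple detector measurements."""
    if heavy_rain:              # adverse weather: slow all traffic uniformly
        return 80
    if avg_speed_kmh < 40:      # queuing: protect the back of the queue
        return 60
    if flow_veh_per_h > 1800:   # high flow: stabilise traffic near capacity
        return 90
    return DEFAULT_LIMIT_KMH

print(select_speed_limit(2000, 95, heavy_rain=False))  # -> 90
```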
Abstract:
Discovering proper search intents is a vital process for returning desired results, and it has been a consistently active research topic in information retrieval in recent years. Existing methods mainly rely on context-based mining, query expansion, and user profiling techniques, which still suffer from the ambiguity of search queries. In this paper, we introduce a novel ontology-based approach, built on a world knowledge base, to construct personalized ontologies for identifying adequate concept levels that match user search intents. An iterative mining algorithm is designed to evaluate potential intents level by level until the best result is reached. The proposed approach is evaluated on the large-volume RCV1 data set, and experimental results indicate a distinct improvement in top precision compared with baseline models.
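The level-by-level search described above might look like the following sketch, which descends an ontology tree and stops when a deeper concept level no longer improves the match score. The Concept structure and overlap score are hypothetical stand-ins for the paper's personalized ontology and evaluation measure.

```python
# Sketch: iterative, level-by-level evaluation of ontology concepts against a
# query, stopping when a deeper level no longer improves the match score.
# The Concept class and scoring function are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str
    terms: set
    children: list = field(default_factory=list)

def score(concept: Concept, query_terms: set) -> float:
    """Simple overlap score between a concept's terms and the query."""
    return len(concept.terms & query_terms) / max(len(concept.terms), 1)

def best_intent_level(root: Concept, query_terms: set) -> Concept:
    best, level = root, [root]
    while level:
        nxt = [c for parent in level for c in parent.children]
        if not nxt:
            break
        candidate = max(nxt, key=lambda c: score(c, query_terms))
        if score(candidate, query_terms) <= score(best, query_terms):
            break                 # a deeper level does not improve the match
        best, level = candidate, [candidate]
    return best

root = Concept("economy", {"economy"}, [
    Concept("finance", {"finance", "stocks"}),
    Concept("trade", {"trade", "exports"})])
print(best_intent_level(root, {"stocks", "finance"}).name)  # -> finance
```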
Abstract:
Across post-industrial societies worldwide, the creative industries are increasingly seen as a key economic driver. These industries - including fields as diverse as advertising, art, computer games, crafts, design, fashion, film, museums, music, performing arts, publishing, radio, theatre and TV - are built upon individual creativity and innovation and have the potential to create wealth and employment through the mechanism of intellectual property. Creative Industries: Critical Readings brings together the key writings - drawing on both journals and books - to present an authoritative and wide-ranging survey of this emerging field of study. The set is presented with an introduction and the writings are divided into four volumes, organized thematically: Volume 1: Concepts - focuses on the concept of creativity and the development of government and industry interest in creative industries; Volume 2: Economy - maps the role and function of creative industries in the economy at large; Volume 3: Organization - examines the ways in which creative institutions organize themselves; and Volume 4: Work - addresses issues of creative work, labour and careers. This major reference work will be invaluable to scholars in economics, cultural studies, sociology, media studies and organization studies.
Abstract:
The feasibility of ex vivo blood production is limited by both biological and engineering challenges. From an engineering perspective, these challenges include the significant volumes required to generate even a single unit of a blood product, as well as the correspondingly high protein consumption required for such large-volume cultures. Membrane bioreactors, such as hollow fiber bioreactors (HFBRs), enable cell densities approximately 100-fold greater than traditional culture systems and may therefore enable a significant reduction in culture working volumes. As cultured cells and larger molecules are retained within a fraction of the system volume via a semipermeable membrane, it may be possible to reduce protein consumption by limiting supplementation to only this fraction. Typically, HFBRs are complex perfusion systems with total volumes incompatible with bench-scale screening and optimization of stem cell-based cultures. In this article we describe the use of a simplified HFBR system to assess the feasibility of this technology for producing blood products from umbilical cord blood-derived CD34+ hematopoietic stem and progenitor cells (HSPCs). Unlike conventional HFBR systems used for protein manufacture, where cells are cultured in the extracapillary space, we have cultured cells in the intracapillary space, which is likely more compatible with the large-scale production of blood cell suspension cultures. Using this platform we direct HSPCs down the myeloid lineage, while targeting a 100-fold increase in cell density and the use of protein-free bulk medium. Our results demonstrate the potential of this system to deliver high cell densities, even in the absence of protein supplementation of the bulk medium.
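A back-of-envelope calculation illustrates why the 100-fold density increase matters. All numbers below (cells per unit, densities, intracapillary fraction) are assumed for illustration and are not taken from the study.

```python
# Sketch: effect of a 100-fold cell-density increase on culture volume, and
# of supplementing protein only in the cell-side fraction of an HFBR.
# All numbers are illustrative assumptions, not values from the study.

cells_needed = 2e12        # assumed cells for one unit of a blood product
flask_density = 1e6        # cells per mL in a traditional culture
hfbr_density = 1e8         # ~100-fold higher density in the hollow fibers

flask_volume_L = cells_needed / flask_density / 1000  # -> 2000 L
hfbr_volume_L = cells_needed / hfbr_density / 1000    # -> 20 L

# If protein is added only to the intracapillary fraction (say 10% of the
# system volume), consumption drops by another order of magnitude.
intracapillary_fraction = 0.1
protein_volume_L = hfbr_volume_L * intracapillary_fraction  # -> 2 L

print(flask_volume_L, hfbr_volume_L, protein_volume_L)
```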
Abstract:
Retrieving information from Twitter is always challenging due to its large volume, inconsistent writing, and noise. Most existing information retrieval (IR) and text mining methods focus on term-based approaches, but suffer from problems of term variation such as polysemy and synonymy. This problem worsens when such methods are applied to Twitter due to its length limit. Over the years, people have held the hypothesis that pattern-based methods should perform better than term-based methods because they provide more context, but few studies have been conducted to support this hypothesis, especially on Twitter. This paper presents an innovative framework to address the issue of performing IR on microblogs. The proposed framework discovers patterns in tweets as higher-level features and assigns weights to low-level features (i.e., terms) based on their distributions in the higher-level features. We present experimental results on the TREC11 microblog dataset, showing that our proposed approach significantly outperforms the term-based methods Okapi BM25 and TF-IDF as well as pattern-based methods in precision, recall, and F-measure.
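The idea of weighting terms by their distribution in higher-level patterns can be sketched as follows. The pattern miner (frequent term pairs) and the weighting formula are generic illustrations of the idea, not the paper's exact method.

```python
# Sketch: weighting terms by their distribution in higher-level patterns
# (frequent term co-occurrence sets). Illustrative scheme, not the paper's.

from collections import defaultdict
from itertools import combinations

tweets = [{"storm", "brisbane", "flood"},
          {"storm", "flood", "warning"},
          {"brisbane", "traffic"}]

# Mine simple "patterns": term pairs that co-occur in at least 2 tweets.
pair_counts = defaultdict(int)
for t in tweets:
    for pair in combinations(sorted(t), 2):
        pair_counts[pair] += 1
patterns = {p: c for p, c in pair_counts.items() if c >= 2}

# A term's weight aggregates the support of every pattern containing it.
term_weight = defaultdict(float)
for pattern, support in patterns.items():
    for term in pattern:
        term_weight[term] += support / len(patterns)

print(dict(term_weight))  # flood/storm gain weight; traffic stays at zero
```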
Abstract:
It is a big challenge to find useful associations in databases for user-specific needs. The essential issue is how to provide efficient methods for describing meaningful associations and pruning false or meaningless discoveries. One major obstacle is the overwhelmingly large volume of discovered patterns. This paper discusses an alternative approach, called multi-tier granule mining, to improve frequent association mining. Rather than using patterns, it uses granules to represent knowledge implicitly contained in databases. It also uses multi-tier structures and association mappings to represent association rules in terms of granules. Consequently, association rules can be quickly accessed, and meaningless association rules can be justified according to the association mappings. Moreover, the proposed structure is also a precise compression of patterns which can restore the original supports. The experimental results show that the proposed approach is promising.
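A minimal sketch of the granule idea: rows sharing the same attribute values are grouped into granules, and an association mapping between tiers lets rule supports and confidences be restored without enumerating all patterns. The toy schema below is an assumption for illustration, not the paper's data structures.

```python
# Sketch: representing a table as granules (groups of rows sharing the same
# attribute values) and linking granules across two tiers.

from collections import defaultdict

# Toy transactions: (condition attributes) -> decision attribute
rows = [("young", "city", "buys"),
        ("young", "city", "buys"),
        ("old", "rural", "skips"),
        ("young", "rural", "buys")]

# Tier 1: granules over the condition attributes, with support counts.
tier1 = defaultdict(int)
for age, area, _ in rows:
    tier1[(age, area)] += 1

# Association mapping: condition granule -> decision granule counts, from
# which rule supports and confidences can be restored directly.
mapping = defaultdict(lambda: defaultdict(int))
for age, area, decision in rows:
    mapping[(age, area)][decision] += 1

g = ("young", "city")
support = tier1[g]                           # 2
confidence = mapping[g]["buys"] / support    # 1.0
print(support, confidence)
```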
Abstract:
Cardiovascular diseases are a leading cause of death throughout the developed world. With the demand for donor hearts far exceeding the supply, a bridge-to-transplant or permanent solution is required. This is currently achieved with ventricular assist devices (VADs), which can be used to assist the left ventricle (LVAD), right ventricle (RVAD), or both ventricles simultaneously (BiVAD). Earlier generation VADs were large volume-displacement devices designed for temporary support until a donor heart was found. The latest generation of VADs uses rotary blood pump technology, which improves device lifetime and the quality of life of end-stage heart failure patients. VADs are connected to the heart and great vessels of the patient through specially designed tubes called cannulae. The inflow cannulae, which supply blood to the VAD, are usually attached to the left atrium or ventricle for LVAD support, and the right atrium or ventricle for RVAD support. Few studies have characterized the haemodynamic difference between the two cannulation sites, particularly with respect to rotary RVAD support. Inflow cannulae are usually made of metal or a semi-rigid polymer to prevent collapse under negative pressures. However, suction and subsequent collapse of the cannulated heart chamber can be a frequent occurrence, particularly with the relatively preload-insensitive rotary blood pumps. Suction events may be associated with endocardial damage, pump flow stoppages, and ventricular arrhythmias. While several VAD control strategies are under development, these usually rely on potentially inaccurate sensors or somewhat unreliable inferred data to estimate preload. Fixation of the inflow cannula is usually achieved by suturing the cannula, often via a felt sewing ring, to the cannulated chamber. This technique extends the time on cardiopulmonary bypass, which is associated with several postoperative complications. The overall objective of this thesis was to improve the placement and design of rotary LVAD and RVAD inflow cannulae to achieve enhanced haemodynamic performance, reduced incidence of suction events, reduced levels of postoperative bleeding, and a faster implantation procedure. Specific objectives were: (1) in-vitro evaluation of LVAD and RVAD inflow cannula placement; (2) design and in-vitro evaluation of a passive mechanism to reduce the potential for heart chamber suction; and (3) design and in-vitro evaluation of a novel suture-less cannula fixation device. In order to complete in-vitro evaluation of VAD inflow cannulae, a mock circulation loop (MCL) was developed to accurately replicate the haemodynamics of the human systemic and pulmonary circulations. Validation of the MCL's haemodynamic performance, including the form and magnitude of pressure, flow, and volume traces, was completed through comparison with patient data and the literature. The MCL was capable of reproducing almost any healthy or pathological condition, and provided a useful tool for evaluating VAD cannulation and other cardiovascular devices. The MCL was used to evaluate inflow cannula placement for rotary VAD support. Left and right atrial and ventricular cannulation sites were evaluated under conditions of mild and severe heart failure. With a view to long-term LVAD support in the severe left heart failure condition, left ventricular inflow cannulation was preferred due to improved LVAD efficiency and reduced potential for thrombus formation.
In the mild left heart failure condition, left atrial cannulation was preferred, to provide an improved platform for myocardial recovery. Similar trends were observed with RVAD support, although to a lesser degree due to the smaller difference between right atrial and ventricular pressures. A compliant inflow cannula to prevent suction events was then developed and evaluated in the MCL. As rotary LVAD or RVAD preload was reduced, suction events occurred in all instances with a rigid inflow cannula. Addition of the compliant segment eliminated suction events in all instances. This was due to passive restriction of the compliant segment as preload dropped, which increased the VAD circuit resistance and decreased the VAD flow rate. The compliant inflow cannula therefore acted as a passive flow control and anti-suction system in LVAD and RVAD support. A novel suture-less inflow cannula fixation device was then developed to reduce implantation time and postoperative bleeding. The fixation device was evaluated for LVAD and RVAD support in cadaveric animal and human hearts attached to a MCL. LVAD inflow cannulation was achieved in under two minutes with the suture-less fixation device. No leakage through the interface between the suture-less fixation device and the myocardium was noted. Continued development and in-vivo evaluation of this device may result in an improved inflow cannulation technique with the potential for off-bypass insertion. Continued development of this research, in particular the compliant inflow cannula and the suture-less inflow cannulation device, will result in improved postoperative outcomes, life span, and quality of life for end-stage heart failure patients.
Abstract:
Mass flows on volcanic islands generated by volcanic lava dome collapse and by larger-volume flank collapse can be highly dangerous locally and may generate tsunamis that threaten a wider area. It is therefore important to understand their frequency, emplacement dynamics, and relationship to volcanic eruption cycles. The best record of mass flow on volcanic islands may be found offshore, where most material is deposited and where intervening hemipelagic sediment aids dating. Here we analyze what is arguably the most comprehensive sediment core data set collected offshore from a volcanic island. The cores are located southeast of Montserrat, on which the Soufriere Hills volcano has been erupting since 1995. The cores provide a record of mass flow events during the last 110,000 years. Older mass flow deposits differ significantly from those generated by the repeated lava dome collapses observed since 1995. The oldest mass flow deposit originated through collapse of the basaltic South Soufriere Hills at 103-110 ka, some 20-30 ka after the eruptions that formed this volcanic center. A ∼1.8 km³ blocky debris avalanche deposit that extends from a chute in the island shelf records a particularly deep-seated failure. It likely formed from a collapse of almost equal amounts of volcanic edifice and coeval carbonate shelf, emplacing a mixed bioclastic-andesitic turbidite in a complex series of stages. This study illustrates how volcanic island growth and collapse have involved extensive, large-volume submarine mass flows of highly variable composition. Runout turbidites indicate that mass flows are emplaced either in multiple stages or as single events.
Abstract:
A significant amount of speech data is required to develop a robust speaker verification system, but it is difficult to find enough development speech to match all expected conditions. In this paper we introduce a new approach to Gaussian probabilistic linear discriminant analysis (GPLDA) that estimates reliable model parameters as a linearly weighted model, taking greater input from the large volume of available telephone data and proportionally smaller input from the limited microphone data. In comparison to a traditional pooled training approach, where the GPLDA model is trained over both telephone and microphone speech, this linear-weighted GPLDA approach is shown to provide better EER and DCF performance in microphone and mixed conditions in both the NIST 2008 and NIST 2010 evaluation corpora. Based upon these results, we believe that linear-weighted GPLDA provides a better approach than pooled GPLDA, allowing for further improvement of GPLDA speaker verification in conditions with limited development data.
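The contrast between pooled training and linear weighting can be sketched as follows. Weighting sample covariance matrices is a generic stand-in for the paper's GPLDA parameter estimation, and the weight value is an assumption.

```python
# Sketch: linearly weighting parameters estimated from two corpora instead of
# pooling the data. Illustrative stand-in for the paper's GPLDA update.

import numpy as np

rng = np.random.default_rng(0)
tel = rng.normal(size=(10000, 50))  # abundant telephone vectors (toy data)
mic = rng.normal(size=(500, 50))    # scarce microphone vectors (toy data)

def covariance(x: np.ndarray) -> np.ndarray:
    centered = x - x.mean(axis=0)
    return centered.T @ centered / len(x)

alpha = 0.7                         # assumed weight favouring telephone data
weighted_cov = alpha * covariance(tel) + (1 - alpha) * covariance(mic)

# Contrast with pooled training, which implicitly weights by corpus size and
# so lets the larger telephone corpus dominate the estimate entirely.
pooled_cov = covariance(np.vstack([tel, mic]))
```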
Abstract:
This research paper examines the potential of neighbourhood centres to generate and enhance social capital through their programs, activities, membership associations, and community engagement. Social capital is a complex concept involving elements of norms, networks, and trust, and is generally seen as enhancing community cohesion and the ability to attain common goals (outlined in more detail in Section 3). The aim of this research project is to describe the nature of social capital formation, in terms of development and change in norms, networks, and trust, within the context of the operations of neighbourhood centres in three Queensland locations (i.e., Sherwood, Kingston/Slacks Creek, and Maleny). The study was prompted by surprisingly little research into how neighbourhood centres and their clients contribute to the development of social capital. Considering the large volume of research on the role of community organisations in building social capital, it is remarkable that perhaps the most obvious organisation with 'social capitalist' intentions has received so little attention (apart from Bullen and Onyx, 2005). Indeed, ostensibly, neighbourhood centres are all about social capital.
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or an advantage for those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is, they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form, and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect and make the tweets available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and on how such approaches could be customized depending on the project stakeholders.
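As a sketch of the first paper's two filtering mechanisms, content scoring and user profiling, the snippet below combines keyword scores, an authority bonus, and an amplification term into a single relevance score. All keywords, weights, and handles are illustrative assumptions, not the panel's actual criteria.

```python
# Sketch: crisis-tweet filtering via content scoring plus user profiling.
# Keywords, weights, and the trusted-user list are illustrative assumptions.

CRISIS_KEYWORDS = {"flood": 2.0, "evacuate": 3.0, "trapped": 3.0, "road": 1.0}
AUTHORITATIVE_USERS = {"qld_ses", "bom_au"}   # hypothetical handles

def tweet_score(text: str, author: str, retweets: int) -> float:
    words = text.lower().split()              # naive tokenization for brevity
    content = sum(CRISIS_KEYWORDS.get(w, 0.0) for w in words)
    authority = 3.0 if author in AUTHORITATIVE_USERS else 0.0
    amplification = min(retweets / 100.0, 2.0)  # cap the retweet contribution
    return content + authority + amplification

def filter_for_responders(tweets: list, capacity: int) -> list:
    """Keep only what matches the expected capacity of emergency responders."""
    ranked = sorted(tweets, key=lambda t: tweet_score(*t), reverse=True)
    return ranked[:capacity]

tweets = [("evacuate now road under water", "qld_ses", 250),
          ("nice weather today", "someone", 1)]
print(filter_for_responders(tweets, capacity=1))
```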
Abstract:
Recent data have highlighted the association between the penetration of antiretrovirals into the central nervous system (CNS) and neurocognitive impairment in HIV-positive patients. Existing antiretrovirals have been ranked according to a score of neuropenetration, which was shown to be a predictor of anti-HIV activity in the CNS and of improvement of neurocognitive disorders [1]. The main factors affecting drug penetration are known to be protein binding, lipophilicity, and molecular weight [2]. Moreover, active transport by membrane transporters (such as P-glycoprotein) could be a key mechanism of passage [3]. The use of raltegravir (RGV), a novel antiretroviral drug targeted to inhibit the HIV pre-integration complex, is increasing worldwide due to its efficacy and tolerability. However, the penetration of RGV into the CNS has not yet been elucidated. In fact, prediction of RGV neuropenetration from molecular characteristics is controversial. Intermediate protein binding (83%) and a large volume of distribution (273 L) could suggest a high distribution beyond extracellular spaces [4]. On the contrary, low lipophilicity (oil/water partition coefficient at pH 7.4 of 2.80) and intermediate molecular weight (482.51 Da) suggest limited diffusion. Furthermore, in-vitro studies suggest that RGV is a substrate of P-glycoprotein, although this efflux pump has not been identified as significantly affecting plasma pharmacokinetics [5]. In any case, no data concerning RGV passage into the cerebrospinal fluid of animals or humans have yet been published.
Abstract:
In this paper, we have compiled and reviewed the most recent literature, published from January 2010 to December 2012, relating to the human exposure, environmental distribution, behaviour, fate, and concentration time trends of polybrominated diphenyl ether (PBDE) and hexabromocyclododecane (HBCD) flame retardants, in order to establish their current trends and priorities for future study. Due to the large volume of literature included, we have provided full details of the reviewed studies as Electronic Supplementary Information and here summarise the most relevant findings. Decreasing time trends for penta-mix PBDE congeners were seen for soils in northern Europe, sewage sludge in Sweden and the USA, carp from a US river, trout from three of the Great Lakes, and Arctic and UK marine mammals and many birds, but increasing time trends continue in Arctic polar bears and some birds at high trophic levels in northern Europe. This is a result of the time delay inherent in long-range atmospheric transport processes. In general, concentrations of BDE209 (the major component of the deca-mix PBDE product) are continuing to increase. Of major concern is the possible/likely debromination of the large reservoir of BDE209 in soils and sediments worldwide to yield lower brominated congeners, which are both more mobile and more toxic, and we have compiled the most recent evidence for the occurrence of this degradation process. Numerous studies reported here reinforce the importance of this future concern. Time trends for HBCDs are mixed, with both increases and decreases evident in different matrices and locations and, notably, with increasing occurrence in birds of prey.