935 results for Topologically Massive Yang-Mills
Abstract:
Case law report - online
Abstract:
De-inking sludge is a waste product generated at secondary fibre paper mills that manufacture recycled paper into new paper sheets; it refers to the solid residues which evolve during the de-inking stage of the paper pulping process. The current practice for the disposal of this waste is land-spreading, land-filling or incineration, all of which are unsustainable. This work has explored the intermediate pyrolysis of pre-conditioned de-inking sludge pellets in a recently patented 20 kg/h intermediate pyrolysis reactor (the Pyroformer). The reactor is essentially two co-axial screws configured in such a way as to circulate solids within the reactor and thus facilitate the cracking of tars. A potential application of the volatile organic vapours and permanent gases evolved would be to generate combined heat and power (CHP) at paper-making sites. The results show that de-inking sludge could be successfully pyrolysed, and the organic vapours produced were composed of a mixture of aromatic hydrocarbons, phenolic compounds and some fatty acid methyl esters, as detected by liquid GC-MS. The calorific value of the oil after condensing was between 36 and 37 MJ/kg, and the liquid fuel properties were also determined. Permanent gases were detected by GC-TCD and were composed of approximately 24% CO, 6% CH₄ and 70% CO₂ (v/v%). The solid residue from pyrolysis also retained a small residual calorific value and was composed mainly of calcium-based inert metal oxides. Applying intermediate pyrolysis to de-inking sludge for both CHP production and waste reduction is in principle a feasible technology which could be deployed at secondary fibre paper mills. © 2013 Elsevier B.V. All rights reserved.
Abstract:
Transition P-systems are based on biological membranes and try to emulate cell behavior and its evolution due to the presence of chemical elements. These systems perform computation through transitions between two consecutive configurations, each of which consists of an m-tuple of the multisets present at a given moment in the m regions of the system. Transition between two configurations is performed by using the evolution rules also present in each region. Among the main characteristics of Transition P-systems are massive parallelism and non-determinism. This work is part of a much larger project and seeks to design a hardware circuit that can remarkably improve the process involved in the evolution of a membrane. Processing in biological cells exhibits two different levels of parallelism: the first, obviously, is the evolution of each cell within the whole set, and the second is the application of the rules inside one membrane. This paper presents an evolution of the work done previously and includes an improvement that uses massive parallelism to perform the transition between two states. To achieve this, the initial set of rules is transformed into a new set consisting of all their possible combinations, and each combination is treated as a new rule (the antecedents of the participating rules are added to generate a new multiset). Applying a single such rule thus achieves parallelism, in the sense that several of the original rules are applied at the same time. In this paper, we present a circuit that is able to process this kind of rule and to decode the result, taking advantage of all the potential that hardware offers for implementing P-systems versus the previously proposed sequential solutions.
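The rule-combination step described above can be sketched in software. The following is a minimal illustration in Python, not the paper's circuit: rules are pairs of multisets, every non-empty combination of distinct rules is merged into a single composite rule by summing antecedents and consequents, and a composite rule is applicable when the region's multiset covers its antecedent. (Combinations with repetition, i.e. the same rule firing more than once, are omitted for brevity; the rule names and multisets below are invented.)

from collections import Counter
from itertools import combinations

# Each rule maps an antecedent multiset to a consequent multiset
# (hypothetical toy rules; a real system would load the region's rule set).
rules = {
    "r1": (Counter({"a": 1}), Counter({"b": 2})),
    "r2": (Counter({"a": 1, "b": 1}), Counter({"c": 1})),
    "r3": (Counter({"b": 2}), Counter({"d": 1})),
}

def combined_rules(rules):
    # Expand the rule set: every non-empty combination of distinct rules
    # becomes one composite rule whose antecedent/consequent is the
    # multiset sum of its members' antecedents/consequents.
    expanded = {}
    names = sorted(rules)
    for k in range(1, len(names) + 1):
        for combo in combinations(names, k):
            antecedent, consequent = Counter(), Counter()
            for name in combo:
                a, c = rules[name]
                antecedent += a
                consequent += c
            expanded["+".join(combo)] = (antecedent, consequent)
    return expanded

def applicable(region, rule):
    # A composite rule can fire iff the region's multiset covers its
    # antecedent, i.e. the combined antecedent is a sub-multiset of it.
    antecedent, _ = rule
    return all(region[symbol] >= n for symbol, n in antecedent.items())

region = Counter({"a": 2, "b": 2})
for name, rule in combined_rules(rules).items():
    if applicable(region, rule):
        print(name, "is applicable in one parallel step")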
Abstract:
2000 Mathematics Subject Classification: Primary 81R50, 16W50, 16S36, 16S37.
Abstract:
To evaluate the effectiveness of digital diabetic retinopathy screening in patients aged 90 years and over. Methods: This is a retrospective analysis of 200 randomly selected patients eligible for diabetic retinopathy screening aged 90 years and over within the Birmingham, Solihull, and Black Country Screening Programme. Results: One hundred and seventy-nine (90%) patients attended screening at least once. 133 (74%) were placed on annual screening after their first screen, of whom 59% had no detectable diabetic retinopathy; 38 (21%) were referred for ophthalmology clinical assessment, 36 for non-diabetic-retinopathy reasons and two for diabetic maculopathy. Cataract accounted for 50% of all referrals for ophthalmology clinical assessment. Of the 133 patients placed on annual screening, 93 (70%) were screened at least once more. In terms of level of diabetic retinopathy, assessability or other ocular pathologies, 8 improved, 51 remained stable, and 31 deteriorated. Of the latter, 19 patients were referred for ophthalmology clinical assessment, none of these for diabetic retinopathy. Conclusions: Screening provides opportunistic identification of important non-diabetic-retinopathy eye conditions. However, in view of the low identification rate of sight-threatening diabetic retinopathy in patients aged 90 years and over, and the current mission statement of the NHS Diabetic Eye Screening Programme, systematic annual diabetic retinopathy screening may not be justified in this age group, but might instead be performed in optometric practice.
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the development of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans is still relatively low. Data from 60 studies that investigated the response of a mix of organisms or of natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance, with considerably more data archived on calcification and primary production than on other processes, has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
Abstract:
The Ming deposit, Newfoundland Appalachians, is a metamorphosed (upper greenschist to lower amphibolite facies), Cambro-Ordovician, bimodal-mafic volcanogenic massive sulfide (VMS) deposit that consists of several, spatially-associated, elongated orebodies composed of stratabound semi-massive to massive sulfides and/or discordant sulfide stringers in a rhyodacitic footwall. Copper is the main commodity; however, the deposit contains precious metal-bearing zones with elevated Au grades. In this study, field observations, microscopy, and micro-analytical tools including electron microprobe, laser ablation inductively coupled plasma mass spectrometry, and secondary ion mass spectrometry were used to constrain the relative timing of precious metal emplacement, the physico-chemical conditions of hydrothermal fluid precipitation, and the sources of sulfur, precious metals, semi-metals and metals. The ore mineral assemblage is complex and indicates an intermediate sulfidation state. Pyrite and chalcopyrite are the dominant ore minerals with minor sphalerite and pyrrhotite, and trace galena, arsenopyrite and cubanite. Additional trace phases include tellurides, NiSb phases, sulfosalts, electrum, AgHg±Au alloys, and oxides. Silver phases and precious metals occur predominantly in semi-massive and massive sulfides as free grains, and as grains spatially associated with arsenopyrite and/or sulfosalts. Precious metal phases occurring between recrystallized pyrite and within cataclastic pyrite are rare. Hence, the complex ore assemblage and textures strongly suggest syngenetic precious metal emplacement, whereas metamorphism and deformation only internally and locally remobilized precious metal phases. The ore assemblage formed from reduced, acidic hydrothermal fluids over a range of temperatures (≈350°C to below 260°C). The abundance of telluride and Ag-bearing tetrahedrite, however, varies strongly between the different orebodies indicating variable ƒTe₂, ƒSe₂, mBi, and mSb within the hydrothermal fluids. The variations in the concentrations of semi-metals and metals (As, Bi, Hg, Sb, Se, Te), as well as Au and Ag, were due to variations in temperature but also to a likely contribution of magmatic fluids into the VMS hydrothermal system from presumably different geothermal reservoirs. Sulfur isotope studies indicate at least two sulfur sources: sulfur from thermochemically-reduced seawater sulfate and igneous sulfur. The source of igneous sulfur is the igneous footwall, direct magmatic fluid/volatiles, or both. Upper greenschist to lower amphibolite metamorphic conditions and deformation had no significant effect on the sulfur isotope composition of the sulfides at the Ming deposit.
Abstract:
This research is exploratory, bibliographic and qualitative in character. It is based on consolidated scientific arguments from cognitive theories inspired by the constructivist method and, from this perspective, proposes to develop a didactic guide aimed at students of MOOCs (Massive Open Online Courses) that will make it possible to maximize the use and assimilation of the knowledge available in these courses. It also intends to prepare these students in the practice of a storage methodology that ensures the knowledge acquired is neither lost nor forgotten over time. The theoretical framework, based on the theories of Meaningful Learning (Ausubel), Genetic Epistemology (Piaget), Socioconstructivism (Vygotsky) and Multimedia Learning (Mayer), supports the understanding of important concepts such as meaningful learning, prior knowledge and concept maps. It is further supported by the fundamental contribution of the Theory of Categories, whose concepts inter-relate with a teaching methodology based on structured knowledge maps in establishing the teaching-learning binomial, and by the valuable study carried out by the teachers Luciano Lima (UFU) and Rubens Barbosa Filho (UEMS), which culminated in the development of the Exponential Effective Memorization Method in Binary Base (Double MEB).
Abstract:
The accretion of minor satellites has been postulated as the most likely mechanism to explain the significant size evolution of massive galaxies over cosmic time. Using a sample of 629 massive (M_* ~ 10^11 M_⊙) galaxies from the near-infrared Palomar/DEEP-2 survey, we explore what fraction of these objects have satellites with 0.01 < M_sat/M_central < 1 (1:100) up to z = 1 and what fraction have satellites with 0.1 < M_sat/M_central < 1 (1:10) up to z = 2 within a projected radial distance of 100 kpc. We find that the fraction of massive galaxies with satellites, after background correction, remains basically constant and close to 30 per cent for satellites with a mass ratio down to 1:100 up to z = 1, and close to 15 per cent for satellites with a 1:10 mass ratio up to z = 2. The family of spheroid-like massive galaxies presents a 2–3 times larger fraction of objects with satellites than the group of disc-like massive galaxies. A crude estimate of the number of 1:3 mergers a massive spheroid-like galaxy has experienced since z ~ 2 is around 2. For a disc-like galaxy this number decreases to ~1.
Abstract:
We present measurements of the mean mid-infrared to submillimetre flux densities of massive (M_* ≳ 10^11 M_⊙) galaxies at redshifts 1.7 < z < 2.9, obtained by stacking positions of known objects taken from the GOODS NICMOS Survey (GNS) catalogue on maps at 24 μm (Spitzer/MIPS); 70, 100 and 160 μm (Herschel/PACS); 250, 350 and 500 μm (BLAST); and 870 μm (LABOCA). A modified blackbody spectrum fit to the stacked flux densities indicates a median [interquartile] star formation rate (SFR) of 63 [48, 81] M_⊙ yr^−1. We note that not properly accounting for correlations between bands when fitting stacked data can significantly bias the result. The galaxies are divided into two groups, disc-like and spheroid-like, according to their Sérsic indices, n. We find evidence that most of the star formation is occurring in n ≤ 2 (disc-like) galaxies, with median [interquartile] SFR = 122 [100, 150] M_⊙ yr^−1, while there are indications that the n > 2 (spheroid-like) population may be forming stars at a median [interquartile] SFR = 14 [9, 20] M_⊙ yr^−1, if at all. Finally, we show that star formation is a plausible mechanism for size evolution in this population as a whole, but find only marginal evidence that it is what drives the expansion of the spheroid-like galaxies.
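The point about band-to-band correlations can be made concrete with a small sketch (not the paper's pipeline; the fluxes, uncertainties and correlation structure below are invented for illustration): fitting a modified blackbody by minimizing a chi-square built from the full band covariance matrix rather than its diagonal.

import numpy as np
from scipy.optimize import minimize

h, k_B, c = 6.626e-34, 1.381e-23, 2.998e8  # SI constants

nu0 = c / 250e-6  # reference frequency (250 μm) keeps the amplitude near the data scale

def greybody(nu, amp, T, beta=1.5):
    # Optically thin modified blackbody, S_nu ∝ nu^beta * B_nu(T),
    # written against the reference frequency nu0.
    return amp * (nu / nu0) ** (3 + beta) / np.expm1(h * nu / (k_B * T))

# Hypothetical stacked flux densities (Jy) at six of the survey bands (μm).
wavelengths_um = np.array([100.0, 160.0, 250.0, 350.0, 500.0, 870.0])
nu = c / (wavelengths_um * 1e-6)
flux = np.array([0.9e-3, 2.1e-3, 2.4e-3, 1.9e-3, 1.1e-3, 0.25e-3])

# Confusion noise correlates neighbouring bands; dropping the off-diagonal
# terms of this matrix is exactly the bias discussed above.
sigma = 0.2e-3 * np.ones_like(flux)
corr = 0.5 ** np.abs(np.subtract.outer(np.arange(6), np.arange(6)))
cov_inv = np.linalg.inv(np.outer(sigma, sigma) * corr)

def chi2(params):
    r = flux - greybody(nu, *params)
    return r @ cov_inv @ r  # full-covariance chi-square

best = minimize(chi2, x0=[5e-3, 35.0], method="Nelder-Mead")
print("best-fit amplitude (Jy) and temperature (K):", best.x)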
Abstract:
During the development of the project I learned about Big Data, Android and MongoDB while helping to develop a system for predicting bipolar disorder crises through the massive analysis of information from diverse sources. Specifically, I carried out a theoretical study of NoSQL databases, Spark Streaming and neural networks, and then designed and configured a MongoDB database for the bipolar disorder project. I also learned about Android and designed and developed an Android mobile application to collect data for use as input to the crisis prediction system. Once development of the application was complete, I also carried out an evaluation with users.
Abstract:
Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interactions and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions, but also from the size of the information. The focus of this thesis is to provide statistical models that are scalable to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.
Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification for the Soufrière Hills volcano on the island of Montserrat. Chapter 3 addresses another problem with massive data, in which the number of observations of a function is large; an exact algorithm that is linear in time is developed for the problem of interpolating methylation levels. Chapters 4 and 5 both concern robust inference for the models. Chapter 4 provides a new robustness criterion for parameter estimation, and several inference approaches are shown to satisfy it. Chapter 5 develops a new prior that satisfies some further criteria and is thus proposed for use in practice.
Abstract:
Periods of drought and low streamflow can have profound impacts on both human and natural systems. People depend on a reliable source of water for numerous reasons, including potable water supply and the production of economic value through agriculture or energy production. Aquatic ecosystems depend on water in addition to the economic benefits they provide to society through ecosystem services. Given that periods of low streamflow may become more extreme and frequent in the future, it is important to study the factors that control water availability during these times. In the absence of precipitation, the slower hydrological response of groundwater systems plays an amplified role in water supply. Understanding the variability of the fraction of streamflow contributed by baseflow, or groundwater, during periods of drought provides insight into what future water availability may look like and how it can best be managed. The Mills River Basin in North Carolina is chosen as a case study to test this understanding. First, a physically meaningful estimate of baseflow is obtained from USGS streamflow data via computerized hydrograph-analysis techniques (a standard separation of this kind is sketched below). Then a method of time series analysis, including wavelet analysis, is applied to highlight signals of non-stationarity and to evaluate the changes in variance needed to better understand the natural variability of baseflow and low flows. In addition to natural variability, human influence must be taken into account in order to accurately assess how the combined system reacts to periods of low flow. Defining a combined demand that consists of both natural and human demand allows us to be more rigorous in assessing the level of sustainable use of a shared resource, in this case water. The analysis of baseflow variability can differ based on regional location and local hydrogeology, but it was found that baseflow varies on scales ranging from multiyear, such as those associated with ENSO (3.5, 7 years), up to multidecadal, with most of the contributing variance coming from decadal or multiyear scales. It was also found that the behavior of baseflow, and subsequently water availability, depends a great deal on overall precipitation, the tracks of hurricanes or tropical storms and associated climate indices, as well as physiography and hydrogeology. Evaluating and utilizing the Duke Combined Hydrology Model (DCHM), reasonably accurate estimates of streamflow during periods of low flow were obtained, in part due to the model's ability to capture subsurface processes. Being able to accurately simulate streamflow levels and subsurface interactions during periods of drought can be very valuable to water suppliers and decision makers, and ultimately impacts citizens. Knowledge of future droughts and periods of low flow, in addition to tracking customer demand, will allow for better management practices on the part of water suppliers, such as knowing when to withdraw more water during a surplus so that the level of stress on the system is minimized when there is not ample water supply.
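As a concrete illustration of the hydrograph-separation step referenced above, here is a minimal sketch of the widely used one-parameter recursive digital filter of Lyne and Hollick (1979); the filter parameter, number of passes and the toy discharge record are illustrative assumptions, not values from this study.

import numpy as np

def lyne_hollick_baseflow(streamflow, alpha=0.925, passes=3):
    # One-parameter recursive digital filter (Lyne & Hollick, 1979):
    # each pass removes a quickflow component, and passes alternate
    # direction (forward, backward, forward) as is conventional.
    qb = np.asarray(streamflow, dtype=float).copy()
    for p in range(passes):
        q = qb[::-1] if p % 2 else qb
        qf = np.zeros_like(q)  # quickflow component
        for i in range(1, len(q)):
            qf[i] = alpha * qf[i - 1] + 0.5 * (1 + alpha) * (q[i] - q[i - 1])
        # Baseflow is total flow minus (non-negative) quickflow,
        # constrained to stay between zero and the input series.
        qb = np.clip(q - np.maximum(qf, 0.0), 0.0, q)
        if p % 2:
            qb = qb[::-1]
    return qb

# Toy daily discharge record (m^3/s); the baseflow index (BFI) is the
# fraction of total flow delivered as baseflow.
flow = np.array([5, 5, 30, 80, 40, 20, 12, 9, 7, 6, 5, 5], dtype=float)
base = lyne_hollick_baseflow(flow)
print("baseflow index:", base.sum() / flow.sum())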
Abstract:
Since the implementation of the Programa Conectar Igualdad (PCI) (Connecting Equality Program) in 2010 in Argentina, numerous social science specialists have started to research how the massive introduction of ICT in schools would radically affect teaching and learning processes, knowledge building and youth behavior. Nevertheless, there is still not much empirical evidence showing the ways in which these technologies are appropriated. This situation reveals the need to pose locally situated research questions with regard to those potential changes. What existing modes of access does PCI encounter? And how does its implementation participate in shaping heterogeneous personal and family trajectories of ICT appropriation? How do the students themselves perceive the influence of PCI on their own technological abilities and competences? How do the knowledge and aptitudes associated with new digital media articulate with the ways of knowing promoted by the school format and its institutional order? How does the massive introduction of netbooks affect interaction among different school actors (students and teachers)? And what happens in other spaces of sociability and socialization, such as the home and the cybercafé?
Abstract:
Herschel Island in the southern Beaufort Sea is a push moraine at the northwesternmost limit of the Laurentide Ice Sheet. Stable water isotope (δ18O, δD) and hydrochemical studies were applied to two tabular massive ground ice bodies to unravel their genetic origin. Buried glacier ice or basal regelation ice was encountered beneath an ice-rich diamicton with strong glaciotectonic deformation structures. The massive ice isotopic composition was highly depleted in heavy isotopes (mean δ18O: −33 per mil; mean δD: −258 per mil), suggesting full-glacial conditions during ice formation. Other massive ice of unknown origin, with a very large δ18O range (from −39 to −21 per mil), was found adjacent to large, striated boulders. A clear freezing slope was present, with progressive depletion in heavy isotopes towards the centre of the ice body. Fractionation must have taken place during closed-system freezing, possibly of a glacial meltwater pond. Both massive ground ice bodies exhibited a mixed ion composition suggestive of terrestrial waters with a marine influence. Hydrochemical signatures resemble the Herschel Island sediments, which are derived from nearshore marine deposits upthrust by the Laurentide ice. A prolonged contact between the water feeding the ice bodies and the surrounding sediment is therefore inferred.
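The closed-system freezing inference is conventionally framed with the Rayleigh fractionation relation; as a sketch of the standard formulation (not a calculation reported in the study), the residual liquid evolving as the pond freezes follows

\delta_{\mathrm{liquid}}(f) \approx (\delta_0 + 1000)\, f^{\,\alpha - 1} - 1000 \quad \text{(in per mil)}

where f is the fraction of liquid remaining, δ0 the initial composition, and α > 1 the equilibrium ice-water fractionation factor. Because ice preferentially incorporates the heavy isotopes, water frozen later, towards the centre of the body, is progressively more depleted, which is the freezing slope described above.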