940 results for down-hole clean up


Relevance: 100.00%

Abstract:

Due to their unobtrusive nature, vision-based approaches to tracking sports players have been preferred over wearable sensors, as they do not require the players to be instrumented for each match. However, due to heavy occlusion between players, variation in resolution and pose, and fluctuating illumination conditions, tracking players continuously is still an unsolved vision problem. For tasks like clustering and retrieval, noisy data (i.e. missing and false player detections) is problematic because it generates discontinuities in the input data stream. One method of circumventing this issue is to use an occupancy map, where the field is discretised into a series of zones and a count of player detections in each zone is obtained. A series of frames can then be concatenated to represent a set-play or example of team behaviour. A problem with this approach, though, is that the compressibility is low (i.e. the variability in the feature space is incredibly high). In this paper, we propose the use of a bilinear spatiotemporal basis model with a role representation to clean up the noisy detections; the model operates in a low-dimensional space. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, evaluated our approach on approximately 200,000 frames of data from a state-of-the-art real-time player detector, and compared the results to manually labeled data.
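The occupancy-map representation described in this abstract reduces each frame of detections to a fixed-length vector of zone counts. Below is a minimal sketch of that idea, assuming standard field-hockey pitch dimensions and an illustrative 10 × 6 zone grid; the zone resolution and the detection format are assumptions, not details taken from the paper.

```python
import numpy as np

def occupancy_map(detections, pitch_len=91.4, pitch_wid=55.0, nx=10, ny=6):
    """Discretise the pitch into nx*ny zones and count detections per zone.

    detections: iterable of (x, y) player positions in metres, measured
    from one corner of the pitch (assumed coordinate convention).
    """
    grid = np.zeros((ny, nx))
    for x, y in detections:
        ix = min(int(x / pitch_len * nx), nx - 1)
        iy = min(int(y / pitch_wid * ny), ny - 1)
        grid[iy, ix] += 1
    return grid

def setplay_descriptor(frames):
    """Concatenate per-frame occupancy maps into one fixed-length feature vector."""
    return np.concatenate([occupancy_map(f).ravel() for f in frames])

# Two noisy frames (one with a missed detection) still produce a fixed-length
# descriptor, which is what keeps clustering and retrieval feasible despite
# discontinuities in the raw detection stream.
frames = [[(10.0, 5.0), (45.0, 30.0)], [(11.0, 6.0)]]
feature = setplay_descriptor(frames)
```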

Relevance: 100.00%

Abstract:

A staged crime scene involves deliberate alteration of evidence by the offender to simulate events that did not occur for the purpose of misleading authorities (Geberth, 2006; Turvey, 2000). This study examined 115 staged homicides from the USA to determine common elements; victim and perpetrator characteristics; and specific features of different types of staged scenes. General characteristics include: multiple victims and offenders; a previous relationship between parties involved; and victims discovered in their own home, often by the offender. Staged scenes were separated by type with staged burglaries, suicides, accidents, and car accidents examined in more detail. Each type of scene displays differently with separate indicators and common features. Features of staged burglaries were: no points of entry/exit staged; non-valuables taken; scene ransacking; offender self-injury; and offenders bringing weapons to the scene. Features of staged suicides included: weapon arrangement and simulating self-injury to the victim; rearranging the body; and removing valuables. Examples of elements of staged accidents were arranging the implement/weapon and repositioning the deceased; while staged car accidents involved: transporting the body to the vehicle and arranging both; mutilation after death; attempts to secure an alibi; and clean-up at the primary crime scene. The results suggest few staging behaviors are used, despite the credibility they may have offered the façade. This is the first peer-reviewed, published study to examine the specific features of these scenes, and is the largest sample studied to date.

Relevance: 100.00%

Abstract:

This project investigated which aspects of being flooded most affected mental health outcomes. It found that stress in the aftermath of the flood, during the clean-up and rebuilding phase (including stress due to difficulties with insurance companies), was a previously overlooked risk factor, and that social support and a sense of belonging were the strongest protective factors. Implications for community recovery following disasters include effective targeting of support services throughout the lengthy rebuilding phase; the need to co-ordinate tradespeople; and training for insurance company staff aimed at minimising the incidence of staff inadvertently adding to disaster victims' stress.

Relevance: 100.00%

Abstract:

Space allowance is a major factor influencing animal welfare. For livestock, at least, it plays a critical role in profitability, yet there is little information on the amount of space that animals require. The amount of space an animal occupies as a consequence of its shape and size can be estimated using allometry; linear dimensions (L) can be expressed as L = kW^(1/3) and surface area (S) as S = kW^(2/3), where k = a constant and W = the weight of the animal. Such equations have been used to determine the amount of space needed by standing (area [m²] = 0.019W^0.66) and lying (area [m²] = 0.027W^0.67) animals. Limited studies on the lying down and standing up behaviors of pigs and cattle suggest that the amount of space required can be estimated by area (m²) = 0.047W^0.66. Linear space required per animal for behaviors such as feeding or drinking from a trough can be estimated from 0.064W^0.33, but in groups this requirement will be affected by social interactions among group members and the amount of competition for the resource. Determining the amount of space for groups of animals is complex, as the amount of useable space can vary with group size and by how group members share space in time. Some studies have been conducted on the way in which groups of domestic fowl use space, but overall, we know very little about the ways in which livestock time-share space, synchronicity in the performance of behaviors, and the effects of spatial restrictions on behavior and welfare.
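The allometric equations quoted above translate directly into a small calculation. The sketch below uses only the coefficients given in the text; the example body weight is illustrative and not taken from the paper.

```python
def standing_area(w):        # m^2, space occupied by a standing animal
    return 0.019 * w ** 0.66

def lying_area(w):           # m^2, space occupied by a lying animal
    return 0.027 * w ** 0.67

def posture_change_area(w):  # m^2, space to lie down / stand up (pigs, cattle)
    return 0.047 * w ** 0.66

def trough_length(w):        # m, linear feeding/drinking space per animal
    return 0.064 * w ** 0.33

w = 600.0  # kg; illustrative body weight, not a value from the text
print(f"standing: {standing_area(w):.2f} m2, lying: {lying_area(w):.2f} m2")
print(f"lying down/standing up: {posture_change_area(w):.2f} m2, "
      f"trough length: {trough_length(w):.2f} m")
```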

Relevance: 100.00%

Abstract:

Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking. In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and by the limited capacity of STWM come to the surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid, visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized, because access is too slow.
These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.

Relevance: 100.00%

Abstract:

When genome sections of wild Solanum species are bred into the cultivated potato (S. tuberosum L.) to obtain improved potato cultivars, the new cultivars must be evaluated for their beneficial and undesirable traits. Glycoalkaloids present in Solanum species are known for their toxic as well as their beneficial effects on mammals. On the other hand, glycoalkaloids in potato leaves provide natural protection against pests. Breeding affects the glycoalkaloid profile of the plant. In addition, the starch properties of potato tubers can be affected as a result of breeding, because the crystalline properties are determined by the botanical source of the starch. Starch content and composition affect the texture of cooked and processed potatoes. In order to determine glycoalkaloid contents in Solanum species, a reversed-phase high-performance liquid chromatography (HPLC) method for the simultaneous separation of glycoalkaloids and aglycones was developed. Clean-up of foliage samples was improved by using a silica-based strong cation exchanger instead of octadecyl phases in solid-phase extraction. The glycoalkaloids alpha-solanine and alpha-chaconine were detected in potato tubers of cvs. Satu and Sini. The total glycoalkaloid concentration of non-peeled and immature tubers was at an acceptable level (under 20 mg/100 g FW) in cv. Satu, whereas the concentration in cv. Sini was 23 mg/100 g FW. Solanum species (S. tuberosum, S. brevidens, S. acaule, and S. commersonii) and interspecific somatic hybrids (brd + tbr, acl + tbr, cmm + tbr) were analyzed for their glycoalkaloid contents using liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS). The concentrations in the tubers of the brd + tbr and acl + tbr hybrids remained under 20 mg/100 g FW. Glycoalkaloid concentrations in the foliage of the Solanum species were between 110 mg and 890 mg/100 g FW. However, the concentration in the foliage of S. acaule was as low as 26 mg/100 g FW. The total concentrations in the foliage of the brd + tbr, acl + tbr, and cmm + tbr hybrids were 88 mg, 180 mg, and 685 mg/100 g FW, respectively. Glycoalkaloids of both parental plants as well as new combinations of aglycones and saccharides were detected in the somatic hybrids. The hybrids contained mainly spirosolanes and glycoalkaloid structures having no 5,6-double bond in the aglycone. Based on these results, the glycoalkaloid profiles of the hybrids may represent a safer and more beneficial spectrum of glycoalkaloids than that found in currently cultivated varieties. The starch nanostructure of three cultivars (Satu, Saturna, and Lady Rosetta), the wild species S. acaule, and interspecific somatic hybrids was examined by wide-angle and small-angle X-ray scattering (WAXS, SAXS). For the first time, the measurements were conducted on fresh potato tuber samples. The crystallinity of starch, average crystallite size, and lamellar distance were determined from the X-ray patterns. No differences in starch nanostructure between the three cultivars were detected. However, tuber immaturity was detectable by X-ray scattering methods when large numbers of immature and mature samples were measured and the results compared. The present study shows that no significant changes occurred in the nanostructures of starches resulting from hybridizations of potato cultivars.

Relevance: 100.00%

Abstract:

Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples. Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnet motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With the modulators developed in this study, the narrowest peaks were 75 ms at the base. Three data analysis programs were developed, allowing basic, comparison and identification operations. Basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry. In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal-phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searches and analyst knowledge are invaluable as well. Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the separation of PAHs is not significantly disturbed by the amount of matrix and that quantitativeness suffers only slightly in the presence of matrix when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily overcome some minor drawbacks of the technique.
The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
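The two-dimensional plots mentioned above are obtained by folding the one-dimensional detector trace at the modulation period. Below is a minimal sketch of that reshaping step, assuming an evenly sampled signal; the 100 Hz acquisition rate and 5 s modulation period are illustrative values, not parameters reported in the abstract.

```python
import numpy as np

def fold_gcxgc(signal, sampling_rate_hz, modulation_period_s):
    """Reshape a 1-D GCxGC detector trace into a 2-D retention-time matrix.

    Columns correspond to first-dimension retention time (one column per
    modulation); rows to second-dimension retention time within a modulation.
    """
    pts_per_mod = int(round(sampling_rate_hz * modulation_period_s))
    n_mod = len(signal) // pts_per_mod              # drop any incomplete modulation
    trimmed = np.asarray(signal[:n_mod * pts_per_mod])
    return trimmed.reshape(n_mod, pts_per_mod).T

# Illustrative use: 10 min of fake detector data, 100 Hz acquisition,
# 5 s modulation period (assumed values, not instrument settings from the study).
signal = np.random.rand(100 * 60 * 10)
plot2d = fold_gcxgc(signal, sampling_rate_hz=100, modulation_period_s=5)
```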

Relevance: 100.00%

Abstract:

The records reflect the organizational structure of the Jewish ghetto administration and consist of the following: Correspondence with German government agencies, 1939-1941, including the Police and Gestapo, the *Oberburgermeister* of Litzmannstadt (German name for Lodz), the *Gettoverwaltung* (German administration of the ghetto). The correspondence pertains to the establishment of the ghetto, expropriation of Jewish property, resettlement of Lodz Jews into the ghetto, sanitary conditions, ghetto industry, anti-Jewish ordinances. Announcements issued by Rumkowski, 1940-1944. A complete set of daily communications to the ghetto population on all subjects pertinent to ghetto life such as: confiscations of Jewish property, food rationing, availability of work, relief distribution, deportations, liquidation of the ghetto. Files of various departments of the Jewish ghetto administration including labor divisions and workshops, the Jewish police (*Ordnungsdienst*), Statistics Department, Ghetto Court, Archives, Resettlement Department, Deportation Commission. Of special interest are the Archives files which contain essays and reports written by the Archives staff expressly for the purpose of historical record on subjects related to ghetto life. Outstanding in this group are reports and literary sketches by Joseph Zelkowicz, including his extensive account about the *Gesperre* (Yid. Shpere) - the deportation of the children, the old and the infirm in September, 1942. In addition, the Archives files contain bulletins of the *Daily Chronicle* of the Lodz Ghetto, transcripts of speeches by Rumkowski, and issues of the *Geto-tsaytung*, a short-lived official publication of the Eldest of the Jews. Iconographic materials, including photographs and albums. The photographs taken by Mendel Grossman, Henryk Ross, Maliniak, Zonabend and others, provide an extensive visual record of ghetto life.

Relevance: 100.00%

Abstract:

The collection contains more than 60 black and white photographs from the first decades of the 20th century found in the synagogue of Mediaş (Mediasch, Medgyes), Romania. The photographs were found in the process of an on-going clean-up and restoration project and for the most part are unidentified. The photographs are of community members and their relatives and friends; they consist of group family portraits, individual portraits, babies, and children. Some of the photographs originate from Mediaş and other nearby Transylvanian towns, while others were printed by foreign printing shops and were presumably sent to relatives living in Mediaş.

Relevance: 100.00%

Abstract:

Organochlorine pesticides (OCPs) are ubiquitous environmental contaminants with adverse impacts on aquatic biota, wildlife and human health even at low concentrations. However, conventional methods for their determination in river sediments are resource intensive. This paper presents an approach that is both rapid and reliable for the detection of OCPs. Accelerated solvent extraction (ASE) with in-cell silica gel clean-up, followed by triple quadrupole gas chromatography-tandem mass spectrometry (GC-MS/MS), was used to recover OCPs from sediment samples. Variables such as temperature, solvent ratio, adsorbent mass and number of extraction cycles were evaluated and optimised for the extraction. With the exception of aldrin, which was unaffected by any of the variables evaluated, the recovery of OCPs from sediment samples was largely influenced by solvent ratio and adsorbent mass and, to some extent, the number of cycles and temperature. The optimised conditions for OCP extraction from sediment with good recoveries were determined to be 4 cycles, 4.5 g of silica gel, 105 °C, and a 4:3 (v/v) DCM:hexane mixture. With the exception of two compounds (α-BHC and aldrin), whose recoveries were low (59.73 and 47.66 %, respectively), the recoveries of the other pesticides were in the range 85.35-117.97 % with precision < 10 % RSD. The method developed significantly reduces sample preparation time, solvent consumption and matrix interference, and is highly sensitive and selective.
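The recovery and precision figures reported above follow from standard definitions: recovery as the ratio of measured to spiked concentration, precision as the relative standard deviation of replicates. A minimal sketch with purely illustrative replicate values; none of the numbers are taken from the study.

```python
import statistics

def recovery_percent(measured_mean, spiked):
    """Recovery (%) of a spiked analyte: measured / spiked * 100."""
    return measured_mean / spiked * 100.0

def rsd_percent(replicates):
    """Relative standard deviation (%) across replicate measurements."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

# Illustrative replicate concentrations (ng/g) for a 50 ng/g spike;
# the numbers are made up for the example.
replicates = [47.1, 49.8, 48.5, 46.9, 50.2]
print(f"recovery: {recovery_percent(statistics.mean(replicates), 50.0):.1f} %")
print(f"precision: {rsd_percent(replicates):.1f} % RSD")
```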

Relevance: 100.00%

Abstract:

Bioremediation, which is the exploitation of the intrinsic ability of environmental microbes to degrade and remove harmful compounds from nature, is considered to be an environmentally sustainable and cost-effective means for environmental clean-up. However, a comprehensive understanding of the biodegradation potential of microbial communities and their response to decontamination measures is required for the effective management of bioremediation processes. In this thesis, the potential to use hydrocarbon-degradative genes as indicators of aerobic hydrocarbon biodegradation was investigated. Small-scale functional gene macro- and microarrays targeting aliphatic, monoaromatic and low molecular weight polyaromatic hydrocarbon biodegradation were developed in order to simultaneously monitor the biodegradation of mixtures of hydrocarbons. The validity of the array analysis in monitoring hydrocarbon biodegradation was evaluated in microcosm studies and field-scale bioremediation processes by comparing the hybridization signal intensities to hydrocarbon mineralization, real-time polymerase chain reaction (PCR), dot blot hybridization and both chemical and microbiological monitoring data. The results obtained by real-time PCR, dot blot hybridization and gene array analysis were in good agreement with hydrocarbon biodegradation in laboratory-scale microcosms. Mineralization of several hydrocarbons could be monitored simultaneously using gene array analysis. In the field-scale bioremediation processes, the detection and enumeration of hydrocarbon-degradative genes provided important additional information for process optimization and design. In creosote-contaminated groundwater, gene array analysis demonstrated that the aerobic biodegradation potential that was present at the site, but restrained under the oxygen-limited conditions, could be successfully stimulated with aeration and nutrient infiltration. During ex situ bioremediation of diesel oil- and lubrication oil-contaminated soil, the functional gene array analysis revealed inefficient hydrocarbon biodegradation, caused by poor aeration during composting. The functional gene array specifically detected upper and lower biodegradation pathways required for complete mineralization of hydrocarbons. Bacteria representing 1 % of the microbial community could be detected without prior PCR amplification. Molecular biological monitoring methods based on functional genes provide powerful tools for the development of more efficient remediation processes. The parallel detection of several functional genes using functional gene array analysis is an especially promising tool for monitoring the biodegradation of mixtures of hydrocarbons.

Relevance: 100.00%

Abstract:

A large fraction of an XML document typically consists of text data. The XPath query language allows text search via the equal, contains, and starts-with predicates. Such predicates can be efficiently implemented using a compressed self-index of the document's text nodes. Most queries, however, contain some parts querying the text of the document, plus some parts querying the tree structure. It is therefore a challenge to choose an appropriate evaluation order for a given query that optimally leverages the execution speeds of the text and tree indexes. Here the SXSI system is introduced. It stores the tree structure of an XML document using a bit array of opening and closing brackets plus a sequence of labels, and stores the text nodes of the document using a global compressed self-index. On top of these indexes sits an XPath query engine that is based on tree automata. The engine uses fast counting queries of the text index in order to dynamically determine whether to evaluate top-down or bottom-up with respect to the tree structure. The resulting system has several advantages over existing systems: (1) on pure tree queries (without text search), such as the XPathMark queries, the SXSI system performs on par with or better than the fastest known systems, MonetDB and Qizx; (2) on queries that use text search, SXSI outperforms the existing systems by 1-3 orders of magnitude (depending on the size of the result set); and (3) with respect to memory consumption, SXSI outperforms all other systems for counting-only queries.
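The bracket-based tree representation can be illustrated on a toy document. Below is a minimal sketch assuming a simplified encoding (1 for an opening tag, 0 for a closing tag, with a parallel label sequence); the real SXSI index additionally provides rank/select support over the bit array and a compressed text self-index, neither of which is reproduced here.

```python
import xml.etree.ElementTree as ET

def encode_tree(elem, bits=None, labels=None):
    """Encode the element structure as balanced brackets (1=open, 0=close)
    plus a parallel sequence of element labels, in document order."""
    if bits is None:
        bits, labels = [], []
    bits.append(1)
    labels.append(elem.tag)
    for child in elem:
        encode_tree(child, bits, labels)
    bits.append(0)
    return bits, labels

def subtree_size(bits, open_pos):
    """Number of elements in the subtree whose opening bracket is at open_pos."""
    excess, i = 0, open_pos
    while True:
        excess += 1 if bits[i] == 1 else -1
        i += 1
        if excess == 0:
            return (i - open_pos) // 2

doc = ET.fromstring("<a><b><c/></b><d/></a>")
bits, labels = encode_tree(doc)
# bits   -> [1, 1, 1, 0, 0, 1, 0, 0]
# labels -> ['a', 'b', 'c', 'd']
print(bits, labels, subtree_size(bits, 0))  # the whole document has 4 elements
```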

Relevance: 100.00%

Abstract:

Mycotoxins are secondary metabolites of filamentous fungi. They pose a health risk to humans and animals due to their harmful biological properties and common occurrence in food and feed. Liquid chromatography/mass spectrometry (LC/MS) has gained popularity in the trace analysis of food contaminants. In this study, the applicability of the technique was evaluated in multi-residue methods for mycotoxins aiming at the simultaneous detection of chemically diverse compounds. Methods were developed for the rapid determination of toxins produced by the fungal genera Aspergillus, Fusarium, Penicillium and Claviceps in cheese, cereal-based agar matrices and grains. Analytes were extracted from these matrices with organic solvents. Minimal sample clean-up was carried out before the analysis of the mycotoxins with reversed-phase LC coupled to tandem MS (MS/MS). The methods were validated and applied for investigating mycotoxins in cheese and ergot alkaloid occurrence in Finnish grains. Additionally, the toxin production of two Fusarium species predominant in northern Europe was studied. Nine mycotoxins could be determined from cheese with the method developed. The limits of quantification (LOQs) allowed quantification at concentrations varying from 0.6 to 5.0 µg/kg. The recoveries ranged between 96 and 143 %, and the within-day repeatability (as relative standard deviation, RSDr) between 2.3 and 12.1 %. Roquefortine C and mycophenolic acid could be detected at levels of 300 up to 12000 µg/kg in the mould cheese samples analysed. A total of 29 or 31 toxins could be analysed with the method developed for agar matrices and grains, with the LOQs ranging overall from 0.1 to 1250 µg/kg. The recoveries ranged generally between 44 and 139 %, and the RSDr between 2.0 and 38 %. Type-A trichothecenes and beauvericin were determined from the cereal-based agar and grain cultures of F. sporotrichioides and F. langsethiae. T-2 toxin was the main metabolite, the average levels reaching 22000 µg/kg in the grain cultures after 28 days of incubation. The method developed for ten ergot alkaloids in grains allowed their quantification at levels varying from 0.01 to 10 µg/kg. The recoveries ranged from 51 to 139 %, and the RSDr from 0.6 to 13.9 %. Ergot alkaloids were measured in barley and rye at average levels of 59 and 720 µg/kg, respectively. The two most prevalent alkaloids were ergocornine and ergocristine. The LC/MS methods developed enabled rapid detection of mycotoxins in applications where several toxins co-occurred. Generally, the performance of the methods was good, allowing reliable analysis of the mycotoxins of interest with sufficiently low quantification limits. However, the variation in validation results highlighted the challenges related to optimising this type of multi-residue method. New data were obtained on the occurrence of mycotoxins in mould cheeses and of ergot alkaloids in Finnish grains. In addition, the study revealed the high mycotoxin-producing potential of two common fungi in Finnish crops. This information can be useful when risks related to fungal and mycotoxin contamination are assessed.

Relevance: 100.00%

Abstract:

Two atmospheric inversions (one fine-resolved and one process-discriminating) and a process-based model for land surface exchanges are brought together to analyse the variations of methane emissions from 1990 to 2009. A focus is put on the role of natural wetlands and on the years 2000-2006, a period of stable atmospheric concentrations. From 1990 to 2000, the top-down and bottom-up approaches agree on the time-phasing of global total and wetland emission anomalies. The process-discriminating inversion indicates that wetlands dominate the time-variability of methane emissions (90% of the total variability). The contribution of tropical wetlands to the anomalies is found to be large, especially during the post-Pinatubo years (global negative anomalies with minima between -41 and -19 Tg yr⁻¹ in 1992) and during the alternating 1997-1998 El Niño / 1998-1999 La Niña events (maximal anomalies in tropical regions between +16 and +22 Tg yr⁻¹ for the inversions, and anomalies due to tropical wetlands between +12 and +17 Tg yr⁻¹ for the process-based model). Between 2000 and 2006, during the stagnation of methane concentrations in the atmosphere, the top-down and bottom-up approaches agree that South America is the main region contributing to anomalies in natural wetland emissions, but they disagree on the sign and magnitude of the flux trend in the Amazon basin. A negative trend (-3.9 ± 1.3 Tg yr⁻¹) is inferred by the process-discriminating inversion, whereas a positive trend (+1.3 ± 0.3 Tg yr⁻¹) is found by the process model. Although process-based models have their own caveats and may not take into account all processes, the positive trend found by the bottom-up approach is considered more likely because it is a robust feature of the process-based model, consistent with analysed precipitation and the satellite-derived extent of inundated areas. On the contrary, the surface-data-based inversions lack constraints for South America. This result suggests the need for a re-interpretation of the large increase found in anthropogenic methane inventories after 2000.

Relevance: 100.00%

Abstract:

Eu-doped ZnO nanoparticles (NPs) were synthesized by a facile phyto route and their photoluminescence and photocatalytic properties were studied. XPS results demonstrated the existence of Eu3+ as a dopant in ZnO. Morphologies of the NPs were mainly dependent on Eu3+ and Aloe vera gel. The red shift of the energy band gap was due to the creation of intermediate energy states of Eu3+ and oxygen vacancies in the band gap. The PL emission of ZnO:Eu3+ (1-11 mol%, 8 ml and 7 mol%, 2-12 ml) exhibits characteristic peaks of the ⁵D₀ → ⁷F₂ transition. From the Judd-Ofelt analysis, the intensities of transitions between different J levels depend on the symmetry of the local environment of the Eu3+ ions. CIE chromaticity co-ordinates confirm the reddish emission of the phosphor. Further, the NPs exhibit excellent photocatalytic activity for the degradation of Rhodamine B (94%) under sunlight, which was attributed to crystallite size, band gap, morphology and oxygen vacancies. In addition, photocatalyst reusability studies were conducted, and it was found that the Eu-doped catalyst could be reused several times with a negligible decrease in catalytic activity. The present work opens new possibilities and provides new insights into the design of phyto-synthesized nanophosphors for display devices and photocatalysts with high activity for environmental clean-up and solar energy conversion. (C) 2015 Elsevier B.V. All rights reserved.
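The degradation figure quoted above (94% of Rhodamine B) is conventionally computed from the dye concentration, or its absorbance, before and after irradiation. A minimal sketch of that standard calculation; the absorbance values are illustrative and not data from the paper.

```python
def degradation_efficiency(c0, ct):
    """Photocatalytic degradation efficiency (%): (C0 - Ct) / C0 * 100.

    c0, ct: dye concentration, or absorbance at the Rhodamine B maximum
    (about 554 nm, assumed proportional to concentration), before and
    after irradiation.
    """
    return (c0 - ct) / c0 * 100.0

# Illustrative absorbance values only; a drop from 1.20 to 0.072
# corresponds to the ~94 % degradation level mentioned in the abstract.
print(f"{degradation_efficiency(1.20, 0.072):.1f} % degraded")
```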