781 results for big data storage

Relevance:

30.00%

Publisher:

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): This is an earlier presentation of observations made at points spread across Mexico. The amount of existing data is large enough that an atlas was issued in 1977. This atlas contains information going back to the beginnings of the country. The original data sets from which the atlas was compiled exist in a variety of storage forms, ranging from simple paper blocks to books and magnetic tapes.

Relevance:

30.00%

Publisher:

Abstract:

Cultured Macrobrachium rosenbergii (scampi, about 30 g each) in headless shell-on form were individually quick frozen in a spiral freezer. The frozen samples were glazed and packed in polythene bags, which were further packed in a master carton and stored at -18°C. Samples were drawn at regular intervals and subjected to biochemical, bacteriological and organoleptic analyses to study their storage characteristics. The data on these parameters showed that the samples remained in prime acceptable condition when stored for up to 23 weeks, with no appreciable change in colour or odour noticed in the raw muscle. Beyond this period, organoleptic evaluation of the cooked muscle revealed a slight change in flavour, and the texture also appeared a little tougher. These changes in organoleptic characteristics were well supported by the biochemical and bacteriological changes in the muscle.

Relevance:

30.00%

Publisher:

Abstract:

In order to improve algal biofuel production on a commercial scale, an understanding of algal growth and fuel molecule accumulation is essential. A mathematical model is presented that describes biomass growth and storage molecule (TAG lipid and starch) accumulation in the freshwater microalga Chlorella vulgaris under mixotrophic and autotrophic conditions. Biomass growth was formulated on the basis of the Droop model, while storage molecule production was calculated from the carbon balance within the algal cells, incorporating carbon fixation via photosynthesis, organic carbon uptake and functional biomass growth. The model was validated against experimental growth data for C. vulgaris and was found to fit the data well. Sensitivity analysis showed that model performance was highly sensitive to variations in parameters associated with nutrient factors, photosynthesis and light intensity. The maximum productivity and biomass concentration were achieved under mixotrophic, nitrogen-sufficient conditions, while the maximum storage content was obtained under mixotrophic, nitrogen-deficient conditions.
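
For orientation, the Droop-type growth formulation that such models build on can be written as below. This is the generic Droop law; the symbols are illustrative and are not taken from the paper's notation.

```latex
% Droop growth law (generic form; notation is illustrative):
%   mu      - specific growth rate
%   mu_max  - maximum growth rate at infinite internal quota
%   Q       - internal nutrient quota (nutrient per unit biomass)
%   Q_min   - minimum (subsistence) quota at which growth ceases
\mu(Q) = \mu_{\max}\left(1 - \frac{Q_{\min}}{Q}\right)
```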

Relevance:

30.00%

Publisher:

Abstract:

The original description of Myxobolus longisporus Nie et Li, 1992, a species infecting the gills of Cyprinus carpio haematopterus L., is supplemented with new data on spore morphology and pathogenicity. Spores are elongate pyriform with a pointed anterior end, 15.7 (15.5-16.5) µm long, 6.7 (6-8) µm wide and 5.5 µm thick. The sutural ridge is straight and narrow. A mucus envelope is lacking. The two equal-sized, elongate pyriform polar capsules are 8.5 µm long and 2.5 µm wide, with convergent long axes. The polar filament, coiled perpendicular to the long axis of the capsule, makes 9 (8-10) turns. The posterior end of the polar capsules exceeds mid-spore by 15-20%. Cyst-like plasmodia are localised in the secondary gill lamellae. The infection is described in large adult host specimens. Gross lesions, manifested as dark red colouration of the gill tissues, were restricted to the ventral part of the first gill arches. Remarkable site specificity (the apical part of the secondary lamellae) was observed in the course of development of the microscopic lesions. M. longisporus is also characterised at the molecular level using sequences of the SSU rRNA gene. Phylogenetic analysis based on these sequences has allowed clearer phylogenetic relationships to be established with other species of the genus Myxobolus sequenced to date.

Relevance:

30.00%

Publisher:

Abstract:

The production of the C, N and O elements in a standard big bang nucleosynthesis (BBN) scenario is investigated. Using up-to-date nuclear reaction data for BBN, in particular for Li-8(n,γ)Li-9, which has been measured at the China Institute of Atomic Energy, a full nucleosynthesis network calculation of BBN is carried out. Our results show that the abundance of C-12 increases by an order of magnitude after the addition of the reaction chain Li-8(n,γ)Li-9(α,n)B-12(β⁻)C-12, which was neglected in previous studies. We find that this sequence provides the main channel for converting the light elements into C, N and O in standard BBN.
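
Written out in standard nuclear-reaction notation, the C-12 production chain named above reads:

```latex
% Neutron capture on Li-8, an (alpha,n) reaction on Li-9, and beta decay of B-12 to C-12:
^{8}\mathrm{Li}(n,\gamma)\,^{9}\mathrm{Li}(\alpha,n)\,^{12}\mathrm{B}(\beta^{-})\,^{12}\mathrm{C}
```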

Relevance:

30.00%

Publisher:

Abstract:

Personal communication devices are increasingly being equipped with sensors that can passively collect information from their surroundings, information that could be stored in fairly small local caches. We envision a system in which users of such devices use their collective sensing, storage, and communication resources to query the state of (possibly remote) neighborhoods. The goal of such a system is to achieve the highest query success ratio using the least communication overhead (power). We show that the use of Data Centric Storage (DCS), or directed placement, is a viable approach for achieving this goal, but only when the underlying network is well connected. Alternatively, we propose amorphous placement, in which sensory samples are cached locally and informed exchanges of cached samples are used to diffuse the sensory data throughout the whole network. In handling queries, the local cache is searched first for potential answers. If unsuccessful, the query is forwarded to one or more direct neighbors. This technique leverages node mobility and caching capabilities to avoid the multi-hop communication overhead of directed placement. Using a simplified mobility model, we provide analytical lower and upper bounds on the ability of amorphous placement to achieve uniform field coverage in one and two dimensions. We show that combining informed shuffling of cached samples upon an encounter between two nodes with the querying of direct neighbors can lead to significant performance improvements. For instance, under realistic mobility models, our simulation experiments show that amorphous placement achieves a 10% to 40% better query answering ratio at a 25% to 35% savings in consumed power compared with directed placement.
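
A minimal sketch of the query-handling and cache-shuffling rules described above, assuming illustrative names (Node, handle_query, shuffle_with) and a simplistic eviction policy; it is not the authors' simulator.

```python
# Toy sketch of amorphous placement (illustrative names, not the paper's code).
# Each node keeps a small local cache of sensory samples; a query is answered from the
# local cache if possible, otherwise forwarded to direct neighbours only (no multi-hop routing).

from typing import Optional


class Node:
    def __init__(self, node_id: int, cache_size: int = 32):
        self.node_id = node_id
        self.cache_size = cache_size
        self.cache: dict[str, float] = {}   # region id -> cached sample value
        self.neighbors: list["Node"] = []   # current one-hop contacts (changes with mobility)

    def handle_query(self, region: str) -> Optional[float]:
        # 1. Search the local cache first.
        if region in self.cache:
            return self.cache[region]
        # 2. If unsuccessful, ask direct neighbours only (one hop, low overhead).
        for nb in self.neighbors:
            if region in nb.cache:
                return nb.cache[region]
        return None  # query fails; no multi-hop forwarding in this scheme

    def shuffle_with(self, other: "Node") -> None:
        # Informed exchange on an encounter: each node keeps the samples it lacks,
        # diffusing sensory data through node mobility rather than routing.
        merged = {**self.cache, **other.cache}
        for node in (self, other):
            missing = {k: v for k, v in merged.items() if k not in node.cache}
            node.cache.update(missing)
            # Evict oldest entries if over capacity (simplistic policy for this sketch).
            while len(node.cache) > node.cache_size:
                node.cache.pop(next(iter(node.cache)))
```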

Relevance:

30.00%

Publisher:

Abstract:

Neural network models of working memory, called Sustained Temporal Order REcurrent (STORE) models, are described. They encode the invariant temporal order of sequential events in short-term memory (STM) in a way that mimics cognitive data about working memory, including primacy, recency, and bowed order and error gradients. As new items are presented, the pattern of previously stored items is invariant in the sense that relative activations remain constant through time. This invariant temporal order code enables all possible groupings of sequential events to be stably learned and remembered in real time, even as new events perturb the system. Such a competence is needed to design self-organizing temporal recognition and planning systems in which any subsequence of events may need to be categorized in order to control and predict future behavior or external events. STORE models show how arbitrary event sequences may be invariantly stored, including repeated events. A preprocessor interacts with the working memory to represent event repeats at spatially separate locations. It is shown why at least two processing levels are needed to invariantly store events presented with variable durations and interstimulus intervals. It is also shown how network parameters control the type and shape of the primacy, recency, or bowed temporal order gradients that will be stored.
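
A toy numerical illustration of the invariance property described above, i.e. the relative activations of previously stored items remaining constant as new items arrive. This is only a sketch of the behaviour; it does not reproduce the published STORE equations.

```python
# Toy illustration of an invariant temporal-order code (not the published STORE equations).
# When a new item arrives, all previously stored activations are rescaled by a common
# factor, so the ratios among stored items stay constant while a recency (or primacy)
# gradient builds up across the sequence.

def store_sequence(n_items: int, w: float = 0.8) -> list[float]:
    """Return activations after presenting n_items; w < 1 gives a recency gradient,
    w > 1 a primacy gradient (illustrative parameterisation)."""
    activations: list[float] = []
    for _ in range(n_items):
        # Common rescaling preserves the relative activations of earlier items.
        activations = [w * a for a in activations]
        activations.append(1.0)                          # newly presented item
        total = sum(activations)
        activations = [a / total for a in activations]   # normalise total activity
    return activations

print(store_sequence(5))  # earlier items end up with smaller relative activation (recency)
```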

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: To collect oncologists' experience of and opinions on adjuvant chemotherapy in elderly breast cancer patients. MATERIALS AND METHODS: A questionnaire was circulated among the members of the Breast International Group. RESULTS: A total of 277 oncologists from 28 countries participated in the survey. Seventy years is the age cut-off commonly used to define a patient as elderly. Biological age and the biological characteristics of the tumor are the criteria most frequently used to propose adjuvant chemotherapy to an elderly patient. Combination therapy with cyclophosphamide, methotrexate and fluorouracil on days 1 and 8 is the most frequently prescribed regimen. There is great interest in oral chemotherapy. CONCLUSION: There is interest among the respondents in validating a comprehensive geriatric assessment for use as a predictive instrument of the toxicity and/or activity of anticancer therapy, and in evaluating the role of treatment options that are potentially less toxic and possibly as effective as polychemotherapy.

Relevance:

30.00%

Publisher:

Abstract:

As more diagnostic testing options become available to physicians, it becomes more difficult to combine the various types of medical information in order to optimize the overall diagnosis. To improve diagnostic performance, we introduce an approach to optimize a decision-fusion technique for combining heterogeneous information, such as information from different modalities, feature categories, or institutions. For classifier comparison we used two performance metrics: the area under the receiver operating characteristic (ROC) curve (AUC) and the normalized partial area under the curve (pAUC). This study used four classifiers: linear discriminant analysis (LDA), an artificial neural network (ANN), and two variants of our decision-fusion technique, AUC-optimized (DF-A) and pAUC-optimized (DF-P) decision fusion. We applied each of these classifiers with 100-fold cross-validation to two heterogeneous breast cancer data sets: one of mass lesion features and a much more challenging one of microcalcification lesion features. For the calcification data set, DF-A outperformed the other classifiers in terms of AUC (p < 0.02) and achieved AUC = 0.85 +/- 0.01. DF-P surpassed the other classifiers in terms of pAUC (p < 0.01) and reached pAUC = 0.38 +/- 0.02. For the mass data set, DF-A outperformed both the ANN and the LDA (p < 0.04) and achieved AUC = 0.94 +/- 0.01. Although for this data set there were no statistically significant differences among the classifiers' pAUC values (pAUC = 0.57 +/- 0.07 to 0.67 +/- 0.05, p > 0.10), DF-P did significantly improve specificity relative to the LDA at both 98% and 100% sensitivity (p < 0.04). In conclusion, decision fusion directly optimized clinically significant performance measures, such as AUC and pAUC, and sometimes outperformed two well-known machine-learning techniques when applied to two different breast cancer data sets.
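
A minimal sketch of AUC-optimised decision fusion in the spirit of DF-A: two base classifier scores are combined linearly and the mixing weight is chosen to maximise the ROC AUC on validation data. The grid search and the variable names are illustrative assumptions; the paper's actual optimisation may differ.

```python
# Sketch of AUC-optimised decision fusion (DF-A-like), under simplifying assumptions:
# two base classifier scores are combined linearly and the mixing weight is chosen
# to maximise the area under the ROC curve on a validation set.

import numpy as np
from sklearn.metrics import roc_auc_score

def fuse_scores(scores_a: np.ndarray, scores_b: np.ndarray, y: np.ndarray) -> tuple[float, float]:
    """Return (best_weight, best_auc) for the fusion w*scores_a + (1-w)*scores_b."""
    best_w, best_auc = 0.0, -np.inf
    for w in np.linspace(0.0, 1.0, 101):          # coarse grid over the mixing weight
        fused = w * scores_a + (1.0 - w) * scores_b
        auc = roc_auc_score(y, fused)
        if auc > best_auc:
            best_w, best_auc = w, auc
    return best_w, best_auc

# Illustrative usage with synthetic scores:
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
scores_a = y + rng.normal(0, 1.0, size=200)       # a weak base classifier
scores_b = y + rng.normal(0, 0.7, size=200)       # a slightly stronger one
print(fuse_scores(scores_a, scores_b, y))
```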

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets, focusing on epidemiological and clinical research, in a collaborative environment and (2) create a policy model placing this collaborative environment into the current scientific social context. METHODOLOGY: The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform that would allow for sharing of information about epidemiological and clinical study data sets in a collaborative environment. This platform should ensure that researchers can modify the information. Model-based predictions of the number of publications and the funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. PRINCIPAL FINDINGS: The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. Model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications. CONCLUSIONS: Based on our empirical observations and the resulting model, the social network environment surrounding the application can help epidemiologists and clinical researchers contribute and search for metadata in a collaborative environment, thus potentially facilitating collaboration among research communities distributed around the globe.
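
A toy stock-and-flow sketch in the spirit of the System Dynamics comparison described above, with a cumulative-publications stock driven by a dataset-sharing inflow. Every rate and parameter value here is invented for illustration and is not taken from the paper's model.

```python
# Toy stock-and-flow (System Dynamics style) comparison of data-sharing policy scenarios.
# All parameter values below are invented for illustration; they are not the paper's model.

def simulate_publications(sharing_rate: float, years: int = 10, dt: float = 0.25) -> float:
    shared_datasets = 0.0        # stock: datasets (or metadata records) made available
    publications = 0.0           # stock: cumulative publications
    pubs_per_dataset_year = 0.5  # flow coefficient (illustrative)
    t = 0.0
    while t < years:
        shared_datasets += sharing_rate * dt                        # inflow of shared datasets
        publications += pubs_per_dataset_year * shared_datasets * dt
        t += dt
    return publications

for name, rate in [("no policy", 2.0), ("metadata sharing", 6.0), ("combined policy", 10.0)]:
    print(f"{name:>17}: {simulate_publications(rate):7.1f} cumulative publications")
```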

Relevance:

30.00%

Publisher:

Abstract:

Problems in preserving the quality of granular material products are complex and arise from a series of sources during transport and storage. Whether designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, practical measurement and simulation tools and technologies are required to support the process engineer. These technologies are needed to help both identify the source of such problems and then design them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these computational models of unit processes in the analysis of large-scale processes involving the handling of granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of the material from a silo, its transport through a dilute-phase pneumatic conveyor, and its storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent the key granular processes, including particle size segregation, degradation, and moisture migration caking.
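
A sketch of how such unit-process models can be chained end to end (silo discharge, then dilute-phase pneumatic conveying, then big-bag storage). The state fields and the three process functions are placeholders standing in for the project's actual segregation, degradation and caking models.

```python
# Sketch of chaining granular-materials unit-process models end to end.
# The state fields and the three process functions are placeholders, not the project's models.

from dataclasses import dataclass

@dataclass
class MaterialState:
    mean_particle_size_mm: float   # tracks degradation (particle breakage)
    fines_fraction: float          # tracks size segregation
    moisture_fraction: float       # tracks moisture migration / caking risk

def silo_discharge(state: MaterialState) -> MaterialState:
    # Placeholder: discharge enriches the stream in fines due to segregation in the silo.
    state.fines_fraction = min(1.0, state.fines_fraction * 1.2)
    return state

def pneumatic_conveying(state: MaterialState) -> MaterialState:
    # Placeholder: particle impacts in dilute-phase conveying reduce mean particle size.
    state.mean_particle_size_mm *= 0.9
    state.fines_fraction = min(1.0, state.fines_fraction + 0.05)
    return state

def big_bag_storage(state: MaterialState, rel_humidity: float) -> MaterialState:
    # Placeholder: humid storage raises moisture content, increasing caking risk.
    state.moisture_fraction += 0.01 * rel_humidity
    return state

state = MaterialState(mean_particle_size_mm=2.0, fines_fraction=0.05, moisture_fraction=0.02)
state = silo_discharge(state)
state = pneumatic_conveying(state)
state = big_bag_storage(state, rel_humidity=0.7)
print(state)
```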

Relevance:

30.00%

Publisher:

Abstract:

This article provides a broad overview of project HEED (High-rise Evacuation Evaluation Database) and the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences, derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on September 11, 2001. In particular, the article describes the development of the HEED database. This is a flexible research tool which contains qualitative data in the form of coded evacuee experiences along with the full interview transcripts. The data and information captured and stored in the HEED database are not only unique, but also provide a means to address current and emerging issues relating to the human factors associated with the evacuation of high-rise buildings.
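
A hypothetical sketch of how a coded evacuee account could be represented as a record alongside its transcript; the field names are illustrative and are not the actual HEED schema.

```python
# Hypothetical record structure for a coded evacuee account (field names are illustrative;
# they are not the actual HEED database schema).

from dataclasses import dataclass, field

@dataclass
class EvacueeAccount:
    interview_id: str
    tower: str                          # e.g. "WTC1" or "WTC2"
    starting_floor: int
    evacuation_codes: list[str] = field(default_factory=list)  # coded qualitative experiences
    transcript: str = ""                # full interview transcript

account = EvacueeAccount(
    interview_id="H-0001",
    tower="WTC1",
    starting_floor=64,
    evacuation_codes=["delay:gathering_belongings", "route:stairwell_A"],
    transcript="...",
)
```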

Relevance:

30.00%

Publisher:

Abstract:

Progress in microbiology has always been driven by technological advances, ever since Antonie van Leeuwenhoek discovered bacteria by making an improved compound microscope. However, until very recently we have not been able to identify microbes and record their mostly invisible activities, such as nutrient consumption or toxin production, at the level of the single cell, not even in the laboratory. This is now changing with the rapid rise of exciting new technologies for single-cell microbiology (1, 2), which enable microbiologists to do what plant and animal ecologists have been doing for a long time: observe who does what, when, where, and next to whom. Single cells taken from the environment can be identified and even have their genomes sequenced. Ex situ, their size and their elemental and biochemical composition, as well as other characteristics, can be measured with high throughput and cells sorted accordingly. Even better, individual microbes can be observed in situ with a range of novel microscopic and spectroscopic methods, enabling localization, identification, or functional characterization of cells in a natural sample, combined with detection of the uptake of labeled compounds. Alternatively, they can be placed into fabricated microfluidic environments, where they can be positioned, exposed to stimuli, and monitored, and their interactions controlled “in microfluido.” By introducing genetically engineered reporter cells into a fabricated landscape or a microcosm taken from nature, their reproductive success or activity can be followed, or their sensing of their local environment recorded.

Relevance:

30.00%

Publisher:

Abstract:

The QICS controlled release experiment demonstrates that leaks of carbon dioxide (CO2) gas can be detected by monitoring acoustic, geochemical and biological parameters within a given marine system. However, the natural complexity and variability of marine system responses to (artificial) leakage strongly suggest that there are no absolute indicators of leakage or impact that can be used unequivocally and universally for all potential future storage sites. We suggest a multivariate, hierarchical approach to monitoring, escalating from anomaly detection to attribution, quantification and then impact assessment, as required. Given the spatial heterogeneity of many marine ecosystems, it is essential that environmental monitoring programmes are supported by a temporally (tidal, seasonal and annual) and spatially resolved baseline of data against which changes can be accurately identified. In this paper we outline and discuss the options for monitoring methodologies and identify the components of an appropriate baseline survey.
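
A schematic sketch of the escalating monitoring logic suggested above: anomaly detection against a temporally resolved baseline first, with attribution, quantification and impact assessment triggered only when an anomaly is found. Thresholds, parameter names and baseline values are illustrative, not QICS data.

```python
# Schematic sketch of hierarchical, escalating marine monitoring for CO2 leakage:
# anomaly detection against a resolved baseline, then attribution, quantification
# and impact assessment only as required. Thresholds and baseline values are illustrative.

from statistics import mean, stdev

def is_anomalous(observed: float, baseline_samples: list[float], k: float = 3.0) -> bool:
    """Flag an observation more than k standard deviations from the baseline."""
    mu, sigma = mean(baseline_samples), stdev(baseline_samples)
    return abs(observed - mu) > k * sigma

def monitor_site(observed: dict[str, float], baseline: dict[str, list[float]]) -> str:
    # Stage 1: multivariate anomaly detection (acoustic, geochemical, biological parameters).
    anomalies = [p for p, v in observed.items() if is_anomalous(v, baseline[p])]
    if not anomalies:
        return "no anomaly: continue routine baseline monitoring"
    # Stage 2+: escalate only when an anomaly is detected.
    return f"anomaly in {anomalies}: escalate to attribution, quantification, impact assessment"

baseline = {"pCO2": [380, 395, 410, 400, 388], "pH": [8.10, 8.05, 8.12, 8.08, 8.11]}
print(monitor_site({"pCO2": 620.0, "pH": 7.60}, baseline))
```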