22 results for Anoxia and normoxia and Storage mobilization

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

A total of 72 eggs from a group of 100 white laying hens housed in standard cages were analyzed. Thirty-six eggs were collected when the hens were 44 weeks of age, and the other 36 eggs eight weeks later. Each group of 36 eggs was randomly divided into three groups of 12 eggs. The first group was analyzed at once (storage system C); the second was kept for one week in a refrigerator (5°C) (storage system R); and the third was also kept for one week, but at ambient temperature (25°C) (storage system ET). Hen age, egg weight and storage system had no significant (P>0.05) effect on shell thickness. Specific gravity (SG) has a positive relation with shell quality. Egg class and storage system significantly (P<0.05) affected SG, while no influence of bird age on this variable was observed. Yolk color increased with hen age, but storage system had no effect on this variable. Increasing hen age and the R and ET storage systems significantly (P<0.05) reduced albumen height (H), and the hen age × storage system interaction was significant (P<0.025) for this variable. The reduction of H due to the R and ET storage systems was greater in eggs from 52-week-old hens than in those from 44-week-old hens. Haugh units (HU) were significantly (P<0.05) affected by hen age, egg class and storage system. Increasing hen age reduced HU, and R and ET eggs had lower HU than C eggs. It is concluded that bird age and storage at high temperature reduce egg quality.
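The Haugh units mentioned above combine albumen height and egg weight through the standard Haugh (1937) formula. A minimal sketch of the conversion — the heights and weight below are illustrative values, not measurements from the study:

```python
import math

def haugh_units(albumen_height_mm: float, egg_weight_g: float) -> float:
    """Haugh (1937) score: 100 * log10(h - 1.7 * w**0.37 + 7.6)."""
    return 100.0 * math.log10(albumen_height_mm - 1.7 * egg_weight_g ** 0.37 + 7.6)

# A drop in albumen height (as reported after warm storage) lowers the score:
fresh = haugh_units(8.0, 60.0)    # taller albumen -> higher HU
stored = haugh_units(5.5, 60.0)   # flatter albumen -> lower HU
```

This makes visible why reduced albumen height after R and ET storage translates directly into lower HU values.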

Relevance:

100.00%

Publisher:

Abstract:

A total of 108 eggs from a group of 100 brown laying hens housed in standard cages were analyzed. Thirty-six eggs were collected when the hens were 30 weeks of age, another 36 when the hens were 35 weeks of age, and the remaining 36 five weeks later. Each group of 36 eggs was randomly divided into three groups of 12 eggs. The first group was analyzed at once, the second was kept for one week in a refrigerator (5°C), and the third was also kept for one week, but at ambient temperature (25°C). Shell color, shell thickness, specific gravity, albumen height and Haugh units were obtained. Bird age had a significant effect on shell color and shell thickness, but the storage system had no influence on these variables. Hen age had no effect on specific gravity, but the storage system affected this variable. Hen age and storage system had a significant influence (P<0.05) on albumen height and Haugh units, and the age × storage system interaction was significant for these variables. Specific gravity had positive relations with shell thickness, yolk color, albumen height and Haugh units. It is concluded that bird age and storage under high temperatures reduce egg quality.

Relevance:

100.00%

Publisher:

Abstract:

Samples of "Golden" and "Granny Smith" apples and "Conference" and "Doyenné du Comice" pears were tested. A great effect of storage conditions was detected for the pear varieties but not for the apple varieties. Both apple cultivars proved equally resistant to quasi-static and to dynamic loading, while the pear varieties showed great differences. All these effects can be quantified in order to describe mathematically the behavior of species and varieties.

Relevance:

100.00%

Publisher:

Abstract:

With the rising prices of retail electricity and the decreasing cost of PV technology, grid parity with commercial electricity will soon become a reality in Europe. This fact, together with less attractive PV feed-in tariffs in the near future and incentives to promote self-consumption, suggests that new operation modes for PV Distributed Generation should be explored, different from the traditional approach, which is based only on maximizing the electricity exported to the grid. Smart metering is experiencing growth in Europe and the United States, but the possibilities of its use are still uncertain; in our system we propose its use to manage the storage and to allow users to know their electrical power and energy balances. Active Demand-Side Management (ADSM) has many previously studied benefits, but it also poses important challenges; in this paper we present an ADSM implementation example in which we propose a solution to these challenges. We study the effects of ADSM and storage systems on the amount of locally consumed electrical energy. The work has been developed on a prototype of a self-sufficient solar house called "MagicBox", equipped with grid connection, PV generation, lead–acid batteries, controllable appliances and smart metering. We carried out simulations for long-term experiments (yearly studies) and real measurements for short- and mid-term experiments (daily and weekly studies). Results show the relationship between the electricity flows and the storage capacity, which is not linear and becomes an important design criterion.
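The non-linear relationship between storage capacity and locally consumed energy can be illustrated with a toy greedy battery-dispatch loop. The hourly PV and load profiles below are hypothetical, not MagicBox data, and the dispatch rule is a deliberate simplification (no losses, no power limits):

```python
def self_consumed(pv, load, capacity_kwh):
    """Greedy dispatch: surplus PV charges the battery, deficits discharge it.
    Returns the total PV energy consumed on site (kWh)."""
    soc = 0.0      # battery state of charge
    local = 0.0    # PV energy consumed locally
    for gen, dem in zip(pv, load):
        direct = min(gen, dem)                     # PV used immediately
        local += direct
        surplus, deficit = gen - direct, dem - direct
        charge = min(surplus, capacity_kwh - soc)  # store what fits
        soc += charge                              # remainder is exported
        discharge = min(deficit, soc)              # cover deficit from battery
        soc -= discharge
        local += discharge
    return local

pv   = [0, 0, 1.5, 3.0, 3.5, 2.0, 0.5, 0]   # hypothetical hourly kWh
load = [1, 1, 1.0, 1.0, 1.0, 1.0, 2.0, 2]
no_batt = self_consumed(pv, load, 0.0)
small   = self_consumed(pv, load, 2.0)
big     = self_consumed(pv, load, 10.0)
```

Under these profiles, the first 2 kWh of storage gain more self-consumption than the next 8 kWh, echoing the paper's observation that the capacity/flows relationship is non-linear.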

Relevance:

100.00%

Publisher:

Abstract:

A one-step extraction procedure and a leaching column experiment were performed to assess the effects of citric and tartaric acids on Cu and Zn mobilization in naturally contaminated mine soils, in order to facilitate assisted phytoextraction. Speciation modeling of the soil solution and metal fractionation of the soils were performed to elucidate the chemical processes that affected metal desorption by the organic acids. Different extracting solutions were prepared, all containing 0.01 M KNO3 and different concentrations of organic acids: a control without organic acids, 0.5 mM citric, 0.5 mM tartaric, 10 mM citric, 10 mM tartaric, and 5 mM citric + 5 mM tartaric. The results of the extraction procedure showed that higher concentrations of organic acids increased metal desorption, and citric acid was more effective at facilitating metal desorption than tartaric acid. Metal desorption was mainly influenced by the decreasing pH and the dissolution of Fe and Mn oxides, not by the formation of soluble metal–organic complexes as predicted by the speciation modeling. The column study showed that low concentrations of organic acids did not significantly increase metal mobilization and that the higher doses were still not able to mobilize Zn. However, 5–10 mM citric acid significantly promoted Cu mobilization (from 1 mg kg−1 in the control to 42 mg kg−1 with 10 mM citric acid) and reduced the exchangeable (from 21 to 3 mg kg−1) and Fe and Mn oxide (from 443 to 277 mg kg−1) fractions. Citric acid could thus efficiently facilitate assisted phytoextraction techniques.
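The fractionation shifts reported for 10 mM citric acid can be expressed as simple mass-balance percentages; the numbers below are taken directly from the abstract:

```python
def fraction_drop(before_mg_kg, after_mg_kg):
    """Percent reduction of a sequential-extraction fraction."""
    return 100.0 * (before_mg_kg - after_mg_kg) / before_mg_kg

# Cu, column experiment with 10 mM citric acid vs control:
mobilized_increase = 42 - 1              # mg/kg of extra Cu mobilized
exch_drop = fraction_drop(21, 3)         # exchangeable fraction, ~86% lower
oxide_drop = fraction_drop(443, 277)     # Fe/Mn-oxide fraction, ~37% lower
```

The near-complete loss of the exchangeable pool alongside a substantial drop in the oxide-bound pool is consistent with the abstract's conclusion that pH decrease and oxide dissolution, rather than complexation, drove the desorption.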

Relevance:

100.00%

Publisher:

Abstract:

In the present uncertain global context of reaching equal social stability and a steadily thriving economy, power demand is expected to grow, and global electricity generation could nearly double from 2005 to 2030. Fossil fuels will remain a significant contribution to this energy mix up to 2050, with an expected share of around 70% of global and ca. 60% of European electricity generation. Coal will remain a key player. Hence, under a business-as-usual scenario for CO2 emissions, concentrations three times the present values, up to 1,200 ppm, are forecast by the end of this century. The Kyoto Protocol was the first approach to taking global responsibility for CO2 emission monitoring and cap targets by 2012 with reference to 1990. Some of the principal CO2 emitters did not ratify the reduction targets, although the USA and China are taking their own actions and parallel reduction measures. More efficient combustion processes consuming less fuel, while a significant contribution of the electricity generation sector to dwindling CO2 concentration levels, might not be sufficient. Carbon Capture and Storage (CCS) technologies have started to gain importance since the beginning of the decade, with research and funds coming out to bring them into use. After the first research projects and initial scale testing, three principal capture processes are available today, with first figures showing up to 90% CO2 removal in standard applications in coal-fired power stations. Regarding the last part of the CO2 reduction chain, two options can be considered worthy: reuse (EOR & EGR) and storage. The study evaluates the state of CO2 capture technology development, and the availability and investment cost of the different technologies, with few operation cost analyses possible at the time. The main findings and the abatement potential for coal applications are presented.
DOE, NETL, MIT, European universities and research institutions, key technology enterprises and utilities, and key technology suppliers are the main sources of this study. A vision of the technology deployment is presented.
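The quoted 90% capture rate translates into avoided emissions only after the capture energy penalty is accounted for, since the capture plant itself burns extra fuel. A hedged sketch of the standard captured-vs-avoided distinction — the emission factor and penalty figures are assumed, not taken from the study:

```python
def co2_avoided(base_emissions_t_mwh, capture_rate, energy_penalty):
    """CO2 avoided per MWh delivered, net of the extra fuel burned for capture.
    energy_penalty: extra fuel fraction needed per delivered MWh."""
    gross = base_emissions_t_mwh * (1.0 + energy_penalty)  # emissions incl. penalty
    emitted = gross * (1.0 - capture_rate)                 # what escapes capture
    return base_emissions_t_mwh - emitted

# Assumed coal plant at ~0.9 tCO2/MWh, 90% capture, ~25% energy penalty:
avoided = co2_avoided(0.9, 0.90, 0.25)
```

With these illustrative numbers the net abatement is about 0.79 tCO2/MWh, somewhat less than the naive "90% of 0.9 t" — the reason capture-rate and abatement-potential figures should not be used interchangeably.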

Relevance:

100.00%

Publisher:

Abstract:

The goal of the RAP-WAM AND-parallel Prolog abstract architecture is to provide inference speeds significantly beyond those of sequential systems, while supporting Prolog semantics and preserving sequential performance and storage efficiency. This paper presents simulation results supporting these claims, with special emphasis on memory performance on a two-level shared-memory multiprocessor organization. Several solutions to the cache coherency problem are analyzed. It is shown that RAP-WAM offers good locality and storage efficiency and that it can effectively take advantage of broadcast caches. It is argued that speeds in excess of 2 MLIPS on real applications exhibiting medium parallelism can be attained with current technology.

Relevance:

100.00%

Publisher:

Abstract:

A backtracking algorithm for AND-Parallelism and its implementation at the abstract machine level are presented: first, a class of AND-Parallelism models based on goal independence is defined, and a generalized version of Restricted AND-Parallelism (RAP) is introduced as characteristic of this class. A simple and efficient backtracking algorithm for RAP is then discussed. An implementation scheme is presented for this algorithm which offers minimum overhead, while retaining the performance and storage economy of sequential implementations and taking advantage of goal independence to avoid unnecessary backtracking ("restricted intelligent backtracking"). Finally, the implementation of backtracking in sequential and AND-Parallel systems is explained through a number of examples.

Relevance:

100.00%

Publisher:

Abstract:

Although the sequential execution speed of logic programs has been greatly improved by the concepts introduced in the Warren Abstract Machine (WAM), parallel execution represents the only way to increase this speed beyond the natural limits of sequential systems. However, most proposed parallel logic programming execution models lack the performance optimizations and storage efficiency of sequential systems. This paper presents a parallel abstract machine which is an extension of the WAM and is thus capable of supporting AND-Parallelism without giving up the optimizations present in sequential implementations. A suitable instruction set, which can be used as a target by a variety of logic programming languages, is also included. Special instructions are provided to support a generalized version of "Restricted AND-Parallelism" (RAP), a technique which reduces the overhead traditionally associated with the run-time management of variable binding conflicts to a series of simple run-time checks, which select one out of a series of compiled execution graphs.
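The run-time checks at the heart of RAP can be illustrated as variable-sharing tests that decide whether two goals may run in AND-parallel or must fall back to a sequential execution graph. This is a Python illustration of the idea only — not the WAM instruction set, and the term representation (tuples with uppercase-initial strings as variables) is an assumption for the sketch:

```python
def vars_of(term):
    """Collect variable names (uppercase-initial atoms) from a nested term."""
    if isinstance(term, str):
        return {term} if term[:1].isupper() else set()
    return set().union(*(vars_of(t) for t in term)) if term else set()

def independent(goal_a, goal_b):
    """RAP-style check: goals may run in AND-parallel if they share no variables
    (goal independence), since their bindings then cannot conflict."""
    return not (vars_of(goal_a) & vars_of(goal_b))

# p(X, a) and q(Y): no shared variables -> the parallel graph is selected
assert independent(("p", "X", "a"), ("q", "Y"))
# p(X) and q(X, Y): both bind X -> fall back to the sequential graph
assert not independent(("p", "X"), ("q", "X", "Y"))
```

In the real system, groundness and independence information derived at compile time reduces these tests to a few cheap checks rather than a full term traversal.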

Relevance:

100.00%

Publisher:

Abstract:

Machine learning techniques are used for extracting valuable knowledge from data. Nowadays, these techniques are becoming even more important due to the evolution in data acquisition and storage, which is leading to data with different characteristics that must be exploited. Therefore, advances in data collection must be accompanied by advances in machine learning techniques to solve the new challenges that might arise, in both academic and real applications. There are several machine learning techniques depending on both data characteristics and purpose. Unsupervised classification, or clustering, is one of the best-known techniques when data lack supervision (unlabeled data) and the aim is to discover data groups (clusters) according to their similarity. On the other hand, supervised classification needs data with supervision (labeled data) and its aim is to make predictions about the labels of new data. The presence of data labels is a very important characteristic that guides not only the learning task but also other related tasks such as validation. When only some of the available data are labeled whereas the others remain unlabeled (partially labeled data), neither clustering nor supervised classification can be used. This scenario, which is becoming common nowadays because of the ignorance or cost of the labeling process, is tackled with semi-supervised learning techniques. This thesis focuses on the branch of semi-supervised learning closest to clustering, i.e., discovering clusters using the available labels as support to guide and improve the clustering process. Another important data characteristic, different from the presence of data labels, is the relevance or not of data features. Data are characterized by features, but it is possible that not all of them are relevant, or equally relevant, for the learning process.
A recent clustering tendency, related to data relevance and called subspace clustering, claims that different clusters might be described by different feature subsets. This differs from traditional solutions to the data relevance problem, where a single feature subset (usually the complete set of original features) is found and used to perform the clustering process. The proximity of this work to clustering leads to the first goal of this thesis. As commented above, clustering validation is a difficult task due to the absence of data labels. Although there are many indices that can be used to assess the quality of clustering solutions, these validations depend on the clustering algorithms and data characteristics. Hence, in the first goal three known clustering algorithms are used to cluster data with outliers and noise, to critically study how some of the best-known validation indices behave. The main goal of this work, however, is to combine semi-supervised clustering with subspace clustering to obtain clustering solutions that can be correctly validated by using either known indices or expert opinions. Two different algorithms are proposed, from different points of view, to discover clusters characterized by different subspaces. For the first algorithm, the available data labels are first used to search for subspaces, before searching for clusters. This algorithm assigns each instance to only one cluster (hard clustering) and is based on mapping the known labels to subspaces using supervised classification techniques. The subspaces are then used to find clusters using traditional clustering techniques. The second algorithm uses the available data labels to search for subspaces and clusters at the same time in an iterative process. This algorithm assigns each instance to each cluster with a membership probability (soft clustering) and is based on integrating the known labels and the search for subspaces into a model-based clustering approach.
The different proposals are tested using different real and synthetic databases, and comparisons to other methods are also included when appropriate. Finally, as an example of a real and current application, different machine learning techniques, including one of the proposals of this work (the most sophisticated one), are applied to one of the most challenging biological problems nowadays: human brain modeling. Specifically, expert neuroscientists do not agree on a neuron classification for the brain cortex, which makes impossible not only any modeling attempt but also the day-to-day work without a common way to name neurons. Therefore, machine learning techniques may help to reach an accepted solution to this problem, which can be an important milestone for future research in neuroscience.
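The core idea of label-guided clustering described above can be illustrated with a toy seeded assignment: known labels seed one centroid per class, and unlabeled points are attached to the nearest seed. This is a deliberately minimal pure-Python stand-in for the thesis's algorithms, not their implementation (no subspace search, no soft memberships):

```python
def seeded_clusters(points, labeled):
    """Seed one centroid per known label from the labeled points, then
    assign each unlabeled point to the nearest seed (hard clustering)."""
    seeds = {}
    for label, pts in labeled.items():
        n = len(pts)
        # centroid of the labeled examples of this class
        seeds[label] = tuple(sum(p[i] for p in pts) / n for i in range(len(pts[0])))
    def nearest(p):
        return min(seeds, key=lambda c: sum((a - b) ** 2 for a, b in zip(p, seeds[c])))
    return {p: nearest(p) for p in points}

labeled = {"A": [(0.0, 0.0), (1.0, 0.0)], "B": [(9.0, 9.0)]}
unlabeled = [(0.5, 1.0), (8.0, 8.5)]
assignment = seeded_clusters(unlabeled, labeled)
```

The labels do double duty here, exactly as in the thesis: they guide the grouping and simultaneously provide a ground truth against which the resulting clusters can be validated.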

Relevance:

100.00%

Publisher:

Abstract:

This article presents a portable and extensible solution to the problem of strong authentication, using a combination of Java technology and the storage of X.509 digital certificates on Java cards, to access the services offered by an institution, in this case the Technological University of Panama, ensuring authenticity, confidentiality, integrity and non-repudiation.
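The smart-card scheme rests on proving possession of the key matching the stored X.509 certificate via a challenge-response exchange. Since a full PKI flow needs a crypto library, here is a deliberately simplified sketch in which an HMAC shared secret stands in for the card's on-chip signature operation — the secret and flow are illustrative assumptions, not the article's protocol:

```python
import hashlib
import hmac
import os

def card_respond(secret: bytes, challenge: bytes) -> bytes:
    """Card side: 'sign' the server's nonce (HMAC stands in for the
    private-key signature a real Java card would compute on-chip)."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def server_verify(secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the expected response and compare in constant time."""
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = b"card-provisioned-secret"   # hypothetical provisioning value
challenge = os.urandom(16)            # fresh nonce per login defeats replay
ok = server_verify(secret, challenge, card_respond(secret, challenge))
```

In the real X.509 scheme the server verifies the signature with the public key from the card's certificate instead of sharing a secret, which is what yields non-repudiation; HMAC alone cannot provide that property.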

Relevance:

100.00%

Publisher:

Abstract:

CO2 capture and storage (CCS) projects are presently developed to reduce the emission of anthropogenic CO2 into the atmosphere. CCS technologies are expected to account for 20% of the CO2 reduction by 2050. One of the main concerns of CCS is whether CO2 will remain confined within the geological formation into which it is injected, since post-injection CO2 migration on the time scale of years, decades and centuries is not well understood. Theoretically, CO2 can be retained at depth i) as a supercritical fluid (physical trapping), ii) as a fluid slowly migrating in an aquifer due to a long flow path (hydrodynamic trapping), iii) dissolved into ground waters (solubility trapping) and iv) as precipitated secondary carbonates. Carbon dioxide will be injected in the near future (2012) at Hontomín (Burgos, Spain) in the frame of the Compostilla EEPR project, led by the Fundación Ciudad de la Energía (CIUDEN). In order to detect leakage in the operational stage, a pre-injection geochemical baseline is presently being developed. In this work a geochemical monitoring design is presented to provide information about the feasibility of CO2 storage at depth.

Relevance:

100.00%

Publisher:

Abstract:

Connexin-43 (Cx43), a gap junction protein involved in the control of cell proliferation, differentiation and migration, has been suggested to have a role in hematopoiesis. Cx43 is highly expressed in osteoblasts and osteogenic progenitors (OB/P). To elucidate the biologic function of Cx43 in the hematopoietic microenvironment (HM) and its influence on hematopoietic stem cell (HSC) activity, we studied hematopoietic function in an in vivo model of constitutive deficiency of Cx43 in OB/P. The deficiency of Cx43 in OB/P cells does not impair steady-state hematopoiesis, but disrupts the directional trafficking of HSC/progenitors (Ps) between the bone marrow (BM) and peripheral blood (PB). OB/P Cx43 is a crucial positive regulator of transstromal migration and homing of both HSCs and progenitors in an irradiated microenvironment. However, OB/P Cx43 deficiency in nonmyeloablated animals does not result in a homing defect but induces increased endosteal lodging and decreased mobilization of HSC/Ps, associated with proliferation and expansion of Cxcl12-secreting mesenchymal/osteolineage cells in the BM HM in vivo. Cx43 thus controls the cellular content of the BM osteogenic microenvironment and is required for homing of HSC/Ps in myeloablated animals.

Relevance:

100.00%

Publisher:

Abstract:

CO2 capture and storage (CCS) projects are presently developed to reduce the emission of anthropogenic CO2 into the atmosphere. CCS technologies are expected to account for 20% of the CO2 reduction by 2050. Geophysical, ground deformation and geochemical monitoring have been carried out to detect potential leakage and, in the event that this occurs, identify and quantify it. This monitoring needs to be developed prior to, during and after the injection stage. For a correct interpretation and quantification of the leakage, it is essential to establish a pre-injection characterization (baseline) of the area affected by the CO2 storage at reservoir level as well as at shallow depth, surface and atmosphere, via soil gas measurements. The methodological approach is therefore important, because it can affect the spatial and temporal variability of the measured CO2 flux and even jeopardize the total CO2 value obtained for a given area. In this sense, measurements of CO2 flux were made using portable infrared analyzers (i.e., accumulation chambers) adapted to monitoring the geological storage of CO2, and other measurements of trace gases, e.g. radon isotopes, together with remote sensing imagery, were tested in the natural analogue of Campo de Calatrava (Ciudad Real, Spain) with the aim of applying them to CO2 leakage detection. A high correlation between CO2 and radon (r=0.858) was observed, and some vegetation indices were detected that may be successfully applied to leakage detection.
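The accumulation-chamber measurement mentioned above converts the initial rate of CO2 build-up inside the chamber into a surface flux via the ideal-gas law: F = (dx/dt) · (P / RT) · (V / A), where dx/dt is the slope of the CO2 mole fraction. A sketch of this standard conversion — the chamber geometry and slope below are illustrative values, not Campo de Calatrava data:

```python
R = 8.314  # universal gas constant, J mol^-1 K^-1

def chamber_flux(dppm_dt, volume_m3, area_m2, pressure_pa=101325.0, temp_k=298.15):
    """CO2 flux (mol m^-2 s^-1) from the chamber concentration slope.
    dppm_dt: initial slope of the CO2 mole fraction inside the chamber, ppm/s."""
    molar_density = pressure_pa / (R * temp_k)   # mol of air per m^3 (ideal gas)
    return dppm_dt * 1e-6 * molar_density * volume_m3 / area_m2

# Illustrative: 2 ppm/s slope, 5 L chamber over a 0.03 m^2 footprint
flux = chamber_flux(2.0, 0.005, 0.03)   # ~1.4e-5 mol m^-2 s^-1
```

Because the result scales directly with the fitted slope and the chamber's V/A ratio, small methodological differences propagate straight into the mapped flux — the sensitivity the abstract warns can jeopardize the total CO2 estimate for an area.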

Relevance:

100.00%

Publisher:

Abstract:

CO2 capture and storage (CCS) projects are presently developed to reduce the emission of anthropogenic CO2 into the atmosphere. CCS technologies are expected to account for 20% of the CO2 reduction by 2050. The results of this paper refer to the OXYCFB300 Compostilla Project (European Energy Programme for Recovery). Since the detection and control of potential leakage from the storage formation is mandatory in a CO2 capture and geological storage (CCS) project, geophysical, ground deformation and geochemical monitoring have been carried out to detect potential leakage and, in the event that this occurs, identify and quantify it. This monitoring needs to be developed prior to, during and after the injection stage. For a correct interpretation and quantification of the leakage, it is essential to establish a pre-injection characterization (baseline) of the area affected by the CO2 storage at reservoir level as well as at shallow depth, surface and atmosphere, via soil gas measurements.