Abstract:
The effects of adding heated oils to feeds (3%, w/w) and of dietary supplementation with α-tocopheryl acetate (TA; 100 mg/kg) and Zn (200 mg/kg) on rabbit tissue fatty acid (FA) composition and on the Zn, Cu, Fe and Se content of meat were assessed. Heating unrefined sunflower oil (SO) at 55°C for 245 h increased its content of primary oxidation products and reduced its α-tocopherol content; however, this did not significantly affect tissue FA composition. Heating SO at 140°C for 31 h increased its content of secondary oxidation products and of some FA isomers such as c9,t11-CLA and di-trans CLA. This led to increases in di-trans CLA in liver and in t9,c12-18:2 in meat. c9,t11-CLA was the CLA isomer most incorporated into tissues. Dietary supplementation with α-TA did not affect the FA composition of plasma, liver or meat. Cooking vacuum-packed rabbit meat at 78°C for 5 min significantly, but only slightly, reduced its polyunsaturated FA content. Dietary supplementation with Zn did not modify the Zn, Fe or Se content of meat, but it reduced its Cu content. On the other hand, it increased the content of some FAs in meat when SO heated at 140°C for 31 h was added to the feeds.
Abstract:
Regular aerobic exercise training, which is touted as a way to ameliorate metabolic diseases, increases aerobic capacity. Aerobic capacity usually declines with advanced age, and this decline is typically associated with a decrease in the quality of skeletal muscle. At the molecular level, this decreased quality stems in part from perturbations in skeletal muscle mitochondria, in particular a decrease in the fraction of skeletal muscle volume occupied by mitochondria. What is not well established is whether this decrease in mitochondrial content is due to an inactive lifestyle or to the process of aging itself. The work in this thesis shows a clear connection between mitochondrial content and aerobic capacity: active older individuals with higher VO2max levels also have higher volumes of mitochondria in their muscle than sedentary counterparts, who have lower levels of mitochondrial content. When these previously sedentary individuals enter an aerobic exercise intervention, they are able to recover mitochondrial content and function to levels similar to those of lifelong athletes of the same age. Furthermore, the results of this thesis show that mitochondrial content and function also correlate with exercise efficiency: a more efficient individual expends less energy for a similar power output. Individuals who increase their efficiency also increase their ability to oxidize and utilize fat during prolonged exercise. This increased reliance on fat after the intervention is associated with an increased amount of mitochondria, particularly in the intermyofibrillar region of skeletal muscle. Therefore, elderly adults who were once sedentary were able to recover mitochondrial content and function and to reap other health benefits from regular aerobic exercise training. Aging per se does not appear to be the culprit leading to metabolic diseases; rather, the culprit seems to be a lack of physical activity.
Abstract:
The purpose of this research is to explore the variability of soil thermal conductivity (λ) after a prescribed fire, and to assess the effects of the ashes on heat transfer once they were incorporated into the soil matrix. The sampling plot was located in the Montgrí Massif (NE Spain). A set of 42 soil samples between the surface and 5 cm depth was collected before and after the fire. To characterize the soil, chemical and physical variables were analyzed. To determine the variability of soil λ, a dry-out curve was determined for each scenario (before and after the fire). The SoilRho® method, based on ASTM D-5334-08 and validated by LabFerrer, was used. Soil thermal conductivity showed clear changes: in all moisture scenarios the values of soil λ decreased after the soil was burnt. The critical point in the θ(λ) relationship for the soil after the fire was always stronger than for the soil before burning. Soil with "white" ashes showed a high thermal conductivity. An X-ray diffractometry analysis allowed these results to be clarified and verified. In summary, thermal conductivity changes when the scenario changes, i.e. before and after burning. Moreover, the volume of ashes incorporated into the soil increased the differences between unburnt and burnt soil, even showing some improvement in heat transfer once water content started to govern the process.
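For context, needle-probe determinations of λ such as the ASTM D5334 procedure cited above rest on the transient line-source approximation; the relation below is a generic textbook reminder of that principle, not the exact data reduction used in this study. For a probe dissipating heat q per unit length, the late-time temperature rise grows with ln t, so λ follows from the slope of temperature against the logarithm of time:

\[
\lambda \;\approx\; \frac{q\,\ln(t_2/t_1)}{4\pi\,\bigl(T(t_2)-T(t_1)\bigr)}
\]

where T(t_1) and T(t_2) are probe temperatures at two times during the heating phase. Because the slope, and hence λ, depends strongly on volumetric water content θ, a dry-out curve (λ measured at successive moisture levels) is the natural way to compare the burnt and unburnt scenarios.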
BioSuper: A web tool for the superimposition of biomolecules and assemblies with rotational symmetry
Abstract:
Background: Most of the proteins in the Protein Data Bank (PDB) are oligomeric complexes consisting of two or more subunits that associate by rotational or helical symmetries. Despite the myriad of superimposition tools in the literature, we could not find any that accounts for rotational symmetry and displays the graphical results in the web browser. Results: BioSuper is a free web server that superimposes and calculates the root mean square deviation (RMSD) of protein complexes displaying rotational symmetry. To the best of our knowledge, BioSuper is the first tool of its kind that provides immediate interactive visualization of the graphical results in the browser, biomolecule generator capabilities, different levels of atom selection, and sequence-dependent and structure-based superimposition types, and it is the only web tool that takes into account the equivalence of atoms in side chains displaying symmetry ambiguity. BioSuper uses ICM program functionality as a core for the superimpositions and displays the results as text, HTML tables and 3D interactive molecular objects that can be visualized in the browser or on Android and iOS platforms with a free plugin. Conclusions: BioSuper is a fast and functional tool that allows for pairwise superimposition of proteins and assemblies displaying rotational symmetry. The web server was created after our own frustration when attempting to superimpose flexible oligomers. We strongly believe that its user-friendly and functional design will be of great interest to structural and computational biologists who need to superimpose oligomeric proteins (or any protein). The BioSuper web server is freely available to all users at http://ablab.ucsd.edu/BioSuper.
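To make the rotational-symmetry issue concrete, the sketch below shows one common way to handle it; this is an illustration of the general idea under assumed inputs, not BioSuper's or ICM's actual code. For a C_n homo-oligomer, every cyclic relabelling of the mobile subunits is tried with the Kabsch algorithm and the smallest RMSD is kept.

```python
# Minimal sketch (assumed implementation, not BioSuper/ICM code): symmetry-aware
# RMSD for a C_n homo-oligomer.  Each subunit is an (N x 3) array of matched
# atom coordinates; all cyclic relabellings of the mobile subunits are tried
# and the best Kabsch superposition is kept.
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between coordinate sets P and Q (both M x 3) after optimal
    rigid-body superposition of P onto Q."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # remove translations
    V, _, Wt = np.linalg.svd(Pc.T @ Qc)               # SVD of the covariance matrix
    d = np.sign(np.linalg.det(V @ Wt))                # guard against reflections
    R = V @ np.diag([1.0, 1.0, d]) @ Wt               # optimal rotation
    return float(np.sqrt(np.mean(np.sum((Pc @ R - Qc) ** 2, axis=1))))

def symmetric_rmsd(ref_chains, mob_chains):
    """Minimum RMSD over all cyclic permutations of the mobile subunits.
    ref_chains / mob_chains: lists of (N x 3) arrays in ring order, with the
    same number of equivalent atoms per subunit."""
    n = len(ref_chains)
    ref = np.vstack(ref_chains)
    rmsds = []
    for shift in range(n):                            # one pass per rotational relabelling
        mob = np.vstack([mob_chains[(i + shift) % n] for i in range(n)])
        rmsds.append(kabsch_rmsd(mob, ref))
    return min(rmsds)
```

The same minimum-over-equivalents idea extends to symmetry-ambiguous side chains, where chemically equivalent atom pairs (for example the two carboxylate oxygens of Asp) are swapped before the RMSD is evaluated.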
Abstract:
With the aim of monitoring the dynamics of the Livingston Island ice cap, the Departament de Geodinàmica i Geofísica of the Universitat de Barcelona began yearly surveys on Johnsons Glacier in the austral summer of 1994-95. During this field campaign, 10 shallow ice cores were sampled with a manual vertical ice-core drilling machine. The objectives were: i) to detect the tephra layer accumulated on the glacier surface, attributed to the 1970 Deception Island pyroclastic eruption and today interstratified; ii) to verify whether this layer might serve as a reference level; iii) to measure the 137Cs radio-isotope concentration accumulated in the 1965 snow stratum; iv) to use this isochrone layer as a means of verifying the age of the 1970 tephra layer; and v) to calculate both the equilibrium line of the glacier and the average mass balance over the last 28 years (1965-1993). The stratigraphy of the cores, their cumulative density curves and the isothermal ice temperatures recorded confirm that Johnsons Glacier is a temperate glacier. Wind, solar radiation heating and liquid water are the main agents controlling the vertical and horizontal redistribution of the volcanic and cryoclastic particles that are sedimented and remain interstratified within the glacier. Because of this redistribution, the 1970 tephra layer does not always serve as a very good reference level. The position of the equilibrium line altitude (ELA) in 1993, obtained by the 137Cs spectrometric analysis, varies from about 200 m a.s.l. to 250 m a.s.l. This indicates a rising trend in the equilibrium line altitude from the beginning of the 1970s to the present day. The varying slope orientation of Johnsons Glacier relative to the prevailing NE wind gives rise to large local differences in snow accumulation, which locally modify the equilibrium line altitude. In the cores studied, 137Cs appears to be associated with the 1970 tephra layer. This indicates an intense ablation episode throughout the sampled area (at least up to 330 m a.s.l.), which probably occurred synchronously with the 1970 tephra deposition or later. A rough estimate of the specific mass balance reveals a considerable accumulation gradient, with accumulation increasing with altitude.
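As a rough reminder of how such an estimate is usually obtained from a dated reference horizon (the abstract does not detail the authors' exact procedure, so this is a generic textbook relation), the average specific mass balance over the period since the dated layer is the water equivalent of everything accumulated above it divided by the elapsed time:

\[
\bar{b} \;=\; \frac{1}{\Delta t}\,\frac{\sum_i \rho_i\, h_i}{\rho_w}
\]

where h_i and ρ_i are the thickness and density of each core layer above the 1965 (137Cs) horizon, ρ_w is the density of water, and Δt = 28 years, giving a balance in metres of water equivalent per year at each core site.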
Abstract:
In this study we used market settlement prices of European call options on stock index futures to extract the implied probability density function (PDF). The method used produces a PDF of returns of the underlying asset at the expiration date from the implied volatility smile. With this method, the assumption of lognormal distribution (Black-Scholes model) is tested. The market's view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach for implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A deviant subjective view of the PDF can be used to form a strategy, as discussed in the last section.
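For reference, the Breeden and Litzenberger (1978) result underlying this extraction recovers the risk-neutral density from the curvature of the call price with respect to the strike; it is stated here in its textbook form, and the paper's exact discounting conventions may differ:

\[
f_T(K) \;=\; e^{rT}\,\frac{\partial^2 C(K,T)}{\partial K^2}
\]

where C(K,T) is the call price as a function of strike K for maturity T and r is the risk-free rate. In Shimko's smoothing approach the implied volatility smile σ(K) is first interpolated, converted back into call prices, and only then differentiated twice to obtain the density.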
Abstract:
The purpose of this study was to increase understanding of the role and nature of trust in asymmetric technology partnership formation. In the knowledge-based "learning race", knowledge is considered a primary source of competitive advantage. In the emerging ICT sector, the high pace of technological change, the convergence of technologies and industries, and the increasing complexity and uncertainty have forced even the largest players to seek cooperation for complementary knowledge and capabilities. Small technology firms need the complementary resources and legitimacy of large firms to grow and compete in the global marketplace. Most of the earlier research indicates, however, that partnerships with asymmetric size, managerial resources and cultures have failed. A basic assumption supported by earlier research was that trust is a critical factor in asymmetric technology partnership formation. Asymmetric technology partnership formation is a dynamic and multi-dimensional process, and consequently a holistic research approach was selected. The research issue was approached from different levels: the individual decision-maker, the firm and the relationship between the parties. The impact of the dynamic environment and of the technology content was also analyzed. A multitheoretical approach and a qualitative research method, with in-depth interviews in five large ICT companies and eight small ICT companies, enabled a holistic and rich view of the research issue. The study contributes to the scarce understanding of the nature and evolution of trust in asymmetric technology partnership formation. It also sheds light on the specific nature of asymmetric technology partnerships. The partnerships were found to be tentative, and the diverse strategic intent of small and large technology firms appeared as a major challenge. The role of the boundary spanner was highlighted as a possibility to bridge the incompatible organizational cultures. A shared vision was found to be a precondition for individual-based fast trust leading to intuitive decision-making and experimentation. The relationships were tentative and were continuously re-evaluated through the key actors' sensemaking of the technology content, the asymmetry and the dynamic environment. A multi-dimensional conceptualization of trust was created, and propositions on the role and nature of trust are given for further research.
Abstract:
Road safety has become an increasing concern in developed countries due to the significant number of fatalities and the associated economic losses. In 2005 alone these losses rose to 200,000 million euros, a considerable sum (approximately 2% of GDP) that easily justifies public intervention. One measure taken by governments to address this issue is to enact stricter policies and regulations. Since drunk driving is one of the greatest concerns among public authorities in this field, several European countries have lowered their legal Blood Alcohol Content (BAC) limit to 0.5 mg/ml during the last decade. This study is the first evaluation of the effectiveness of this transition using European panel data (CARE) for the period 1991-2003, applying the differences-in-differences method in a fixed-effects estimation that allows for any pattern of correlation (cluster-robust). The results reveal a positive impact on certain groups of road users, and on the whole population when the policy is accompanied by enforcement interventions. Moreover, positive results appeared after a time lag of over two years. Finally, I stress the importance of controlling for serial correlation in the evaluation of this type of policy.
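To fix ideas, a generic specification of the kind described (a schematic illustration, not necessarily the paper's exact model) regresses the fatality outcome on country and year fixed effects plus a treatment-times-post interaction, with standard errors clustered by country:

\[
y_{it} \;=\; \alpha_i + \gamma_t + \beta\,(\mathrm{Treated}_i \times \mathrm{Post}_t) + X_{it}'\delta + \varepsilon_{it}
\]

where y_{it} is a fatality outcome for country i in year t, Treated_i marks countries that lowered the BAC limit to 0.5 mg/ml, Post_t switches on after adoption, X_{it} are controls, and β is the differences-in-differences effect. Clustering at the country level is what guards the inference against the serial correlation stressed in the last sentence.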
Abstract:
Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web carried out so far are predominantly based on studies of deep web sites in English. One can therefore expect that findings from these surveys may be biased, especially owing to a steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from this national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches assume that the search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and not feasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. Thus, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
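To illustrate the kind of automation the last part describes, the sketch below submits query terms through a search form and harvests links from the dynamic result page. It is a minimal illustration only, not the thesis prototype or its form query language; the URL and the field name "q" are hypothetical placeholders.

```python
# Illustrative sketch only (not the thesis prototype or its query language).
# The URL and the form field name "q" are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

SEARCH_URL = "http://example.org/db/search"   # assumed search interface endpoint

def query_web_database(term):
    """Submit a query term through the search form and collect links
    from the dynamic result page."""
    response = requests.get(SEARCH_URL, params={"q": term}, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Result pages embed the database records; here we simply harvest the links.
    return [a["href"] for a in soup.select("a[href]")]

if __name__ == "__main__":
    for link in query_web_database("deep web"):
        print(link)
```

A full system would additionally parse the form itself to discover field labels, handle POST submissions and JavaScript-driven forms, and extract structured records rather than bare links, which is the scope of the data model and query language discussed above.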
Abstract:
Problem: Public-private partnerships (PPPs) in transport infrastructure characteristically increase user fees. Purpose: We aim to identify the network effects of the use of PPPs and increased user tolls in road infrastructure. Methods: We study the increases in user tolls on motorways due to the use of PPPs in the US. Results and conclusions: Among other things, the monetization of motorways is associated with an increase in toll levels that has consequences for their users, and also for the rest of the sections of the network.
Abstract:
Network neutrality is a growing policy controversy. Traffic management techniques affect not only high-speed, high-money content, but by extension all other content too. Internet regulators and users may tolerate much more discrimination in the interests of innovation. For instance, in the absence of regulatory oversight, ISPs could use Deep Packet Inspection (DPI) to block some content altogether, if they decide it is not to the benefit of ISPs, copyright holders, parents or the government. ISP blocking is currently widespread in controlling spam email and, in some countries, in blocking sexually graphic illegal images. In 1999 this led to scrutiny of the foreclosure of Instant Messaging and of a video and cable-telephony horizontal merger. Fourteen years later, in 2013, net neutrality laws had been implemented in Slovenia, the Netherlands, Chile and Finland, regulation in the United States and Canada, co-regulation in Norway, and self-regulation in Japan, the United Kingdom and many other European countries. Both Germany and France debated new net neutrality legislation in mid-2013, and the European Commission announced on 11 September 2013 that it would aim to introduce legislation in early 2014. This paper analyses these legal developments, and in particular the difficulty of assessing reasonable traffic management and ‘specialized’ (i.e. unregulated) faster services in both EU and US law. It also assesses net neutrality law against the international legal norms for user privacy and freedom of expression.
Abstract:
There is increasing evidence regarding the role of chromosomal inversions in relevant biological processes such as local adaptation and speciation. A classic example of the adaptive role of chromosomal polymorphisms is given by the clines of inversion frequencies in Drosophila subobscura, repeatable across continents. Nevertheless, not much is known about the molecular variation associated with these polymorphisms. We characterized the genetic content of ca. 600 individuals from nine European populations following a latitudinal gradient by analysing 19 microsatellite loci from two autosomes (J and U) and the sex chromosome (A), taking into account their chromosomal inversions. Our results clearly demonstrate the molecular genetic uniformity within a given chromosomal inversion across a large latitudinal gradient, particularly from Groningen (Netherlands) in the north to Málaga (Spain) in the south, despite highly diverse environmental conditions. This low genetic differentiation within the same gene arrangement across the nine European populations is consistent with the local adaptation hypothesis for the evolution of chromosomal polymorphisms. We also show the effective role of chromosomal inversions in maintaining different genetic pools within these inverted genomic regions, even in the presence of high gene flow. Inversions thus represent an important barrier to gene flow and can help maintain specific allelic combinations with positive effects on fitness. Consistent patterns of microsatellite allele-inversion linkage disequilibrium, particularly for loci within inversions, were also observed. Finally, we identified areas within inversions presenting clinal variation that might be under selection.
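For readers unfamiliar with the statistic, allele-inversion linkage disequilibrium of the kind reported here is conventionally quantified as the deviation of the joint frequency from the product of the marginal frequencies; the expressions below are the standard textbook forms, not the paper's specific estimator:

\[
D \;=\; p_{A\,\mathrm{inv}} - p_A\,p_{\mathrm{inv}},
\qquad
r^2 \;=\; \frac{D^2}{p_A(1-p_A)\,p_{\mathrm{inv}}(1-p_{\mathrm{inv}})}
\]

where p_A is the frequency of a given microsatellite allele, p_inv the frequency of the inverted arrangement, and p_{A inv} their joint frequency on the same chromosome; D significantly different from zero indicates that the allele is non-randomly associated with the arrangement.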
Abstract:
OBJECTIVE: Since 2011, the new national final examination in human medicine has been implemented in Switzerland, with a structured clinical-practical part in the OSCE format. From the perspective of the national Working Group, the current article describes the essential steps in the development, implementation and evaluation of the Federal Licensing Examination Clinical Skills (FLE CS), as well as the quality assurance measures applied. Finally, central insights gained over the last years are presented. METHODS: Based on the principles of action research, the FLE CS is in a constant state of further development. Building on systematically documented experiences from previous years, the Working Group discusses unresolved questions and substantiates the resulting solution approaches (planning), which are then implemented in the examination (implementation) and subsequently evaluated (reflection). The results presented here are the product of this iterative procedure. RESULTS: The FLE CS is created by experts from all faculties and subject areas in a multistage process. The examination is administered in German and French on a decentralised basis and consists of twelve interdisciplinary stations per candidate. The national Review Board (content validation) and the meetings of the standardised patient trainers (standardisation) have proven worthwhile as important quality assurance measures. The statistical analyses show good measurement reliability and support the construct validity of the examination. Among the central insights of the past years is that the consistent implementation of the principles of action research contributes to the successful further development of the examination. CONCLUSION: The centrally coordinated, collaborative-iterative process, incorporating experts from all faculties, makes a fundamental contribution to the quality of the FLE CS. The processes and insights presented here can be useful to others planning a similar undertaking.