799 results for Conceptualizing and Measuring


Relevance:

90.00%

Publisher:

Abstract:

The purpose of this study was to evaluate the effect of Rosmarinus officinalis (rosemary) and Thymus vulgaris (thyme) extracts on the stability of polyunsaturated fatty acids in frozen silver carp mince. The treatments were: Treatment 1 (control), frozen mince in conventional packaging; Treatment 2, frozen silver carp mince + thyme extract (300 mg/kg) in conventional packaging; Treatment 3, frozen silver carp mince + rosemary extract (200 mg/kg) in conventional packaging; Treatment 4, frozen silver carp mince + rosemary (100 mg/kg) and thyme (100 mg/kg) extracts in conventional packaging. After rapid freezing in a spiral freezer by the individual quick freezing method, the samples were transferred to cold storage at -18 °C. Sampling and measurement of the fatty acid profile were carried out at time zero, then every ten days during the first month, every 15 days during the second month, and monthly from the third month onward. Fatty acid profiles were identified, defined and measured by gas chromatography. In the three experimental treatments and the control, the following saturated and unsaturated fatty acids were identified: (A) saturated fatty acids: myristic (C14:0), palmitic (C16:0), heptadecanoic (C17:0), stearic (C18:0) and arachidic (C20:0); (B) monounsaturated fatty acids: palmitoleic (C16:1 ω7), oleic (C18:1 ω9) and gadoleic (C20:1 ω9); (C) polyunsaturated fatty acids: linoleic (C18:2 ω6) and α-linolenic (C18:3 ω3); (D) highly unsaturated fatty acids: arachidonic (C20:4 ω6), eicosapentaenoic acid (EPA, C20:5 ω3) and docosahexaenoic acid (DHA, C22:6 ω3). The results show that silver carp mince containing thyme and rosemary extracts, stored under freezing conditions, retained the stability of the different classes of fatty acids: monounsaturated, polyunsaturated, and omega-3 and omega-6 fatty acids. None of the measured fatty acids showed a significant increase or decrease, and oxidative changes in the fatty acids during storage were minimized.
The fatty acid profiles, their derived indices and the statistical tests show that the treatment with rosemary extract was more stable during storage at -18 °C than the control and the other treatments; it showed relatively low levels of oleic and linoleic acid and more palmitic acid than the other treatments and the control samples. The silver carp mince containing rosemary extract was still usable at the end of the six-month storage period, indicating that rosemary extract can extend the shelf life of such samples beyond six months.

Relevance:

90.00%

Publisher:

Abstract:

The cleavage of adenosine-5'-monophosphate (5'-AMP) and guanosine-5'-monophosphate (5'-GMP) by Ce4+ and by lanthanide complexes of 2-carboxyethylgermanium sesquioxide (Ge-132) in acidic and near-neutral conditions was investigated by NMR, HPLC and measurement of the liberated inorganic phosphate at 37 °C and 50 °C. The results showed that, with Ce4+, 5'-GMP was converted to guanine (G), ribose-5'-monophosphate (depurination of 5'-GMP), ribose (depurination and dephosphorylation of 5'-GMP) and phosphate, while 5'-AMP was converted to adenine (A), ribose-5'-monophosphate (depurination of 5'-AMP), ribose (depurination and dephosphorylation of 5'-AMP) and phosphate. In the presence of the lanthanide complexes, 5'-GMP and 5'-AMP were converted to guanosine (Guo) plus phosphate and adenosine (Ado) plus phosphate, respectively. The mechanism of cleavage of 5'-GMP and 5'-AMP is hydrolytic scission.

Relevance:

90.00%

Publisher:

Abstract:

Accurate measurement of network bandwidth is crucial for flexible Internet applications and protocols which actively manage and dynamically adapt to changing utilization of network resources. These applications must do so to perform tasks such as distributing and delivering high-bandwidth media, scheduling service requests and performing admission control. Extensive work has focused on two approaches to measuring bandwidth: measuring it hop-by-hop, and measuring it end-to-end along a path. Unfortunately, best-practice techniques for the former are inefficient and techniques for the latter are only able to observe bottlenecks visible at end-to-end scope. In this paper, we develop and simulate end-to-end probing methods which can measure bottleneck bandwidth along arbitrary, targeted subpaths of a path in the network, including subpaths shared by a set of flows. As another important contribution, we describe a number of practical applications which we foresee as standing to benefit from solutions to this problem, especially in emerging, flexible network architectures such as overlay networks, ad-hoc networks, peer-to-peer architectures and massively accessed content servers.
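The packet-pair principle that many end-to-end probing techniques build on can be sketched in a few lines. This is an illustrative reconstruction under standard assumptions, not the subpath-probing method developed in the paper:

```python
# Illustrative sketch of the classic packet-pair principle (an assumed
# baseline, not this paper's subpath method): two packets sent
# back-to-back leave the bottleneck link spaced by packet_size / capacity,
# so the capacity can be recovered from the observed inter-arrival gap.
def bottleneck_bandwidth(packet_size_bytes: int, arrival_gap_s: float) -> float:
    """Estimate bottleneck capacity in bits per second from the
    dispersion (inter-arrival gap) of a back-to-back packet pair."""
    return packet_size_bytes * 8 / arrival_gap_s

# In practice many pairs are sent and the resulting estimates are
# filtered (e.g. by taking the mode) to suppress cross-traffic noise.
```

For example, 1500-byte packets arriving 0.12 ms apart imply a bottleneck of roughly 100 Mbit/s.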

Relevance:

90.00%

Publisher:

Abstract:

We examined the coherence of trauma memories in a trauma-exposed community sample of 30 adults with and 30 without posttraumatic stress disorder. The groups had similar categories of traumas and were matched on multiple factors that could affect the coherence of memories. We compared the transcribed oral trauma memories of participants with their most important and most positive memories. A comprehensive set of 28 measures of coherence, including 3 ratings by the participants, 7 ratings by outside raters, and 18 computer-scored measures, provided a variety of approaches to defining and measuring coherence. A multivariate analysis of variance indicated differences in coherence among the trauma, important, and positive memories, but not between the diagnostic groups or for their interaction with these memory types. Most differences were small in magnitude; in some cases, the trauma memories were more, rather than less, coherent than the control memories. Where differences existed, the results agreed with the existing literature, suggesting that factors other than the incoherence of trauma memories are most likely to be central to the maintenance of posttraumatic stress disorder and thus to its treatment.

Relevance:

90.00%

Publisher:

Abstract:

Coastal systems, such as rocky shores, are among the most heavily anthropogenically impacted marine ecosystems and also among the most productive in terms of ecosystem functioning. One of the greatest impacts on coastal ecosystems is nutrient enrichment from human activities such as agricultural run-off and discharge of sewage. The aim of this study was to identify and characterise potential effects of sewage discharges on the biotic diversity of rocky shores and to test current tools for assessing the ecological status of rocky shores in line with the EU Water Framework Directive (WFD). A sampling strategy was designed to test for effects of sewage outfalls on rocky shore assemblages on the east coast of Ireland and to identify the scale of the putative impact. In addition, a separate sampling programme based on the Reduced algal Species List (RSL), the current WFD monitoring tool for rocky shores in Ireland and the UK, was completed by identifying algae and measuring percent cover in replicate samples on rocky shores during summer. There was no detectable effect of sewage outfalls on benthic taxon diversity or assemblage structure. However, spatial variability of assemblages was greater at sites proximal or adjacent to sewage outfalls than at shores without sewage outfalls. Results based on the RSL show that algal assemblages were not affected by the presence of sewage outfalls, except when algae were classed into functional groups, in which case variability was greater at the sites with sewage outfalls. A key finding of both surveys was the prevalence of spatial and temporal variation of assemblages. It is recommended that future metrics of ecological status be based on quantified sampling designs, incorporate changes in variability of assemblages (indicative of community stability), consider shifts in assemblage structure and include both benthic fauna and flora to assess the status of rocky shores.

Relevance:

90.00%

Publisher:

Abstract:

To offer insight into how cognitive–behavioural therapy (CBT) competence is defined, measured and evaluated and to highlight ways in which the assessment of CBT competence could be further improved, the current study utilizes a qualitative methodology to examine CBT experts’ (N = 19) experiences of conceptualizing and assessing the competence of CBT therapists. Semi-structured interviews were used to explore participants’ experiences of assessing the competence of CBT therapists. Interview transcripts were then analysed using interpretative phenomenological analysis in order to identify commonalities and differences in the way CBT competence is evaluated. Four superordinate themes were identified: (i) what to assess, the complex and fuzzy concept of CBT competence; (ii) how to assess CBT competence, selecting from the toolbox of assessment methods; (iii) who is best placed to assess CBT competence, expertise and independence; and (iv) pitfalls, identifying and overcoming assessment biases. Priorities for future research and ways in which the assessment of CBT competence could be further improved are discussed in light of these findings.

Relevance:

90.00%

Publisher:

Abstract:

Transport is an essential sector in modern societies: it connects economic sectors and industries. Alongside its contribution to economic development and social interconnection, it also causes adverse environmental impacts and health hazards. Transport is a major source of ground-level air pollution, especially in urban areas, and thereby contributes to health problems such as cardiovascular and respiratory diseases, cancer, and physical injuries. This thesis presents the results of a health risk assessment that quantifies the mortality and disease associated with particulate matter pollution from urban road transport in Hai Phong City, Vietnam. The focus is on integrating modelling and GIS approaches in the exposure analysis to increase the accuracy of the assessment and to produce timely and consistent results. Modelling was used to estimate traffic conditions and particulate matter concentrations from geo-referenced data. A simplified health risk assessment was also carried out for Ha Noi based on monitoring data, allowing a comparison of results between the two cases. The case studies show that health risk assessment based on modelled data can provide much more detailed results and allows the health impacts of different mobility development options to be assessed at the micro level. Using modelling and GIS as a common platform for integrating different assessments (environmental, health, socio-economic, etc.) offers several strengths, especially in capitalising on available data stored in different units and formats, and allows large amounts of data to be handled. From a decision-making point of view, the use of models and GIS in a health risk assessment can reduce processing and waiting time while providing views at different scales, from the micro scale (sections of a city) to the macro scale.
It also helps to visualise the links between air quality and health outcomes, which is useful when discussing different development options. However, a number of improvements could further advance the integration. An improved data-integration programme would facilitate the application of integrated models in policy-making. Data from mobility surveys and from environmental monitoring and measurement must be standardised and given legal status. Various traffic models, together with emission and dispersion models, should be tested, and more attention should be given to their uncertainty and sensitivity.

Relevance:

90.00%

Publisher:

Abstract:

Organising business activities as projects is very common today. However, a large share of projects, particularly in the IT sector, fail to achieve their objectives. Project success is typically measured in terms of budget, schedule, quality and stakeholder satisfaction. The purpose of this Master's thesis is to identify the most typical causes of project failure and, by means of project monitoring and measurement, to find ways of preventing these failures. The research method is a qualitative case study. The empirical data were collected through interviews, analysis of various materials, and observation. The theoretical part provides a comprehensive summary, based on the earlier literature, of the management of project-based business and of individual projects, as well as of project monitoring and measurement. In the empirical part, the case company's project monitoring and selected projects are analysed. On the basis of the analyses, interviews and observation, conclusions are drawn about the factors that most typically cause problems in projects and about the phases of the project life cycle in which they occur. Possible means of preventing these problems are also presented. Finally, suggestions for development are put forward on the basis of the theory and the empirical findings.

Relevance:

90.00%

Publisher:

Abstract:

With advances in information technology, economic and financial time series are increasingly available. However, if standard time-series techniques are used, this wealth of information comes with the problem of dimensionality. Since most series of interest are highly correlated, their dimension can be reduced using factor analysis, a technique that has grown increasingly popular in economics since the 1990s. Given the availability of data and computational advances, several new questions arise. What are the effects and the transmission of structural shocks in a data-rich environment? Can the information contained in a large set of economic indicators help identify monetary policy shocks better, given the problems encountered in applications using standard models? Can financial shocks be identified and their effects on the real economy measured? Can the existing factor method be improved by incorporating another dimension-reduction technique such as VARMA analysis? Does this produce better forecasts of the major macroeconomic aggregates and help with impulse response analysis? Finally, can factor analysis be applied to random parameters? For example, is there only a small number of sources of the time instability of coefficients in empirical macroeconomic models? Using structural factor analysis and VARMA modelling, my thesis answers these questions in five articles. The first two chapters study the effects of monetary and financial shocks in a data-rich environment. The third article proposes a new method combining factor models and VARMA.
This approach is applied in the fourth article to measure the effects of credit shocks in Canada. The contribution of the last chapter is to impose a factor structure on time-varying parameters and to show that there is a small number of sources of this instability. The first article analyses the transmission of monetary policy in Canada using a factor-augmented vector autoregressive (FAVAR) model. Previous VAR-based studies found several empirical anomalies following a monetary policy shock. We estimate the FAVAR model using a large number of monthly and quarterly macroeconomic series. We find that the information contained in the factors is important for correctly identifying the transmission of monetary policy and helps correct the standard empirical anomalies. Finally, the FAVAR framework yields impulse response functions for every indicator in the data set, producing the most comprehensive analysis to date of the effects of monetary policy in Canada. Motivated by the last economic crisis, research on the role of the financial sector has regained importance. In the second article we examine the effects and propagation of credit shocks on the real economy using a large set of economic and financial indicators within a structural factor model. We find that a credit shock immediately raises credit spreads, lowers the value of Treasury bills and causes a recession. These shocks have a significant effect on measures of real activity, price indices, and leading and financial indicators. Unlike other studies, our identification procedure for the structural shock does not require timing restrictions between financial and macroeconomic factors.
Moreover, it gives an interpretation of the factors without restricting their estimation. In the third article we study the relationship between VARMA and factor representations of vector stochastic processes, and propose a new class of factor-augmented VARMA (FAVARMA) models. Our starting point is the observation that, in general, the multivariate series and the associated factors cannot simultaneously follow a finite-order VAR process. We show that the dynamic process of the factors, extracted as a linear combination of the observed variables, is in general a VARMA and not a VAR, as is assumed elsewhere in the literature. Second, we show that even if the factors follow a finite-order VAR, this implies a VARMA representation for the observed series. We therefore propose the FAVARMA framework, which combines these two methods of reducing the number of parameters. The model is applied in two forecasting exercises using the US and Canadian data of Boivin, Giannoni and Stevanovic (2010, 2009), respectively. The results show that the VARMA part helps forecast the major macroeconomic aggregates better relative to standard models. Finally, we estimate the effects of a monetary shock using the data and identification scheme of Bernanke, Boivin and Eliasz (2005). Our FAVARMA(2,1) model with six factors yields consistent and precise results on the effects and transmission of monetary policy in the United States. Unlike the FAVAR model employed in the earlier study, where 510 VAR coefficients had to be estimated, we produce similar results with only 84 parameters of the factor dynamic process. The objective of the fourth article is to identify and measure the effects of credit shocks in Canada in a data-rich environment using the structural FAVARMA model.
Within the theoretical framework of the financial accelerator developed by Bernanke, Gertler and Gilchrist (1999), we approximate the external finance premium by credit spreads. On the one hand, we find that an unanticipated increase in the US external finance premium generates a significant and persistent recession in Canada, accompanied by an immediate rise in Canadian credit spreads and interest rates. The common component seems to capture the important dimensions of the cyclical fluctuations of the Canadian economy. Variance-decomposition analysis reveals that this credit shock has a significant effect on various sectors of real activity, price indices, leading indicators and credit spreads. On the other hand, an unexpected increase in the Canadian external finance premium has no significant effect in Canada. We show that the effects of credit shocks in Canada are essentially driven by global conditions, approximated here by the US market. Finally, given the identification procedure for the structural shocks, we find economically interpretable factors. The behaviour of economic agents and of the economic environment can vary over time (e.g. changes in monetary policy strategy, shock volatility), inducing parameter instability in reduced-form models. Standard time-varying-parameter (TVP) models traditionally assume independent stochastic processes for all TVPs. In this article we show that the number of sources of time variability in the coefficients is probably very small, and we produce the first known empirical evidence of this in empirical macroeconomic models. The Factor-TVP approach proposed in Stevanovic (2010) is applied within a standard VAR model with random coefficients (TVP-VAR).
We find that a single factor explains most of the variability of the VAR coefficients, while the shock volatility parameters vary independently. The common factor is positively correlated with the unemployment rate. The same analysis is carried out with data including the recent financial crisis. The procedure now suggests two factors, and the behaviour of the coefficients shows an important change since 2007. Finally, the method is applied to a TVP-FAVAR model. We find that only five dynamic factors govern the time instability in almost 700 coefficients.
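The factor extraction step common to the FAVAR-style analyses described above is typically implemented by principal components. A minimal sketch of that standard setup (an assumed illustration, not the thesis code):

```python
import numpy as np

# Minimal sketch (assumed standard setup, not the thesis code): extract
# static factors from a large panel X (T observations x N series) by
# principal components, the usual first step of FAVAR-type analyses.
def extract_factors(X, n_factors):
    """Return (factors, loadings): the first n_factors principal-component
    factors of the standardized panel and the associated loadings.

    Factors are the leading left singular vectors scaled by sqrt(T),
    the common PCA normalization in the factor-model literature.
    """
    X = np.asarray(X, dtype=float)
    T, N = X.shape
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each series
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)  # SVD of the panel
    F = np.sqrt(T) * U[:, :n_factors]                  # estimated factors (T x r)
    Lam = Vt[:n_factors].T * s[:n_factors] / np.sqrt(T)  # loadings (N x r)
    return F, Lam
```

The estimated factors F can then be stacked with observed policy variables and modelled jointly as a VAR or, as the third article argues, a VARMA.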

Relevance:

90.00%

Publisher:

Abstract:

Preparation of an appropriate optical-fiber preform is vital for the fabrication of graded-index polymer optical fibers (GIPOF), which are considered to be a good choice for providing inexpensive high bandwidth data links, for local area networks and telecommunication applications. Recent development of the interfacial gel polymerization technique has caused a dramatic reduction in the total attenuation in GIPOF, and this is one of the potential methods to prepare fiber preforms for the fabrication of dye-doped polymer-fiber amplifiers. In this paper, the preparation of a dye-doped graded-index poly(methyl methacrylate) (PMMA) rod by the interfacial gel polymerization method using a PMMA tube is reported. An organic compound of high-refractive index, viz., diphenyl phthalate (DPP), was used to obtain a graded-index distribution, and Rhodamine B (Rh B), was used to dope the PMMA rod. The refractive index profile of the rod was measured using an interferometric technique and the index exponent was estimated. The single pass gain of the rod was measured at a pump wavelength of 532 nm. The extent of doping of the Rh B in the preform was studied by axially exciting a thin slice of the rod with white light and measuring the spatial variation of the fluorescence intensity across the sample.

Relevance:

90.00%

Publisher:

Abstract:

We present our recent achievements in the growing and optical characterization of KYb(WO4)2 (hereafter KYbW) crystals and demonstrate laser operation in this stoichiometric material. Single crystals of KYbW with optimal crystalline quality have been grown by the top-seeded-solution growth slow-cooling method. The optical anisotropy of this monoclinic crystal has been characterized, locating the tensor of the optical indicatrix and measuring the dispersion of the principal values of the refractive indices as well as the thermo-optic coefficients. Sellmeier equations have been constructed valid in the visible and near-IR spectral range. Raman scattering has been used to determine the phonon energies of KYbW and a simple physical model is applied for classification of the lattice vibration modes. Spectroscopic studies (absorption and emission measurements at room and low temperature) have been carried out in the spectral region near 1 µm characteristic for the ytterbium transition. Energy positions of the Stark sublevels of the ground and the excited state manifolds have been determined and the vibronic substructure has been identified. The intrinsic lifetime of the upper laser level has been measured taking care to suppress the effect of reabsorption and the intrinsic quantum efficiency has been estimated. Lasing has been demonstrated near 1074 nm with 41% slope efficiency at room temperature using a 0.5 mm thin plate of KYbW. This laser material holds great promise for diode pumped high-power lasers, thin disk and waveguide designs as well as for ultrashort (ps/fs) pulse laser systems.

Relevance:

90.00%

Publisher:

Abstract:

This paper studies the effect of strengthening democracy, as captured by an increase in voting rights, on the incidence of violent civil conflict in nineteenth-century Colombia. Empirically studying the relationship between democracy and conflict is challenging, not only because of conceptual problems in defining and measuring democracy, but also because political institutions and violence are jointly determined. We take advantage of an experiment of history to examine the impact of one simple, measurable dimension of democracy (the size of the franchise) on conflict, while at the same time attempting to overcome the identification problem. In 1853, Colombia established universal male suffrage. Using a simple difference-in-differences specification at the municipal level, we find that municipalities where more voters were enfranchised relative to their population experienced fewer violent political battles while the reform was in effect. The results are robust to including a number of additional controls. Moreover, we investigate the potential mechanisms driving the results. In particular, we look at which components of the proportion of new voters in 1853 explain the results, and we examine whether results are stronger in places with more political competition and state capacity. We interpret our findings as suggesting that violence in nineteenth-century Colombia was a technology for political elites to compete for the rents from power, and that democracy constituted an alternative way to compete that substituted for violence.
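In its simplest two-group, two-period form, the difference-in-differences logic described above reduces to a single contrast. The following is an illustrative sketch, not the paper's municipal-level specification with controls:

```python
# Illustrative two-group, two-period difference-in-differences estimator
# (a simplification of the municipal-level specification in the paper):
# the treatment effect is the change in the treated group's mean outcome
# minus the change in the control group's mean outcome.
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Return the DiD estimate from four lists of outcomes."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))
```

For instance, if battles per municipality fall from 2 to 1 where many voters were enfranchised while staying at 2 elsewhere, the estimated effect of the franchise extension is -1.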

Relevance:

90.00%

Publisher:

Abstract:

With both climate change and air quality on political and social agendas from the local to the global scale, the links between these hitherto separate fields are becoming more apparent. Black carbon, largely from combustion processes, scatters and absorbs incoming solar radiation, contributes to poor air quality and induces respiratory and cardiovascular problems. Uncertainties in the amount, location, size and shape of atmospheric black carbon cause large uncertainty in both climate change estimates and toxicology studies alike. Increased research has led to new effects and areas of uncertainty being uncovered. Here we draw together recent results and explore the increasing opportunities for synergistic research that will lead to improved confidence in the impact of black carbon on climate change, air quality and human health. Topics of mutual interest include better information on spatial distribution, size and mixing state, and on measurement and monitoring. (c) 2006 Elsevier Ltd. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

Various methods of assessment have been applied to the One Dimensional Time to Explosion (ODTX) apparatus and experiments with the aim of allowing an estimate of the comparative violence of the explosion event to be made. Non-mechanical methods used were a simple visual inspection, measuring the increase in the void volume of the anvils following an explosion and measuring the velocity of the sound produced by the explosion over 1 metre. Mechanical methods used included monitoring piezo-electric devices inserted in the frame of the machine and measuring the rotational velocity of a rotating bar placed on the top of the anvils after it had been displaced by the shock wave. This last method, which resembles original Hopkinson Bar experiments, seemed the easiest to apply and analyse, giving relative rankings of violence and the possibility of the calculation of a “detonation” pressure.